
9742 IoT Jobs - Page 14

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 years

0 Lacs

New Delhi, Delhi, India

On-site

Location: Okhla, New Delhi
Experience: 5+ Years
Industry: Telecommunications / Technology
Department: Sales / Business Development
Reports To: CSSO

Job Summary: We are seeking a dynamic and results-driven Telecom Sales Manager with 5+ years of experience to lead business development and sales initiatives across a range of telecom solutions, including 4G/5G networks, IoT, ISP services, BSS/OSS platforms, and value-added services such as RBT and VRBT. The ideal candidate will possess deep domain knowledge, strong customer engagement skills, and a proven track record of achieving sales targets in the telecom sector.

Key Responsibilities:
- Drive end-to-end sales cycles for telecom solutions, including:
  - 4G/5G network solutions
  - Internet of Things (IoT) platforms and devices
  - Internet services (ISP)
  - BSS/OSS solutions
  - Ring Back Tone (RBT) and Video RBT (VRBT) services
- Develop and execute strategic sales plans to expand the customer base and achieve sales targets.
- Build and manage a robust sales pipeline through direct customer engagement, partner collaboration, and channel development.
- Identify and engage with telcos, ISPs, MVNOs, and enterprise clients to understand needs and propose appropriate solutions.
- Conduct presentations, product demos, and proof-of-concept discussions with potential customers.
- Collaborate with technical pre-sales and solution architects to develop customized proposals and solutions.
- Monitor industry trends and the competitive landscape to identify new business opportunities and inform product positioning.
- Participate in industry events, trade shows, and conferences to promote company offerings.

Requirements:
- Education: Bachelor's degree in Engineering, Telecommunications, IT, Business Administration, or a related field. MBA is a plus.
- Experience: 5 to 7 years in telecom sales or business development roles, with proven experience selling 4G/5G solutions, IoT, ISP services, and BSS/OSS platforms. Knowledge and experience in VAS, particularly RBT and VRBT, is highly desirable.

Skills:
- Strong understanding of telecom technologies and services.
- Excellent communication, negotiation, and presentation skills.
- Ability to work with cross-functional teams and engage C-level executives.
- Experience with sales forecasting, CRM tools (e.g., Salesforce), and reporting.
- Goal-oriented, self-motivated, and able to thrive in a fast-paced environment.

Preferred Certifications:
- Sales or technical certifications in telecom, cloud, or IoT domains.
- Familiarity with telecom regulatory and compliance frameworks is a plus.

Compensation: Competitive base salary plus performance-based incentives. Benefits include medical insurance, travel allowance, and professional development opportunities.

Posted 4 days ago

Apply

0.0 - 2.0 years

3 - 5 Lacs

Gurugram, Haryana

On-site

Job Title: Firmware Engineer (Hardware Integration)
Location: Sector 62, Gurugram, Haryana 122101
Required Experience: 1-2 years
Apply Now: hr@enlog.co.in

About Us: At Enlog, we are redefining energy management with innovative technology that helps businesses and communities reduce energy waste and embrace sustainable practices. As a vibrant startup, we offer a dynamic work culture, meaningful learning experiences, and the opportunity to contribute to a greener planet.

About the Role: We are seeking a skilled and motivated Firmware Engineer to join our hardware engineering team. You will be responsible for developing, testing, and optimizing firmware for custom hardware platforms, ensuring reliable system performance across embedded devices. This role requires strong hands-on experience with embedded C/C++, microcontroller platforms (e.g., ARM, STM32, ESP), and direct interaction with hardware peripherals. You'll work closely with hardware engineers and product teams to bring devices from prototype to production.

Responsibilities:

A. Firmware Architecture & Development
- Architect, write, and optimize firmware for ESP32 (C3, C6, S3) and STM32-based boards.
- Develop real-time sensor drivers for energy monitoring ICs (e.g., HLW8012, HT7017, BL0937).
- Build a modular firmware stack supporting mesh communication, MQTT publishing, OTA updates, and offline fallback modes.
- Implement fail-safe logic, including NVS/Flash-based configuration, power-loss recovery routines, and watchdog/reset handlers.

B. Communication & Protocol Stack
- Implement and debug custom mesh protocols over ESP-NOW / 802.11 (for Enmate).
- Maintain an ultra-lightweight MQTT stack, free from heavy third-party dependencies.
- Optimize low-level comms (UART, SPI, I2C), especially under interrupt-driven loads.
- Optional: add support for TLS and secure provisioning if needed.

C. Device Management & OTA
- Build and maintain OTA systems using ESP-IDF / STM32 HAL, with rollback support and firmware integrity validation.
- Manage config persistence via NVS, SPIFFS, or Flash FS.
- Implement local fallback flows such as hotspot mode for setup, IP-based configuration access, and config sync from the cloud/mesh root.

D. Testing & Validation
- Develop test harnesses for unit-level validation of pin states, sensor reads, and publishing logic, and for stress testing relays, memory safety, and power stability.
- Support QA during EMC compliance, field deployment validation, and regression suite development.

E. Collaboration & Mentorship
- Work with hardware engineers on pin muxing, layout constraints, and EMI-safe firmware behaviour.
- Coordinate with backend and mobile teams on payload formatting, clock sync logic, and retry/fallback design.
- Mentor junior engineers on structured firmware design, debugging tools, and release readiness.

Tools & Ecosystem:
- Development: ESP-IDF, STM32Cube, PlatformIO
- Debugging: JTAG, GDB, logic analyzers
- DevOps: GitHub, Jira, OTA build + CI pipelines
- Editors: VSCode, CLion, or any preferred tool

Requirements:
- Bachelor's degree in Electronics, Electrical, Computer Engineering, or a related field.
- 2+ years of hands-on experience in firmware development for embedded hardware.
- Proficiency in embedded C/C++ programming.
- Experience with microcontrollers (e.g., STM32, ESP32, PIC, ARM Cortex).
- Strong understanding of digital electronics, schematics, and hardware debugging tools (oscilloscopes, logic analysers).
- Familiarity with communication protocols: I2C, SPI, UART, CAN, Modbus.
- Ability to work with version control tools like Git and CI workflows.

Nice to Have:
- Experience with RTOS.
- Familiarity with firmware-over-the-air (FOTA) updates and bootloader design.
- Python scripting for testing or automation.
- Exposure to IoT stacks (BLE, Wi-Fi, MQTT, etc.).

Job Types: Full-time, Permanent
Pay: ₹307,512.32 - ₹500,000.00 per year
Benefits: Leave encashment, paid sick time, Provident Fund
Work Location: In person
Speak with the employer: +91 7428981477
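The fail-safe configuration handling this role describes (NVS/Flash-backed config plus power-loss recovery) is commonly built as a dual-slot write with a checksum: a power cut mid-write can corrupt at most one slot, and the CRC detects it on the next boot. A minimal pure-Python sketch of the idea, with files standing in for flash sectors and all names hypothetical:

```python
import json
import zlib
from pathlib import Path

# Two "slots" standing in for two flash sectors (names are illustrative).
SLOTS = [Path("config_a.bin"), Path("config_b.bin")]

def save_config(cfg: dict) -> None:
    """Write the config to both slots in turn. A power cut mid-write
    corrupts at most one slot; the CRC exposes that corruption on load."""
    payload = json.dumps(cfg, sort_keys=True).encode()
    record = zlib.crc32(payload).to_bytes(4, "big") + payload
    for slot in SLOTS:
        slot.write_bytes(record)  # on real flash: erase + program + verify

def load_config(default: dict) -> dict:
    """Return the first slot whose CRC matches; else fall back to defaults
    (the power-loss recovery path)."""
    for slot in SLOTS:
        try:
            record = slot.read_bytes()
        except FileNotFoundError:
            continue
        crc, payload = record[:4], record[4:]
        if zlib.crc32(payload).to_bytes(4, "big") == crc:
            return json.loads(payload)
    return default  # both slots bad or missing
```

On a real ESP32 this would sit behind the NVS API rather than raw files, but the invariant is the same: never overwrite the last known-good copy until a new valid copy exists.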

Posted 4 days ago

Apply

0 years

0 Lacs

India

Remote

#Job ID: PUN-IN/SMM250801007IN | Digital Marketing Intern - Social Media Marketing (Unpaid)

Internship Overview: This internship is with the Public Relations department of PMN Patralok, a division of Punama Innovation. We believe not only in quality writing but also in quality expression by any means. Designs, images, shorts, reels, and other graphics are some of the best mediums to reach our audience quickly. We are looking for people who can express their thought process or vibe through short video clips or posters, and who can convert a journalist's post into a social media post. At our organisation we believe in learning, in togetherness, and in guiding and mentoring our people towards their progress and well-being. We give much time to each other in training, guidance, and support so that our values and standards can be set high. We invite passionate people who are ready to learn, take on challenges, have compassion, and can devote more than 4-5 hours on a daily basis (5 days a week, roster based). You get plenty of week offs, exam leaves, and support!

Applications are invited for: Digital Marketing Intern - SMM (Unpaid)

Work includes:
- Converting news articles shared by the journalism team into social media short videos (such as Instagram Reels and YouTube Shorts).
- Designing high-quality social media creatives.
- Ensuring quality and timely completion of projects.
- Advising on best practices and optimizations.
- Working in teams with journalists and marketing.
- Having attention to detail.

Skills Required:
- Knowledge of video editing software (Adobe Premiere Pro, Canva, Adobe After Effects, or any other video editing software)
- Basics of motion graphics editing techniques
- Attention to detail
- Problem solving
- Creativity
- Portfolio showcasing video editing skills

Qualifications:
- Bachelor's degree (completed or pursuing) or higher in a related field
- People already working and looking for a career change
- Women who want to restart their career after a family break and meet the necessary academic and other qualifications mentioned

IMPORTANT (Prescribed Sample Format): Work samples (writing, design, or other) and shift timings are needed to proceed with the interview. Send your work samples and preferred shift timings to parul.sharma@punama.in with the subject line in this format:
#Job ID: PUN-IN/SMM250801007IN | Digital Marketing Intern - Social Media Marketing | <Your Name>
Example: #Job ID: PUN-IN/SMM250801007IN | Digital Marketing Intern - Social Media Marketing | Ritesh Kumar

Perks:
- Certificate on completion of the internship
- Flexible working hours
- Great learning opportunity: more than training, we give you challenges to learn from, with guidance and support
- Great mentorship
- Work-from-home opportunity

Every month there will be a mandatory review of the intern's work. Based on the review, the internship will be either extended or terminated. Prerequisites for internship extension:
- Seriousness, as seen in work performance
- Learnability: how much the candidate is willing and trying to learn
- Understanding: how well the candidate understands the situation/work, and, where they do not, how hard they try to understand
- Responsibility: although shift timings are flexible, how responsibly the candidate delivers work on time

Hiring Procedure:
1. Candidates apply via LinkedIn with the required samples and resume.
2. HR reviews applications for initial suitability. Applications without samples, or with samples not in the prescribed format, are rejected without any response to the candidate.
3. Shortlisted candidates receive a confirmation mail and the JD from the TA Incharge by email.
4. Basic HR telephonic discussion: shortlisted candidates get a phone call from HR for an initial discussion.
5. Assessment (objective questions) and F2F video interview on a live Google Meet call: selected candidates take a skills-based online test while sharing their screen on Google Meet or on automated assessment software (whichever applies), followed by an F2F interview in the same Meet call or in a separately scheduled meeting.
6. The final result, or any further step, is communicated by the next working day.

Company Overview: We are hiring for PMN Patralok, a newly established News and Media vertical of Punama Innovation launched in 2023. Punama Innovation is an IT organisation dealing with software and embedded-systems services and manufacturing: cloud solutions, cloud security, embedded systems and IoT development, firmware development, customized embedded manufacturing, etc. PMN Patralok is a news portal run by a team of journalists who explore, understand, uncover, and present what is happening around us, whether local or international, scientific or artistic, natural or human-made. We present the news simply, in easy and understandable language. Initially we will deliver content in Hindi and English, covering geopolitics, international relations, crime, politics, sports, entertainment, lifestyle, health, technology, gadgets, science, culture, etc.

For any further queries, reach out to:
TA Incharge: Parul Sharma
Mobile: +91 98119 60735
Email: parul.sharma@punama.in

Posted 4 days ago

Apply

5.0 years

0 Lacs

Kerala, India

Remote

WalTech International Inc. is a fast-growing US-based company building smart RV control systems, IoT devices, and AI-powered applications. Our products are used across North America by RV manufacturers and end customers.

About the Senior Role: We are NOT looking for a task finisher or a developer who only works on assigned tickets. We are hiring a Lead: someone who can see the big picture, make the right technical decisions, and lead our product forward with ownership and vision. We are looking for a Senior Full Stack Developer with a proven track record in building Android applications and backend systems and in deploying services on the cloud. This is a key role for our growing team, where you will be expected to:
- Own the complete development lifecycle of mobile and backend features
- Collaborate directly with the founders and engineering team
- Make architectural decisions for Android, backend, and middleware systems
- Build scalable systems that connect hardware, cloud, and mobile

We want someone who can take initiative, solve problems independently, and help build our next-generation smart control systems.

Key Responsibilities:

Android Development:
- Lead Android app development using Kotlin + Jetpack Compose
- Implement MVVM architecture, data handling, and clean UI practices
- Integrate REST APIs, manage local storage with Room, and optimize app performance
- Work on MQTT communication with hardware-connected devices

Backend API & Middleware Development:
- Build scalable backend APIs using Spring Boot (Java) with proper security/authentication
- Design and maintain Node.js middleware for real-time data processing
- Optimize API performance and manage data flow between app, cloud, and IoT devices
- Follow clean coding, documentation, and structured backend design

Cloud Deployment & Integration:
- Deploy backend and middleware services on AWS EC2, Lambda, and API Gateway
- Follow cloud deployment best practices for security and scalability
- Manage system monitoring and basic DevOps tasks when required

Leadership & Collaboration:
- Work directly with hardware, firmware, and app teams for seamless integration
- Lead by example and mentor junior developers if needed
- Participate in system design discussions and propose improvements
- Take ownership of assigned projects and deliver on time

What We Expect You to Bring:
- 5+ years of Android development experience (Kotlin, MVVM, REST APIs)
- 3+ years of backend API development experience with Spring Boot (Java)
- Hands-on experience with Node.js for middleware systems
- AWS deployment experience: EC2, Lambda, API Gateway
- Exposure to MQTT or IoT device communication
- Strong problem-solving skills, a leadership mindset, and an initiative-taking attitude
- Good communication skills and remote work discipline
- Willingness to explore AI integration and smart system design

What You'll Get:
- Fully remote role with flexible working hours
- Salary range: as per industry standard
- Direct impact on international product lines and IoT solutions
- Freedom to innovate and grow your skills in AI, IoT, and cloud technologies
- Opportunity to grow with a fast-paced international team
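The MQTT responsibilities above (app-to-device messaging with data flow between app, cloud, and IoT devices) usually need a retry/fallback layer around the raw client. A pure-Python sketch of that pattern, with exponential backoff and a bounded offline queue; the `transport` callable stands in for a real MQTT client's publish (e.g. paho-mqtt), and all names are illustrative, not from the posting:

```python
import time
from collections import deque
from typing import Callable

class ResilientPublisher:
    """Retry publishes with exponential backoff; queue messages while offline."""

    def __init__(self, transport: Callable[[str, bytes], None],
                 max_retries: int = 4, base_delay: float = 0.5):
        self.transport = transport        # real code: wraps client.publish
        self.max_retries = max_retries
        self.base_delay = base_delay
        # Bounded queue: under long outages the oldest readings drop first.
        self.offline_queue: deque = deque(maxlen=1000)

    def publish(self, topic: str, payload: bytes) -> bool:
        for attempt in range(self.max_retries):
            try:
                self.transport(topic, payload)
                return True
            except ConnectionError:
                time.sleep(self.base_delay * (2 ** attempt))  # backoff: 1x, 2x, 4x...
        self.offline_queue.append((topic, payload))  # offline fallback mode
        return False

    def flush(self) -> int:
        """Drain queued messages once connectivity returns; stop on failure."""
        sent = 0
        while self.offline_queue:
            topic, payload = self.offline_queue.popleft()
            if not self.publish(topic, payload):
                break  # publish() re-queued the message at the tail
            sent += 1
        return sent
```

The same shape works on the device side (firmware) and the middleware side; only the transport changes.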

Posted 4 days ago

Apply

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Description
GPP Database Link: https://cummins365.sharepoint.com/sites/CS38534/

Leads projects for the design, development, and maintenance of a data and analytics platform. Effectively and efficiently processes, stores, and makes data available to analysts and other consumers. Works with key business stakeholders, IT experts, and subject-matter experts to plan, design, and deliver optimal analytics and data science solutions. Works on one or many product teams at a time.

Key Responsibilities:
- Designs and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured).
- Designs and implements a framework to continuously monitor and troubleshoot data quality and data integrity issues.
- Implements data governance processes and methods for managing metadata, access, and retention for internal and external users.
- Designs, and provides guidance on building, reliable, efficient, scalable, quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages.
- Designs and implements physical data models to define the database structure; optimizes database performance through efficient indexing and table relationships.
- Participates in optimizing, testing, and troubleshooting data pipelines.
- Designs, develops, and operates large-scale data storage and processing solutions using distributed and cloud-based platforms (e.g., Data Lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB, others).
- Uses innovative and modern tools, techniques, and architectures to partially or completely automate the most common, repeatable, and tedious data preparation and integration tasks, in order to minimize manual and error-prone processes and improve productivity.
- Assists with renovating the data management infrastructure to drive automation in data integration and management.
- Ensures the timeliness and success of critical analytics initiatives using agile development methods such as DevOps, Scrum, and Kanban.
- Coaches and develops less experienced team members.

Competencies:
- System Requirements Engineering: uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation, and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts.
- Collaborates: builds partnerships and works collaboratively with others to meet shared objectives.
- Communicates effectively: develops and delivers multi-mode communications that convey a clear understanding of the unique needs of different audiences.
- Customer focus: builds strong customer relationships and delivers customer-centric solutions.
- Decision quality: makes good and timely decisions that keep the organization moving forward.
- Data Extraction: performs extract-transform-load (ETL) activities from a variety of sources and transforms the data for consumption by downstream applications and users, using appropriate tools and technologies.
- Programming: creates, writes, and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation, to meet business, technical, security, governance, and compliance requirements.
- Quality Assurance Metrics: applies the science of measurement to assess whether a solution meets its intended outcomes, using the IT Operating Model (ITOM), including SDLC standards, tools, metrics, and key performance indicators, to deliver a quality product.
- Solution Documentation: documents information and solutions based on knowledge gained during product development; communicates to stakeholders to enable improved productivity and effective knowledge transfer to others who were not part of the initial learning.
- Solution Validation Testing: validates a configuration item change or solution using the Function's defined best practices, including Systems Development Life Cycle (SDLC) standards, tools, and metrics, to ensure that it works as designed and meets customer requirements.
- Data Quality: identifies, understands, and corrects flaws in data to support effective information governance across operational business processes and decision making.
- Problem Solving: solves problems, and may mentor others on effective problem solving, using a systematic analysis process and industry-standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies systemic root causes and ensures actions to prevent recurrence are implemented.
- Values differences: recognizes the value that different perspectives and cultures bring to an organization.

Education, Licenses, Certifications: College, university, or equivalent degree in a relevant technical discipline, or equivalent relevant experience, required. This position may require licensing for compliance with export controls or sanctions regulations.

Experience: Intermediate experience in a relevant discipline area is required. Knowledge of the latest technologies and trends in data engineering is highly preferred, including:
- 5-8 years of experience
- Familiarity with analyzing complex business systems, industry requirements, and/or data regulations
- Background in processing and managing large data sets
- Design and development for a Big Data platform using open-source and third-party tools
- Spark, Scala/Java, MapReduce, Hive, HBase, and Kafka, or equivalent college coursework
- SQL query language
- Clustered-compute, cloud-based implementation experience
- Experience developing applications requiring large file movement in a cloud-based environment, and other data extraction tools and methods, from a variety of sources
- Experience building analytical solutions

Intermediate experience in the following is preferred:
- IoT technology
- Agile software development

Qualifications:
- Work closely with the business Product Owner to understand the product vision.
- Play a key role across DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into the Cummins Digital Core (Azure Data Lake, Snowflake).
- Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment with DBU project data pipeline design standards.
- Independently design, develop, test, and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and data lakes.
- Responsible for the creation, maintenance, and management of DBU Data & Analytics data engineering documentation and standard operating procedures (SOPs).
- Take part in evaluating new data tools and POCs, and provide suggestions.
- Take full ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization.
- Proactively address and resolve issues that compromise data accuracy and usability.

Preferred Skills:
- Programming languages: proficiency in Python, Java, and/or Scala.
- Database management: expertise in SQL and NoSQL databases.
- Big data technologies: experience with Hadoop, Spark, Kafka, and other big data frameworks.
- Cloud services: experience with Azure, Databricks, and AWS cloud platforms.
- ETL processes: strong understanding of Extract, Transform, Load (ETL) processes.
- Data replication: working knowledge of replication technologies like Qlik Replicate is a plus.
- API: working knowledge of APIs to consume data from ERP and CRM systems.
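The core pipeline duty described here (ETL with monitoring, alerting, and data-quality checks) follows a common shape regardless of tooling: transform and load each record, quarantine failures, and trip an alert when the error rate crosses a threshold. A toy pure-Python sketch of that quality gate; in production this logic would live in Spark jobs or an orchestrator, and the alert hook, row schema, and threshold here are invented for illustration:

```python
from typing import Callable, Iterable

def run_pipeline(rows: Iterable[dict],
                 transform: Callable[[dict], dict],
                 load: Callable[[dict], None],
                 alert: Callable[[str], None],
                 max_error_rate: float = 0.05) -> dict:
    """Transform + load each row; quarantine bad rows instead of failing
    the whole batch, and alert when the error rate exceeds the threshold."""
    total = errors = 0
    quarantine = []
    for row in rows:
        total += 1
        try:
            load(transform(row))
        except (KeyError, ValueError, TypeError) as exc:
            errors += 1
            quarantine.append((row, str(exc)))  # kept for later inspection
    rate = errors / total if total else 0.0
    if rate > max_error_rate:
        alert(f"data-quality gate tripped: {errors}/{total} rows failed")
    return {"total": total, "errors": errors, "quarantined": quarantine}
```

Quarantining rather than dropping failed rows is what makes the later "identify and correct flaws in data" step possible: the bad rows and their error messages survive the run.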

Posted 4 days ago

Apply

4.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Description
GPP Database Link: https://cummins365.sharepoint.com/sites/CS38534/

Supports, develops, and maintains a data and analytics platform. Effectively and efficiently processes, stores, and makes data available to analysts and other consumers. Works with the Business and IT teams to understand the requirements and best leverage the technologies to enable agile data delivery at scale.

Key Responsibilities:
- Implements and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured).
- Implements methods to continuously monitor and troubleshoot data quality and data integrity issues.
- Implements data governance processes and methods for managing metadata, access, and retention for internal and external users.
- Develops reliable, efficient, scalable, quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages.
- Develops physical data models and implements data storage architectures per design guidelines.
- Analyzes complex data elements and systems, data flow, dependencies, and relationships to contribute to conceptual, physical, and logical data models.
- Participates in testing and troubleshooting of data pipelines.
- Develops and operates large-scale data storage and processing solutions using distributed and cloud-based platforms (e.g., Data Lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB, others).
- Uses agile development methods, such as DevOps, Scrum, Kanban, and continuous-improvement cycles, for data-driven applications.

Competencies:
- System Requirements Engineering: uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation, and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts.
- Collaborates: builds partnerships and works collaboratively with others to meet shared objectives.
- Communicates effectively: develops and delivers multi-mode communications that convey a clear understanding of the unique needs of different audiences.
- Customer focus: builds strong customer relationships and delivers customer-centric solutions.
- Decision quality: makes good and timely decisions that keep the organization moving forward.
- Data Extraction: performs extract-transform-load (ETL) activities from a variety of sources and transforms the data for consumption by downstream applications and users, using appropriate tools and technologies.
- Programming: creates, writes, and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation, to meet business, technical, security, governance, and compliance requirements.
- Quality Assurance Metrics: applies the science of measurement to assess whether a solution meets its intended outcomes, using the IT Operating Model (ITOM), including SDLC standards, tools, metrics, and key performance indicators, to deliver a quality product.
- Solution Documentation: documents information and solutions based on knowledge gained during product development; communicates to stakeholders to enable improved productivity and effective knowledge transfer to others who were not part of the initial learning.
- Solution Validation Testing: validates a configuration item change or solution using the Function's defined best practices, including Systems Development Life Cycle (SDLC) standards, tools, and metrics, to ensure that it works as designed and meets customer requirements.
- Data Quality: identifies, understands, and corrects flaws in data to support effective information governance across operational business processes and decision making.
- Problem Solving: solves problems, and may mentor others on effective problem solving, using a systematic analysis process and industry-standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies systemic root causes and ensures actions to prevent recurrence are implemented.
- Values differences: recognizes the value that different perspectives and cultures bring to an organization.

Education, Licenses, Certifications: College, university, or equivalent degree in a relevant technical discipline, or equivalent relevant experience, required. This position may require licensing for compliance with export controls or sanctions regulations.

Experience: 4-5 years of experience. Relevant experience preferred, such as temporary student employment, internships, co-ops, or other extracurricular team activities.

Knowledge of the latest technologies in data engineering is highly preferred, including:
- Exposure to open-source Big Data tools: Spark, Scala/Java, MapReduce, Hive, HBase, and Kafka, or equivalent college coursework
- SQL query language
- Clustered-compute, cloud-based implementation experience
- Familiarity developing applications requiring large file movement in a cloud-based environment
- Exposure to Agile software development
- Exposure to building analytical solutions
- Exposure to IoT technology

Qualifications:
- Work closely with the business Product Owner to understand the product vision.
- Participate in DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into the Cummins Digital Core (Azure Data Lake, Snowflake).
- Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment with DBU project data pipeline design standards.
- Work under limited supervision to design, develop, test, and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and data lakes.
- Responsible for creating DBU Data & Analytics data engineering documentation and standard operating procedures (SOPs), with guidance from senior data engineers.
- Take part in evaluating new data tools and POCs, with guidance from senior data engineers.
- Take ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization under limited supervision.
- Assist in resolving issues that compromise data accuracy and usability.

Preferred Skills:
- Programming languages: proficiency in Python, Java, and/or Scala.
- Database management: intermediate expertise in SQL and NoSQL databases.
- Big data technologies: experience with Hadoop, Spark, Kafka, and other big data frameworks.
- Cloud services: experience with Azure, Databricks, and AWS cloud platforms.
- ETL processes: strong understanding of Extract, Transform, Load (ETL) processes.
- API: working knowledge of APIs to consume data from ERP and CRM systems.
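The physical data modelling duty in this role ("develops physical data models and implements data storage architectures") comes down to matching the table layout and indexes to the dominant query pattern. A small illustration using stdlib sqlite3; the telemetry schema is invented for the example, but the check at the end (reading the query plan to confirm the index is actually used) is a general technique:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE telemetry (
        device_id TEXT NOT NULL,
        ts        INTEGER NOT NULL,   -- epoch seconds
        metric    TEXT NOT NULL,
        value     REAL
    );
    -- Composite index matching the dominant query:
    -- "readings for one device in a time range".
    CREATE INDEX idx_telemetry_device_ts ON telemetry (device_id, ts);
""")
con.executemany(
    "INSERT INTO telemetry VALUES (?, ?, ?, ?)",
    [("dev-1", 100, "kwh", 1.5), ("dev-1", 200, "kwh", 1.7),
     ("dev-2", 150, "kwh", 0.9)],
)
# EXPLAIN QUERY PLAN shows whether the optimizer picked the index
# or fell back to a full table scan.
plan = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM telemetry "
    "WHERE device_id = ? AND ts BETWEEN ? AND ?", ("dev-1", 0, 300)
).fetchall()
```

The column order in the index matters: `(device_id, ts)` serves equality-then-range queries, whereas `(ts, device_id)` would not help a single-device lookup nearly as much.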

Posted 4 days ago

Apply

0.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

MS - Automotive & Manufacturing | Chennai
Posted On: 01 Aug 2025 | End Date: 30 Sep 2025
Required Experience: 3 - 5 Years
No. of Openings: 1
Designation: Senior Test Engineer
Closing Date: 30 Sep 2025
Main BU: Quality Engineering | Sub BU: MS - Automotive & Manufacturing
Country: India | Region: India | State: Tamil Nadu | City: Chennai
Working Location: Chennai | Client Location: NA
Skills: Automation Testing

Job Description

Technical Skills (new JD):
1. Create end-to-end automation test cases in a HIL setup.
2. Manage the test cases based on requirements from the functional and end-to-end testing teams.
3. Execute the test cases and publish the test report.
4. Automate the test-case execution process.

Required Experience (old JD):
- 4 - 6 years of strong Testing/QA experience.
- End-user testing (user experience).
- Positive and negative scenarios.
- Should know smoke testing, black-box testing, and white-box testing.
- Good to know: Android testing.
- Automobile domain (Embedded, CAN, CANoe, Infotainment, Validation, Bluetooth, IoT, IVI).
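The four technical duties in this posting (create end-to-end cases, manage them, execute them, publish a report) amount to a small automation harness around the HIL rig. A minimal pure-Python sketch of that shape, with the HIL interaction stubbed out as plain callables; the test names, tags, and report format are illustrative assumptions, not from the posting:

```python
from dataclasses import dataclass, field
from typing import Callable, Optional

@dataclass
class TestCase:
    name: str
    run: Callable[[], None]               # raises AssertionError on failure
    tags: set = field(default_factory=set)  # e.g. {"e2e"} or {"functional"}

def execute(cases: list, only_tag: Optional[str] = None) -> str:
    """Execute the selected cases and return a plain-text report."""
    selected = [c for c in cases if only_tag is None or only_tag in c.tags]
    lines, passed = [], 0
    for case in selected:
        try:
            case.run()          # in a real rig: drive HIL I/O, check signals
            passed += 1
            lines.append(f"PASS {case.name}")
        except AssertionError as exc:
            lines.append(f"FAIL {case.name}: {exc}")
    lines.append(f"{passed}/{len(selected)} passed")
    return "\n".join(lines)
```

Tagging cases lets the same suite serve both the functional team and the end-to-end team, which is the "manage the test cases based on requirements from both teams" duty in miniature.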

Posted 4 days ago

Apply

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Description GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/)
Leads projects for the design, development and maintenance of a data and analytics platform. Effectively and efficiently processes, stores and makes data available to analysts and other consumers. Works with key business stakeholders, IT experts and subject-matter experts to plan, design and deliver optimal analytics and data science solutions. Works on one or many product teams at a time.
Key Responsibilities
- Designs and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured).
- Designs and implements a framework to continuously monitor and troubleshoot data quality and data integrity issues.
- Implements data governance processes and methods for managing metadata, access and retention for internal and external users.
- Designs and provides guidance on building reliable, efficient, scalable and quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages.
- Designs and implements physical data models to define the database structure; optimizes database performance through efficient indexing and table relationships.
- Participates in optimizing, testing and troubleshooting data pipelines.
- Designs, develops and operates large-scale data storage and processing solutions using different distributed and cloud-based platforms for storing data (e.g. Data Lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB and others).
- Uses innovative and modern tools, techniques and architectures to partially or completely automate the most common, repeatable and tedious data preparation and integration tasks, in order to minimize manual and error-prone processes and improve productivity.
- Assists with renovating the data management infrastructure to drive automation in data integration and management.
- Ensures the timeliness and success of critical analytics initiatives by using agile development technologies such as DevOps, Scrum and Kanban.
- Coaches and develops less experienced team members.
Responsibilities
Competencies:
- System Requirements Engineering: Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule and resources; creates and maintains information linkages to related artifacts.
- Collaborates: Building partnerships and working collaboratively with others to meet shared objectives.
- Communicates effectively: Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences.
- Customer focus: Building strong customer relationships and delivering customer-centric solutions.
- Decision quality: Making good and timely decisions that keep the organization moving forward.
- Data Extraction: Performs data extract-transform-load (ETL) activities from a variety of sources and transforms them for consumption by various downstream applications and users, using appropriate tools and technologies.
- Programming: Creates, writes and tests computer code, test scripts and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements.
- Quality Assurance Metrics: Applies the science of measurement to assess whether a solution meets its intended outcomes, using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product.
- Solution Documentation: Documents information and solutions based on knowledge gained during product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not part of the initial learning.
- Solution Validation Testing: Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements.
- Data Quality: Identifies, understands and corrects flaws in data to support effective information governance across operational business processes and decision making.
- Problem Solving: Solves problems, and may mentor others on effective problem solving, using a systematic analysis process and industry-standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies systemic root causes and ensures actions to prevent problem recurrence are implemented.
- Values differences: Recognizing the value that different perspectives and cultures bring to an organization.
Education, Licenses, Certifications
College, university, or equivalent degree in a relevant technical discipline, or relevant equivalent experience, required. This position may require licensing for compliance with export controls or sanctions regulations.
Experience
Intermediate experience in a relevant discipline area is required.
Knowledge of the latest technologies and trends in data engineering is highly preferred and includes:
- 5-8 years of experience
- Familiarity analyzing complex business systems, industry requirements, and/or data regulations
- Background in processing and managing large data sets
- Design and development for a Big Data platform using open-source and third-party tools
- Spark, Scala/Java, MapReduce, Hive, HBase, and Kafka, or equivalent college coursework
- SQL query language
- Clustered, cloud-based compute implementation experience
- Experience developing applications requiring large file movement in a cloud-based environment, plus other data extraction tools and methods for a variety of sources
- Experience in building analytical solutions
Intermediate experience in the following is preferred:
- Experience with IoT technology
- Experience in Agile software development
Qualifications
- Work closely with the business Product Owner to understand product vision.
- Play a key role across DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into the Cummins Digital Core (Azure Data Lake, Snowflake).
- Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment with DBU project data pipeline design standards.
- Independently design, develop, test, and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and data lakes.
- Responsible for the creation, maintenance, and management of DBU Data & Analytics data engineering documentation and standard operating procedures (SOPs).
- Take part in evaluating new data tools and POCs, and provide suggestions.
- Take full ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization.
- Proactively address and resolve issues that compromise data accuracy and usability.
Preferred Skills
- Programming Languages: Proficiency in languages such as Python, Java, and/or Scala.
- Database Management: Expertise in SQL and NoSQL databases.
- Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks.
- Cloud Services: Experience with Azure, Databricks, and AWS cloud platforms.
- ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes.
- Data Replication: Working knowledge of replication technologies like Qlik Replicate is a plus.
- API: Working knowledge of APIs to consume data from ERP and CRM systems.
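The "working knowledge of API to consume data from ERP, CRM" requirement above usually means paging through a REST endpoint until it is exhausted. A minimal, hypothetical sketch of that pattern in Python (the `fetch_page` callable stands in for the real HTTP call, which this example stubs out with an in-memory list; no real ERP or CRM API is being modeled):

```python
def fetch_all_records(fetch_page, page_size=100):
    """Consume a paginated data API page by page until a short page
    signals that no more data remains."""
    records, offset = [], 0
    while True:
        page = fetch_page(offset, page_size)
        records.extend(page)
        if len(page) < page_size:  # short (or empty) page => done
            break
        offset += page_size
    return records

# Usage with a stubbed backend holding 250 fake records:
data = list(range(250))
stub = lambda offset, limit: data[offset:offset + limit]
print(len(fetch_all_records(stub)))  # 250
```

Real integrations layer authentication, rate limiting, and retries on top, but the offset/limit loop above is the core consumption pattern.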

Posted 4 days ago

Apply

0 years

0 Lacs

Gurgaon, Haryana, India

Remote

Canonical is a leading provider of open source software and operating systems to the global enterprise and technology markets. Our platform, Ubuntu, is very widely used in breakthrough enterprise initiatives such as public cloud, data science, AI, engineering innovation and IoT. Our customers include the world's leading public cloud and silicon providers, and industry leaders in many sectors. The company is a pioneer of global distributed collaboration, with 1200+ colleagues in 75+ countries and very few office-based roles. Teams meet two to four times yearly in person, in interesting locations around the world, to align on strategy and execution. The company is founder-led, profitable and growing. We are hiring a Partner Sales Manager - Hewlett Packard Enterprise to be a key contributing team member within the growing IHV team. Large OEM or Independent Hardware Vendor (IHV) brands - HPE, Dell, IBM, Lenovo, Ericsson, Cisco, Fujitsu and many more - are major partners for Canonical. These companies build software-defined solutions to capitalize on global open source mandates and associated macro trends. Canonical's flagship product, Ubuntu, and its broader open source portfolio are key ingredients for these partners to realize their aspirations. Canonical represents the best platform for rapid open source innovation. The Partner Sales Manager will be responsible for building trusted relationships with HPE, increasing Canonical market share and attach rate, evangelizing the partnership and driving business interactions across personas - from engineer to CxO. They will often run customer workshops focused on particular initiatives at that customer, attend sales events, give public presentations and participate in executive engagements as coordinated by the Senior Director.
Location: This role will be based remotely in India
The role entails
- Build strategic relationships and enable HPE teams on the partnership
- Build pipeline and transact opportunities through HPE
- Grow HPE's awareness of open source capabilities on Canonical Ubuntu
- Demonstrate a deep understanding of the Linux and cloud software ecosystem
- Deliver on targets and objectives, and provide a voice of the partner
- Travel regularly - including internationally - to drive partnerships in person
- Align with and support internal Canonical field teams - identify, support, grow, transact
- Expand the existing footprint with HPE customers, with an aim to upsell to the broader portfolio
- Support and contribute to broader strategy, initiatives and key campaigns as defined by the HPE Global Alliance Director
What we are looking for in you
- Experience in alliance or indirect sales management roles
- Sales acumen and the ability to build and manage a pipeline of business
- Autonomous, disciplined, hands-on, get-it-done mentality
- Ability to capture customer requirements, evaluate gaps, and identify and create opportunities
- Passionate about Ubuntu products and mission
- Comfortable in fast-paced and high-pressure environments with measurable goals
- Experience with Linux, virtualization, containers, and other cloud technologies
- Excellent communication and presentation skills
- Team player with superior accountability and customer support skills
- Credibility and working knowledge of HPE - its products, go-to-market motion, and field
- Experience managing cross-functional teams and a track record of operational excellence
- Willingness to travel up to 4 times a year for internal events
- Hands-on experience with SalesForce.com and Google Suite a plus
What we offer colleagues
We consider geographical location, experience, and performance in shaping compensation worldwide. We revisit compensation annually (and more often for graduates and associates) to ensure we recognize outstanding performance. In addition to base pay, we offer a performance-driven annual bonus or commission. We provide all team members with additional benefits which reflect our values and ideals. We balance our programs to meet local needs and ensure fairness globally.
- Distributed work environment with twice-yearly team sprints in person
- Personal learning and development budget of USD 2,000 per year
- Annual compensation review
- Recognition rewards
- Annual holiday leave
- Maternity and paternity leave
- Team Member Assistance Program & Wellness Platform
- Opportunity to travel to new locations to meet colleagues
- Priority Pass and travel upgrades for long-haul company events
About Canonical
Canonical is a pioneering tech firm at the forefront of the global move to open source. As the company that publishes Ubuntu, one of the most important open-source projects and the platform for AI, IoT, and the cloud, we are changing the world of software. We recruit on a global basis and set a very high standard for people joining the company. We expect excellence; in order to succeed, we need to be the best at what we do. Most colleagues at Canonical have worked from home since our inception in 2004. Working here is a step into the future and will challenge you to think differently, work smarter, learn new skills, and raise your game.
Canonical is an equal opportunity employer
We are proud to foster a workplace free from discrimination. Diversity of experience, perspectives, and background creates a better work environment and better products. Whatever your identity, we will give your application fair consideration.

Posted 4 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Key Responsibilities
- Create engaging and on-brand graphics for a variety of media, including web, print, social media, and presentations
- Collaborate with marketing, content, and product teams to design assets that meet business objectives
- Translate strategic direction into high-quality design within an established brand identity
- Prepare design files for both print and digital use
- Manage multiple projects and meet deadlines in a fast-paced environment
- Stay updated on industry trends, tools, and design best practices
About Company: Velozity Global Solutions is not only a globally recognized IT company; it is a family representing togetherness across a successful journey of more than two years. For Velozity, the definition of success is to transform people's innovative ideas into reality with the help of our tech expertise - this is what we as a team want to be remembered for. Our vision and enthusiasm have led Velozity to become an emerging IT company in India & the USA, delivering industry-led mobility solutions in the web and mobile application development domains and leveraging futuristic technologies like the Internet of Things (IoT), AI-ML, AR-VR, voice assistants and voice skills, DevOps & cloud computing. The goal is to empower clients and businesses by creating new possibilities, leveraging the technologies of today and tomorrow with the utmost quality, satisfaction, and transparency.

Posted 4 days ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

Remote

Canonical is a leading provider of open source software and operating systems to the global enterprise and technology markets. Our platform, Ubuntu, is very widely used in breakthrough enterprise initiatives such as public cloud, data science, AI, engineering innovation and IoT. Our customers include the world's leading public cloud and silicon providers, and industry leaders in many sectors. The company is a pioneer of global distributed collaboration, with 1200+ colleagues in 75+ countries and very few office-based roles. Teams meet two to four times yearly in person, in interesting locations around the world, to align on strategy and execution. The company is founder-led, profitable and growing. We are hiring a Partner Sales Manager - Hewlett Packard Enterprise to be a key contributing team member within the growing IHV team. Large OEM or Independent Hardware Vendor (IHV) brands - HPE, Dell, IBM, Lenovo, Ericsson, Cisco, Fujitsu and many more - are major partners for Canonical. These companies build software-defined solutions to capitalize on global open source mandates and associated macro trends. Canonical's flagship product, Ubuntu, and its broader open source portfolio are key ingredients for these partners to realize their aspirations. Canonical represents the best platform for rapid open source innovation. The Partner Sales Manager will be responsible for building trusted relationships with HPE, increasing Canonical market share and attach rate, evangelizing the partnership and driving business interactions across personas - from engineer to CxO. They will often run customer workshops focused on particular initiatives at that customer, attend sales events, give public presentations and participate in executive engagements as coordinated by the Senior Director.
Location: This role will be based remotely in India
The role entails
- Build strategic relationships and enable HPE teams on the partnership
- Build pipeline and transact opportunities through HPE
- Grow HPE's awareness of open source capabilities on Canonical Ubuntu
- Demonstrate a deep understanding of the Linux and cloud software ecosystem
- Deliver on targets and objectives, and provide a voice of the partner
- Travel regularly - including internationally - to drive partnerships in person
- Align with and support internal Canonical field teams - identify, support, grow, transact
- Expand the existing footprint with HPE customers, with an aim to upsell to the broader portfolio
- Support and contribute to broader strategy, initiatives and key campaigns as defined by the HPE Global Alliance Director
What we are looking for in you
- Experience in alliance or indirect sales management roles
- Sales acumen and the ability to build and manage a pipeline of business
- Autonomous, disciplined, hands-on, get-it-done mentality
- Ability to capture customer requirements, evaluate gaps, and identify and create opportunities
- Passionate about Ubuntu products and mission
- Comfortable in fast-paced and high-pressure environments with measurable goals
- Experience with Linux, virtualization, containers, and other cloud technologies
- Excellent communication and presentation skills
- Team player with superior accountability and customer support skills
- Credibility and working knowledge of HPE - its products, go-to-market motion, and field
- Experience managing cross-functional teams and a track record of operational excellence
- Willingness to travel up to 4 times a year for internal events
- Hands-on experience with SalesForce.com and Google Suite a plus
What we offer colleagues
We consider geographical location, experience, and performance in shaping compensation worldwide. We revisit compensation annually (and more often for graduates and associates) to ensure we recognize outstanding performance. In addition to base pay, we offer a performance-driven annual bonus or commission. We provide all team members with additional benefits which reflect our values and ideals. We balance our programs to meet local needs and ensure fairness globally.
- Distributed work environment with twice-yearly team sprints in person
- Personal learning and development budget of USD 2,000 per year
- Annual compensation review
- Recognition rewards
- Annual holiday leave
- Maternity and paternity leave
- Team Member Assistance Program & Wellness Platform
- Opportunity to travel to new locations to meet colleagues
- Priority Pass and travel upgrades for long-haul company events
About Canonical
Canonical is a pioneering tech firm at the forefront of the global move to open source. As the company that publishes Ubuntu, one of the most important open-source projects and the platform for AI, IoT, and the cloud, we are changing the world of software. We recruit on a global basis and set a very high standard for people joining the company. We expect excellence; in order to succeed, we need to be the best at what we do. Most colleagues at Canonical have worked from home since our inception in 2004. Working here is a step into the future and will challenge you to think differently, work smarter, learn new skills, and raise your game.
Canonical is an equal opportunity employer
We are proud to foster a workplace free from discrimination. Diversity of experience, perspectives, and background creates a better work environment and better products. Whatever your identity, we will give your application fair consideration.

Posted 4 days ago

Apply

0 years

0 Lacs

Thrissur, Kerala, India

On-site

Key Responsibilities
- Work on microprocessors and microcontrollers
- Engage in embedded firmware and hardware development
- Perform software and hardware training on Arduino, PIC, ARM, and Raspberry Pi
- Work on wireless communication & IoT
About Company: Rapid Techs is a dynamic organization dedicated to the development of industrial and academic research in various streams of technology. We are one of the leading providers of industrial solutions and academic programs.

Posted 4 days ago

Apply

10.0 years

0 Lacs

Greater Kolkata Area

On-site

Vodafone Idea Limited is an Aditya Birla Group and Vodafone Group partnership. It is India’s leading telecom service provider. The Company provides pan-India voice and data services across 2G, 3G and 4G platforms. With a large spectrum portfolio to support the growing demand for data and voice, the company is committed to delivering delightful customer experiences and contributing towards creating a truly ‘Digital India’ by enabling millions of citizens to connect and build a better tomorrow. The Company is developing infrastructure to introduce newer and smarter technologies, making both retail and enterprise customers future-ready with innovative offerings, conveniently accessible through an ecosystem of digital channels as well as an extensive on-ground presence. The Company is listed on the National Stock Exchange (NSE) and the Bombay Stock Exchange (BSE) in India. We're proud to be an equal opportunity employer. At VIL, we know that diversity makes us stronger. We are committed to a collaborative, inclusive environment that encourages authenticity and fosters a sense of belonging. We strive for everyone to feel valued, connected and empowered to reach their potential and contribute their best. VIL's goal is to build and maintain a workforce that is diverse in experience and background but uniform in reflecting our Values of Passion, Boldness, Trust, Speed and Digital. Consequently, our recruiting efforts are directed towards attracting and retaining the best and brightest talent. Our endeavour is to be the first choice for prospective employees. VIL ensures equal employment opportunity without discrimination or harassment based on race, colour, religion, creed, age, sex, sex stereotype, gender, gender identity or expression, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy, veteran or military service status, genetic information, or any other characteristic protected by law.
VIL is an equal opportunity employer committed to diversifying its workforce.
Role: AGM - Regional Service & Collection Lead
Function / Department: Enterprise - Customer Service
Location: Kolkata, West Bengal
Job Purpose
Role purpose: To support and execute the service and collections strategy, ensuring benchmark levels are met across the customer life cycle for an identified set of enterprise accounts across all segments and collections across account categories; to drive Net Promoter Score and CSAT index across segments; to drive cost optimization via digital-aided channels; to proactively and reactively ring-fence the customer base; to enhance revenue by creating stickiness through various CVM campaigns; to drive collections (receivables) through focused proactive and reactive measures; to ensure that virtual service management teams are well supported across the region; to act as the fast-track intermediary and escalation point where a physical visit may be required; and to use analytics as a means to improve customer experience in support of the organisation’s vision and objectives.
Key accountabilities and decision ownership:
Strategic
- Define and execute strategic initiatives on service and collections, including account coverage, level 2 customer responses, requests and complaints handling, and compliance with standard servicing norms across segments, to enhance competitive position in the region
- Implement a plan to improve customer experience based on customer VOC, RNPS, C-SAT scores etc.
- Guide and motivate the team to act as a consultant; innovate and bring appropriate changes in service delivery depending on market realities and demands
Core competencies, knowledge and experience [max 5]:
- 10+ years of experience in leading a customer service team, with exposure to service assurance and partner management
- Experience in managing ‘C’ levels and customer-facing roles
- Proven track record in meeting service levels and NPS targets in different situations
- Prior experience in B2B or Telecom B2B
- Ability to manage in a dynamic, high-growth, high-uncertainty environment
Operational
- Compliance with standard servicing norms; monitor customer commitments, intervene proactively and act as an escalation point for the virtual service manager, thereby ensuring minimum service level breaches
- Proactive root cause analysis; review trending of statistical data and performance reports to identify recurrent issues and fixes
- Revenue enhancement through service-led upsell/cross-sell measures and campaigns
- Customer retention through focused proactive and reactive measures to control Voluntary, Involuntary and Value churn
- Motivate and direct the team to drive the automation and digital agenda with customers to reduce cost to serve
- Ensure that payment receivables are collected within the defined period for an identified bucket through various process enhancements, thereby increasing incremental revenue from the existing base
- Build a strong feedback mechanism through continuous engagement with partners, internal stakeholders and customers - to review account performance and conduct audits on RNPS, quality aspects and processes
- Manage the financial aspects by ensuring all contracted services are billed accurately and as per the contracted frequency, and that any issues preventing payment of invoices are resolved in a timely manner
Core Competencies, Knowledge, Experience
Must have technical / professional qualifications:
Desired Competencies / Skills
- Powerful influencing/negotiation skills; effective communication and relationship management skills
- Proven ability to function within a matrix organization
- Strong analytical skills and the ability to balance conflicting business and customer interests
- Experience in handling CS, CVM & Collections in a B2B environment
Developmental
- Create an environment of high engagement during change management; challenge and motivate the partner towards higher accomplishments
- Continuous training and certification on building capabilities, skills and competencies, with specific focus on other LoBs (IoT, Cloud, FLX etc.)
Key performance indicators:
1) VIBS RNPS, CSAT & key national programs
2) Operational KPIs for customer engagement - service management, incident and escalation management
3) Digital drive and self-service adoption, resulting in cost optimization and reduced cost to serve
4) Customer retention management, revenue enhancement, collections (identified receivables bucket)
Direct reports: 3 RASMs across East
Vodafone Idea Limited (formerly Idea Cellular Limited) - An Aditya Birla Group & Vodafone partnership

Posted 4 days ago

Apply

1.0 - 5.0 years

0 Lacs

chennai, tamil nadu

On-site

You are a STEM Instructor at MH Intellect Experience, based in the UAE as per project requirements. You will be responsible for conducting hands-on training sessions on microcontrollers, embedded systems, and programming. Your role will include delivering engaging sessions, facilitating student learning, guiding projects, preparing documentation, evaluating performance, and enhancing training content in collaboration with internal teams. To excel in this role, you must have a B.E. or B.Tech in a related field and a minimum of 1 year of teaching/training experience in STEM education. Proficiency in Python, C++, and JavaScript is required, along with hands-on experience in Arduino, Raspberry Pi, and circuit prototyping. Strong communication skills, willingness to relocate to the UAE, and the ability to simplify complex concepts are essential. As a STEM Instructor, you will be expected to teach programming in real-time project scenarios, using hardware and software tools. Preferred skills include familiarity with platforms like ESP32, NodeMCU, or micro:bit; experience in IoT, robotics, or 3D printing; and knowledge of block-based programming tools like Scratch. Your passion for STEM education, creativity in teaching methods, and adaptability to student needs will be key to your success in this role.

Posted 4 days ago

Apply

0.0 - 31.0 years

2 - 3 Lacs

New Tippasandra, Bengaluru/Bangalore

On-site

Designation: Robotics & Coding Teachers (In-School)
Department: Teaching
Industry: EdTech
Selected candidate's day-to-day responsibilities include:
a. To conduct Robotics and Coding classes during school hours daily in the allocated Innovation Lab for KG to STD 9, or as may be discussed.
b. To inculcate the love of Artificial Intelligence and Robotics through well-planned daily sessions.
c. To report to the Innovation Lab daily during school hours as full-time staff for the school.
d. To organize classes, promote hands-on learning and maintain the Innovation Lab.
e. To provide a daily report to HO with a summary of sessions conducted and feedback from the children.
Who can apply? Only those candidates can apply who:
a) Possess good communication skills
b) Have excellent teaching qualities and skills
c) Are patient with kids and passionate about teaching
d) Can make boring lessons fun and interactive
e) Have some subject knowledge in Science, Math and Logical Thinking
f) Are available full time during school hours, daily
g) Have the zeal, enthusiasm and love for teaching
Preferred Qualification: MCA/BCA, BTech (Engineering), MTech, MSc (IT), BSc (IT), BSc (Computer Science), MSc (Computer Science), BE (Electronics and Communication)
Coding Languages: Python/C/C++/Java/JavaScript/Block-Based Coding/MIT App Inventor (basics will also do)
Robotics: Arduino/IoT/Breadboard/Tinkercad/ML/AI etc. (basics will also do)

Posted 4 days ago

Apply

0.0 - 31.0 years

1 - 2 Lacs

Devprayag

On-site

Designation: Robotics & Coding Teachers (In-School)
Department: Teaching
Industry: EdTech
Selected candidate's day-to-day responsibilities include:
a. To conduct Robotics and Coding classes during school hours daily in the allocated Innovation Lab for KG to STD 9, or as may be discussed.
b. To inculcate the love of Artificial Intelligence and Robotics through well-planned daily sessions.
c. To report to the Innovation Lab daily during school hours as full-time staff for the school.
d. To organize classes, promote hands-on learning and maintain the Innovation Lab.
e. To provide a daily report to HO with a summary of sessions conducted and feedback from the children.
Who can apply? Only those candidates can apply who:
a) Possess good communication skills
b) Have excellent teaching qualities and skills
c) Are patient with kids and passionate about teaching
d) Can make boring lessons fun and interactive
e) Have some subject knowledge in Science, Math and Logical Thinking
f) Are available full time during school hours, daily
g) Have the zeal, enthusiasm and love for teaching
Preferred Qualification: MCA/BCA, BTech (Engineering), MTech, MSc (IT), BSc (IT), BSc (Computer Science), MSc (Computer Science), BE (Electronics and Communication)
Coding Languages: Python/C/C++/Java/JavaScript/Block-Based Coding/MIT App Inventor (basics will also do)
Robotics: Arduino/IoT/Breadboard/Tinkercad/ML/AI etc. (basics will also do)

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

Faridabad, Haryana

On-site

At ZENNER Aquamet India, we are shaping the utility landscape of India and neighbouring countries by offering cutting-edge metering solutions. Our metering solutions are becoming digital and smarter. To strengthen our talent pool, we are looking for an IoT Engineer who can contribute to this journey. If you are passionate about learning and contributing to emerging smart solutions, this may be the perfect opportunity for you. We look forward to speaking with you.

Position: IoT Engineer
Experience: 3-5 years
Education: B.Tech in Electrical/Electronics/Telecommunication/Computer Science
Location: Delhi NCR (Faridabad)
Notice Period: ASAP
Travelling: Yes, 3-4 times a month within India
Work timing: Mon-Fri, 10.00 am to 6.30 pm

Key Skills:
- IoT
- Water & gas smart metering
- Smart cities
- Software and services
- LoRaWAN, ZigBee, Bluetooth, Wi-Fi, Sigfox, GSM/GPRS, 3G, 4G, 5G, NB-IoT

Technical Skills:
- Good knowledge of IoT devices, sensors, gateways, and networking systems
- Exposure to AMR/AMI and datalogging-based systems
- Knowledge of water meters, gas meters, and other metering solutions
- Hands-on experience in electronic device development/maintenance
- Proficiency in reading project requirements and designing appropriate solutions
- Proficiency in basic computer operations and tools such as Word, Excel, and PowerPoint
- Experience with software tools to configure and integrate IoT sensors
- Proficiency in using cloud-based solutions and web-based dashboards
- Knowledge of meter interfaces based on digital pulse, analog mA/mV, RS485-based Modbus, M-Bus, NFC, Bluetooth, etc.
- Basic understanding of networking technologies such as GPRS, 4G LTE, NB-IoT, LoRaWAN, etc.

Project Implementation Skills:
- Providing after-sales service and maintaining long-term relationships with customers
- PAN-India project survey and deployment
- Expertise in the company's products and solutions
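Many water and gas meters expose a simple digital pulse output, where each pulse represents a fixed volume. As a hedged illustration of the pulse-based meter interface mentioned above (the 10 L/pulse weight is an assumed example value, not a ZENNER specification — real meters document their own K-factor), a minimal sketch:

```python
# Illustrative pulse-counting sketch for a pulse-output water meter.
# litres_per_pulse is a hypothetical example value.

def pulses_to_volume_litres(pulse_count: int, litres_per_pulse: float = 10.0) -> float:
    """Total registered volume given a cumulative pulse count."""
    return pulse_count * litres_per_pulse

def flow_rate_lpm(pulses_in_window: int, window_seconds: float,
                  litres_per_pulse: float = 10.0) -> float:
    """Average flow rate (litres/minute) over a sampling window."""
    return pulses_in_window * litres_per_pulse * 60.0 / window_seconds

print(pulses_to_volume_litres(1234))  # 12340.0 litres on the register
print(flow_rate_lpm(5, 60.0))         # 50.0 L/min over a 60 s window
```

In an AMR/AMI deployment, a datalogger or gateway would accumulate these counts and push readings upstream over NB-IoT or LoRaWAN; the arithmetic stays the same.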

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

Haryana

On-site

As a Senior AI Engineer specializing in Computer Vision and Azure Cloud at INNOFarms.AI, a pioneering Agri DeepTech startup, you will play a crucial role in developing cutting-edge AI and Robotics solutions for the advancement of climate-resilient smart farming practices. Our mission at INNOFarms.AI is to revolutionize food production in urban and resource-constrained areas by leveraging AI, robotics, and market intelligence to create a modular and scalable infrastructure for indoor vertical farming and Controlled Environment Agriculture (CEA). By addressing key challenges such as crop unpredictability, labor intensity, and supply-demand alignment, we are committed to driving sustainable food production globally.

Your responsibilities will include designing and implementing advanced computer vision systems for object detection, segmentation, and tracking, as well as building and managing scalable data pipelines from diverse field and sensor datasets. In addition, you will collaborate across disciplines to integrate computer vision models with robotics, drones, and IoT systems. Leveraging Azure Cloud services such as IoT Hub, Functions, CosmosDB, and Event Hubs, you will deploy production-grade AI models and continuously optimize algorithms for improved speed, accuracy, and performance.

To excel in this role, you should hold a Bachelor's or Master's degree in Computer Science, AI, or a related field, with at least 5 years of experience in AI, SaaS, Cloud, or IoT development. You must possess specialized expertise in computer vision, particularly in object detection, tracking, and segmentation, and be proficient in tools such as OpenCV, TensorFlow, PyTorch, and modern MLOps workflows. Familiarity with Azure Cloud services and experience in building image data pipelines and multi-head CV models are essential. A startup mindset, strong collaboration skills, and the ability to work in fast-paced environments are key attributes we are looking for in candidates.

While not mandatory, a background in AgriTech, Smart Automation, or IoT ecosystems, along with knowledge of global data compliance, cybersecurity, and AI governance best practices, would be advantageous. Experience mentoring junior engineers or leading small teams is a plus.

If you are passionate about driving innovation in sustainable agriculture and eager to be part of a dynamic team shaping the future of food production, we invite you to join us at INNOFarms.AI. Your contribution will not only involve writing code but will also be a driving force behind the AgriTech revolution. Take the first step towards transforming the agriculture industry by applying now and becoming a catalyst for positive change in global food systems.
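A standard building block of the object-detection work this role describes is intersection-over-union (IoU), the overlap metric used to match predicted bounding boxes to ground truth during evaluation and tracking. A minimal sketch (illustrative only, not INNOFarms.AI's actual pipeline; boxes are assumed to be (x1, y1, x2, y2) pixel coordinates):

```python
# Intersection-over-union for two axis-aligned boxes (x1, y1, x2, y2).
def iou(box_a, box_b):
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Intersection rectangle: overlap of the two boxes (may be empty).
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    union = area_a + area_b - inter
    return inter / union if union else 0.0

print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # 25 / 175 ≈ 0.1429
```

Detection evaluation typically counts a prediction as correct when its IoU with a ground-truth box exceeds a threshold such as 0.5.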

Posted 4 days ago

Apply

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Description GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/)

Job Summary:
Leads projects for the design, development and maintenance of a data and analytics platform. Effectively and efficiently processes, stores and makes data available to analysts and other consumers. Works with key business stakeholders, IT experts and subject-matter experts to plan, design and deliver optimal analytics and data science solutions. Works on one or many product teams at a time.

Key Responsibilities:
- Designs and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured).
- Designs and implements a framework to continuously monitor and troubleshoot data quality and data integrity issues.
- Implements data governance processes and methods for managing metadata, access and retention of data for internal and external users.
- Designs and provides guidance on building reliable, efficient, scalable, quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages.
- Designs and implements physical data models to define the database structure; optimizes database performance through efficient indexing and table relationships.
- Participates in optimizing, testing and troubleshooting of data pipelines.
- Designs, develops and operates large-scale data storage and processing solutions using distributed and cloud-based platforms (e.g. Data Lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB).
- Uses innovative and modern tools, techniques and architectures to partially or completely automate the most common, repeatable and tedious data preparation and integration tasks, minimizing manual and error-prone processes and improving productivity.
- Assists with renovating the data management infrastructure to drive automation in data integration and management.
- Ensures the timeliness and success of critical analytics initiatives by using agile development technologies such as DevOps, Scrum and Kanban.
- Coaches and develops less experienced team members.

Competencies:
- System Requirements Engineering: Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule and resources; creates and maintains information linkages to related artifacts.
- Collaborates: Building partnerships and working collaboratively with others to meet shared objectives.
- Communicates effectively: Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences.
- Customer focus: Building strong customer relationships and delivering customer-centric solutions.
- Decision quality: Making good and timely decisions that keep the organization moving forward.
- Data Extraction: Performs extract-transform-load (ETL) activities on data from a variety of sources and transforms it for consumption by downstream applications and users, using appropriate tools and technologies.
- Programming: Creates, writes and tests computer code, test scripts and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements.
- Quality Assurance Metrics: Applies the science of measurement to assess whether a solution meets its intended outcomes, using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product.
- Solution Documentation: Documents information and solutions based on knowledge gained during product development activities; communicates to stakeholders to enable improved productivity and effective knowledge transfer to others who were not part of the initial learning.
- Solution Validation Testing: Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements.
- Data Quality: Identifies, understands and corrects flaws in data to support effective information governance across operational business processes and decision making.
- Problem Solving: Solves problems, and may mentor others on effective problem solving, using a systematic analysis process and industry-standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies systemic root causes and ensures actions to prevent recurrence are implemented.
- Values differences: Recognizing the value that different perspectives and cultures bring to an organization.

Education, Licenses, Certifications:
College, university, or equivalent degree in a relevant technical discipline, or relevant equivalent experience, required. This position may require licensing for compliance with export controls or sanctions regulations.

Experience:
Intermediate experience in a relevant discipline area is required. Knowledge of the latest technologies and trends in data engineering is highly preferred, including:
- 5-8 years of experience
- Familiarity with analyzing complex business systems, industry requirements, and/or data regulations
- Background in processing and managing large data sets
- Design and development for a Big Data platform using open-source and third-party tools
- Spark, Scala/Java, MapReduce, Hive, HBase and Kafka, or equivalent college coursework
- SQL query language
- Clustered compute cloud-based implementation experience
- Experience developing applications requiring large file movement in a cloud-based environment, plus other data extraction tools and methods for a variety of sources
- Experience in building analytical solutions

Intermediate experience in the following is preferred:
- IoT technology
- Agile software development

Qualifications:
1) Work closely with the business Product Owner to understand the product vision.
2) Play a key role across DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into the Cummins Digital Core (Azure Data Lake, Snowflake).
3) Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment of DBU project data pipeline design standards.
4) Independently design, develop, test and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and the Data Lake.
5) Responsible for creation, maintenance and management of DBU Data & Analytics data engineering documentation and standard operating procedures (SOPs).
6) Take part in the evaluation of new data tools and POCs, and provide suggestions.
7) Take full ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization.
8) Proactively address and resolve issues that compromise data accuracy and usability.

Preferred Skills:
- Programming Languages: Proficiency in languages such as Python, Java and/or Scala.
- Database Management: Expertise in SQL and NoSQL databases.
- Big Data Technologies: Experience with Hadoop, Spark, Kafka and other big data frameworks.
- Cloud Services: Experience with Azure, Databricks and AWS cloud platforms.
- ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes.
- Data Replication: Working knowledge of replication technologies such as Qlik Replicate is a plus.
- API: Working knowledge of APIs to consume data from ERP and CRM systems.

Job: Systems/Information Technology
Organization: Cummins Inc.
Role Category: Remote
Job Type: Exempt - Experienced
ReqID: 2417810
Relocation Package: Yes
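The ETL pipelines with data-quality monitoring that this role centers on can be sketched in miniature as an extract-transform-load flow that rejects records failing basic checks. This is a hedged illustration only (field names and validation rules are assumed examples, not Cummins standards):

```python
# Minimal ETL sketch: extract rows, validate/normalize them, load the
# good ones into a target, and keep rejected rows for quality reporting.

def extract(source):
    """Pull raw rows from a source (here, any iterable of dicts)."""
    return list(source)

def transform(rows):
    """Normalize types; reject rows missing required fields."""
    good, rejected = [], []
    for row in rows:
        if row.get("id") is None or row.get("amount") is None:
            rejected.append(row)  # data-quality failure
        else:
            good.append({"id": int(row["id"]),
                         "amount": round(float(row["amount"]), 2)})
    return good, rejected

def load(rows, target):
    """Append transformed rows to the target store."""
    target.extend(rows)
    return len(rows)

source = [{"id": "1", "amount": "10.50"}, {"id": None, "amount": "3"}]
warehouse = []
good, bad = transform(extract(source))
load(good, warehouse)
print(warehouse)  # [{'id': 1, 'amount': 10.5}]
print(len(bad))   # 1 rejected row
```

Production ETL/ELT tools add scheduling, incremental loads, and alerting on the rejected-row rate, but the extract → transform → load shape is the same.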

Posted 4 days ago

Apply

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Description GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/)

Job Summary:
Leads projects for the design, development and maintenance of a data and analytics platform. Effectively and efficiently processes, stores and makes data available to analysts and other consumers. Works with key business stakeholders, IT experts and subject-matter experts to plan, design and deliver optimal analytics and data science solutions. Works on one or many product teams at a time.

Key Responsibilities:
- Designs and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured).
- Designs and implements a framework to continuously monitor and troubleshoot data quality and data integrity issues.
- Implements data governance processes and methods for managing metadata, access and retention of data for internal and external users.
- Designs and provides guidance on building reliable, efficient, scalable, quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages.
- Designs and implements physical data models to define the database structure; optimizes database performance through efficient indexing and table relationships.
- Participates in optimizing, testing and troubleshooting of data pipelines.
- Designs, develops and operates large-scale data storage and processing solutions using distributed and cloud-based platforms (e.g. Data Lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB).
- Uses innovative and modern tools, techniques and architectures to partially or completely automate the most common, repeatable and tedious data preparation and integration tasks, minimizing manual and error-prone processes and improving productivity.
- Assists with renovating the data management infrastructure to drive automation in data integration and management.
- Ensures the timeliness and success of critical analytics initiatives by using agile development technologies such as DevOps, Scrum and Kanban.
- Coaches and develops less experienced team members.

Competencies:
- System Requirements Engineering: Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule and resources; creates and maintains information linkages to related artifacts.
- Collaborates: Building partnerships and working collaboratively with others to meet shared objectives.
- Communicates effectively: Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences.
- Customer focus: Building strong customer relationships and delivering customer-centric solutions.
- Decision quality: Making good and timely decisions that keep the organization moving forward.
- Data Extraction: Performs extract-transform-load (ETL) activities on data from a variety of sources and transforms it for consumption by downstream applications and users, using appropriate tools and technologies.
- Programming: Creates, writes and tests computer code, test scripts and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements.
- Quality Assurance Metrics: Applies the science of measurement to assess whether a solution meets its intended outcomes, using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product.
- Solution Documentation: Documents information and solutions based on knowledge gained during product development activities; communicates to stakeholders to enable improved productivity and effective knowledge transfer to others who were not part of the initial learning.
- Solution Validation Testing: Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements.
- Data Quality: Identifies, understands and corrects flaws in data to support effective information governance across operational business processes and decision making.
- Problem Solving: Solves problems, and may mentor others on effective problem solving, using a systematic analysis process and industry-standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies systemic root causes and ensures actions to prevent recurrence are implemented.
- Values differences: Recognizing the value that different perspectives and cultures bring to an organization.

Education, Licenses, Certifications:
College, university, or equivalent degree in a relevant technical discipline, or relevant equivalent experience, required. This position may require licensing for compliance with export controls or sanctions regulations.

Experience:
Intermediate experience in a relevant discipline area is required. Knowledge of the latest technologies and trends in data engineering is highly preferred, including:
- 5-8 years of experience
- Familiarity with analyzing complex business systems, industry requirements, and/or data regulations
- Background in processing and managing large data sets
- Design and development for a Big Data platform using open-source and third-party tools
- Spark, Scala/Java, MapReduce, Hive, HBase and Kafka, or equivalent college coursework
- SQL query language
- Clustered compute cloud-based implementation experience
- Experience developing applications requiring large file movement in a cloud-based environment, plus other data extraction tools and methods for a variety of sources
- Experience in building analytical solutions

Intermediate experience in the following is preferred:
- IoT technology
- Agile software development

Qualifications:
1) Work closely with the business Product Owner to understand the product vision.
2) Play a key role across DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into the Cummins Digital Core (Azure Data Lake, Snowflake).
3) Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment of DBU project data pipeline design standards.
4) Independently design, develop, test and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and the Data Lake.
5) Responsible for creation, maintenance and management of DBU Data & Analytics data engineering documentation and standard operating procedures (SOPs).
6) Take part in the evaluation of new data tools and POCs, and provide suggestions.
7) Take full ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization.
8) Proactively address and resolve issues that compromise data accuracy and usability.

Preferred Skills:
- Programming Languages: Proficiency in languages such as Python, Java and/or Scala.
- Database Management: Expertise in SQL and NoSQL databases.
- Big Data Technologies: Experience with Hadoop, Spark, Kafka and other big data frameworks.
- Cloud Services: Experience with Azure, Databricks and AWS cloud platforms.
- ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes.
- Data Replication: Working knowledge of replication technologies such as Qlik Replicate is a plus.
- API: Working knowledge of APIs to consume data from ERP and CRM systems.

Job: Systems/Information Technology
Organization: Cummins Inc.
Role Category: Remote
Job Type: Exempt - Experienced
ReqID: 2417809
Relocation Package: Yes

Posted 4 days ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

We are Cooper Lighting Solutions. We build forward-thinking lighting solutions that make people’s lives safer, while making buildings, homes and cities smarter and more sustainable. We deliver an industry-leading portfolio of indoor and outdoor lighting, lighting controls and smart lighting systems. Cooper Lighting Solutions is a business unit of Signify, the world leader in lighting. Together, we have a shared purpose to unlock the extraordinary potential of light for brighter lives and a better world.

Working as a Full Stack Engineer at Signify is an exciting and dynamic opportunity. You will be working with the latest back-end technologies on IoT platforms. Our globally diverse team, made up of individuals from various backgrounds and nationalities, thrives on peer learning, reinforcing the idea that we are stronger together.

We are looking for a Full Stack Engineer who will take responsibility for understanding product requirements, as well as the design, development and testing of back-end software for our lighting products. As part of the Signify Connected Systems team, your role will involve building IoT-based smart lighting products, so problem-solving skills, a sense of ownership, accountability and a strong drive for results are essential.

What You’ll Do:
- Enhance existing software and develop new software solutions that facilitate the launch of new enterprise products, functionalities and services using various web technologies.
- Follow and improve software guidelines, UI design patterns and style guides.
- Collaborate with other software development team members on large projects.
- Translate requirements into software that aligns with the project vision.
- Participate in defining both the functional and non-functional requirements of the system.
- Contribute to sprint planning, demos and retrospectives.
- Engage in design and code reviews.

What You’ll Need:
- Bachelor's degree in engineering with 3 to 6 years of experience developing web-based applications using J2EE, Spring and Spring Boot, as well as Angular 2+. Demonstrated experience leading a team is essential.
- Proficiency in producing and consuming REST web services, and familiarity with microservices architecture.
- Strong understanding of web application architecture, and proficiency in the Spring framework and Linux.
- Excellent knowledge of database concepts, data modelling tools and relational DBMSs; experience working with MySQL, PostgreSQL and Tomcat.
- Skill in creating cross-browser user interfaces using HTML5, CSS3, JavaScript and JavaScript frameworks (such as Angular, jQuery and Bootstrap), along with JSON and AJAX. Knowledge of RESTful web services is required.
- Experience with LESS, Sass, Gulp or Grunt is a plus. A thorough understanding of the Document Object Model (DOM) is also required.
- Good grasp of Agile development methodologies, as well as UML, SDL and CMM.
- Hands-on experience implementing automated testing for JavaScript technologies, including unit tests and functional tests using frameworks such as Jasmine, Protractor and Karma.

What You’ll Get In Return:
- Competitive salary depending on experience
- An extensive set of tools to drive your career, such as a personalized learning platform and free training and coaching
- Opportunity to buy Signify products at a discount

What We Promise:
We’re committed to the continuous development of our employees, using our learning to shape the future of light and create a sustainable future. Join the undisputed leader in the lighting industry and be part of our diverse global team.
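Consuming a REST web service, as the role requires, often means walking a paginated JSON resource until it is exhausted. A minimal language-neutral sketch (the page-based endpoint shape is an invented example; real APIs vary, and an HTTP client would replace the stand-in function):

```python
import json

def fetch_all(fetch_page):
    """Collect items from a page-based REST API until an empty page."""
    items, page = [], 1
    while True:
        body = json.loads(fetch_page(page))  # fetch_page returns JSON text
        if not body["items"]:
            return items
        items.extend(body["items"])
        page += 1

# Stand-in for an HTTP client: returns the JSON body for a given page.
def fake_page(page):
    data = {1: [1, 2], 2: [3], 3: []}
    return json.dumps({"items": data.get(page, [])})

print(fetch_all(fake_page))  # [1, 2, 3]
```

The same loop works whether the pages come from a Spring Boot service, a microservice behind a gateway, or a test double as here.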

Posted 4 days ago

Apply

4.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Description GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/) Job Summary Supports, develops and maintains a data and analytics platform. Effectively and efficiently process, store and make data available to analysts and other consumers. Works with the Business and IT teams to understand the requirements to best leverage the technologies to enable agile data delivery at scale. Key Responsibilities Implements and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured). Implements methods to continuously monitor and troubleshoot data quality and data integrity issues. Implements data governance processes and methods for managing metadata, access, retention to data for internal and external users. Develops reliable, efficient, scalable and quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages. Develops physical data models and implements data storage architectures as per design guidelines. Analyzes complex data elements and systems, data flow, dependencies, and relationships in order to contribute to conceptual physical and logical data models. Participates in testing and troubleshooting of data pipelines. Develops and operates large scale data storage and processing solutions using different distributed and cloud based platforms for storing data (e.g. Data Lakes, Hadoop, Hbase, Cassandra, MongoDB, Accumulo, DynamoDB, others). Uses agile development technologies, such as DevOps, Scrum, Kanban and continuous improvement cycle, for data driven application. 
Responsibilities Competencies: System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts. Collaborates - Building partnerships and working collaboratively with others to meet shared objectives. Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences. Customer focus - Building strong customer relationships and delivering customer-centric solutions. Decision quality - Making good and timely decisions that keep the organization moving forward. Data Extraction - Performs data extract-transform-load (ETL) activities from variety of sources and transforms them for consumption by various downstream applications and users using appropriate tools and technologies. Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product. 
Solution Documentation - Documents information and solution based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning. Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements. Data Quality - Identifies, understands and corrects flaws in data that supports effective information governance across operational business processes and decision making. Problem Solving - Solves problems and may mentor others on effective problem solving by using a systematic analysis process by leveraging industry standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem reoccurrence are implemented. Values differences - Recognizing the value that different perspectives and cultures bring to an organization. Education, Licenses, Certifications College, university, or equivalent degree in relevant technical discipline, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations. Experience 4-5 Years of experience. Relevant experience preferred such as working in a temporary student employment, intern, co-op, or other extracurricular team activities. 
Knowledge of the latest technologies in data engineering is highly preferred and includes: Exposure to Big Data open source SPARK, Scala/Java, Map-Reduce, Hive, Hbase, and Kafka or equivalent college coursework SQL query language Clustered compute cloud-based implementation experience Familiarity developing applications requiring large file movement for a Cloud-based environment Exposure to Agile software development Exposure to building analytical solutions Exposure to IoT technology Qualifications Work closely with business Product Owner to understand product vision. 2) Participate in DBU Data & Analytics Power Cells to define, develop data pipelines for efficient data transport into Cummins Digital Core ( Azure DataLake, Snowflake). 3) Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment of DBU project data pipeline design standards. 4) Work under limited supervision to design, develop, test, implement complex data pipelines from transactional systems (ERP, CRM) to Datawarehouses, DataLake. 5) Responsible for creation of DBU Data & Analytics data engineering documentation and standard operating procedures (SOP) with guidance and help from senior data engineers. 6) Take part in evaluation of new data tools, POCs with guidance and help from senior data engineers. 7) Take ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization under limited supervision. 8) Assist to resolve issues that compromise data accuracy and usability. Programming Languages: Proficiency in languages such as Python, Java, and/or Scala. Database Management: Intermediate level expertise in SQL and NoSQL databases. Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks. Cloud Services: Experience with Azure, Databricks and AWS cloud platforms. ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes. 
APIs: Working knowledge of APIs for consuming data from ERP and CRM systems.

Job: Systems/Information Technology
Organization: Cummins Inc.
Role Category: Remote
Job Type: Exempt - Experienced
ReqID: 2417808
Relocation Package: Yes
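The pipeline duties described above (moving data from transactional systems such as ERP/CRM into a warehouse or data lake) can be sketched in miniature with Python's standard library. This is an illustrative Extract-Transform-Load sketch only; real pipelines at this scale would use tools like Spark, Databricks, or Snowflake, and the table and column names below are hypothetical.

```python
import sqlite3

# Hypothetical miniature ETL: extract rows from a transactional
# (CRM-style) source table, transform them, and load them into a
# warehouse-style fact table. All names here are illustrative.

def run_etl(conn: sqlite3.Connection) -> int:
    cur = conn.cursor()

    # Extract: pull raw orders from the transactional source table.
    rows = cur.execute(
        "SELECT order_id, customer, amount_usd FROM crm_orders"
    ).fetchall()

    # Transform: normalize customer names and round amounts.
    transformed = [
        (order_id, customer.strip().title(), round(amount, 2))
        for order_id, customer, amount in rows
    ]

    # Load: insert the cleaned rows into the warehouse fact table.
    cur.executemany(
        "INSERT INTO fact_orders (order_id, customer, amount_usd) "
        "VALUES (?, ?, ?)",
        transformed,
    )
    conn.commit()
    return len(transformed)

# In-memory database standing in for both source and target systems.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE crm_orders (order_id INTEGER, customer TEXT, amount_usd REAL);
    CREATE TABLE fact_orders (order_id INTEGER, customer TEXT, amount_usd REAL);
    INSERT INTO crm_orders VALUES (1, '  acme corp ', 1200.5), (2, 'globex', 99.999);
""")
loaded = run_etl(conn)
```

The same extract/transform/load separation carries over to distributed frameworks: the extract step becomes a Spark or Data Factory source reader, the transform a DataFrame operation, and the load a writer into the lake or warehouse.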

Posted 4 days ago

Apply

4.0 - 10.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Join our Team

About the Role: Join a dynamic Telecom Operations team as a Change Manager, responsible for safeguarding the stability and performance of complex telecom networks. You’ll lead Change Advisory Board (CAB) activities, manage end-to-end change processes across RAN, Core, Transmission, and IP networks, and ensure compliance with ITIL standards and Ericsson/customer governance.

Key Responsibilities:
Lead CAB/CCB Activities: Organize and drive CAB meetings, assess change requests, and act as a key decision-maker for critical/emergency changes.
Impact & Risk Analysis: Evaluate potential service disruptions (voice, data, IoT, 5G) and ensure KPI compliance (latency, call drop rate, etc.).
Process Compliance: Ensure all changes follow SOPs, are tracked via ITSM tools (ServiceNow, Remedy), and meet audit and SLA standards.
Reporting & Automation: Deliver accurate post-change reports and support automation through Excel VBA, Python, or BI tools.
Cross-Team Coordination: Work with Field Ops, GNOC, MSIP, and multi-vendor teams (Ericsson, Nokia, Huawei, Cisco) for seamless change execution.
Security Compliance: Align changes with ISO 27001/NIST guidelines and enforce access controls and logging.

Qualifications:
Education: Bachelor’s in Electronics, Telecom, or Computer Engineering.
Experience: 4-10 years in Telecom Ops or Change Management; OSS/BSS and NFV exposure is a plus.
Certifications: ITIL Foundation (mandatory); CCNA/JNCIA/Ericsson training preferred.
Technical Skills: Proficient in telecom architecture, protocols (IP, MPLS, SNMP), and ticketing/reporting tools.
Soft Skills: Strong communication (English/Hindi), stakeholder management, and decision-making under pressure.
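The KPI-compliance and post-change reporting duties above can be illustrated with a small Python sketch. The threshold values and field names here are hypothetical placeholders, not actual Ericsson or customer-agreed limits; in practice such checks would pull live counters from OSS tools and feed a BI dashboard.

```python
# Hypothetical post-change KPI check: flag any change whose measured
# KPIs breach agreed thresholds. Thresholds below are illustrative only.

THRESHOLDS = {
    "latency_ms": 50.0,        # max acceptable latency after the change
    "call_drop_rate_pct": 0.5, # max acceptable call drop rate
}

def post_change_report(changes):
    """Return (change_id, breached_kpis) for every non-compliant change."""
    report = []
    for change in changes:
        breached = [
            kpi for kpi, limit in THRESHOLDS.items()
            if change["kpis"].get(kpi, 0.0) > limit
        ]
        if breached:
            report.append((change["id"], breached))
    return report

# Two illustrative change records with their measured post-change KPIs.
changes = [
    {"id": "CHG001", "kpis": {"latency_ms": 42.0, "call_drop_rate_pct": 0.3}},
    {"id": "CHG002", "kpis": {"latency_ms": 75.0, "call_drop_rate_pct": 0.2}},
]
violations = post_change_report(changes)
```

A report like this is the kind of artifact the role would automate (via Python, Excel VBA, or a BI tool) so the CAB can see at a glance which changes need rollback or follow-up.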

Posted 4 days ago

Apply

5.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Infineon Technologies Ahmedabad, Gujarat, India Posted on Jul 31, 2025 Apply now

As a Software Engineer, your responsibilities include collaborating on and developing software systems that address specific business problems within our organization. As a developer, you will play a key role in writing quality code and building quality products. Ultimately, you will work within the organization to identify problems and then partner with the internal team to address them with innovative software solutions.

Job Description
In your new role you will need:
A bachelor’s or master’s degree in a relevant field such as computer science or software engineering
At least 5+ years of work experience as a product developer or in a similar role
Proficiency in programming languages (C#, .NET, Python, AngularJS, VueJS, ReactJS, etc.)
In-depth understanding of coding languages
Hands-on programming and debugging skills
Sound knowledge of various operating systems and databases
Familiarity with the semiconductor domain and its life cycle (nice to have)
Deep understanding of software development methodologies and systems design
Ability to work with cross-site teams
Demonstrated prior experience in putting together proofs of concept for technology demonstrations
Strategic thinking and problem-solving skills
Strong ability to communicate with the development team and management

Your Profile
You are best equipped for this task if you have experience:
Collaborating with fellow developers and architects
Developing and modifying software code as per the organization’s requirements
Working as part of the development team and adhering to industry-standard best practices
Providing guidance to fellow development team members
Being part of, and contributing to, the developer community
Evaluating and improving the tools and frameworks used in software development
Ensuring timely completion of development work
Collaborate with developers and other development teams to drive innovation. Inform Product Owners and Architects about any issues with the current technical solutions being implemented. Continually research current and emerging technologies and propose changes wherever needed. Your role would also require you to work with team members to identify issues and propose design and method improvements.

Contact: swati.gupta@infineon.com

#WeAreIn for driving decarbonization and digitalization. As a global leader in semiconductor solutions in power systems and IoT, Infineon enables game-changing solutions for green and efficient energy, clean and safe mobility, as well as smart and secure IoT. Together, we drive innovation and customer success, while caring for our people and empowering them to reach ambitious goals. Be a part of making life easier, safer and greener. Are you in?

We are on a journey to create the best Infineon for everyone. This means we embrace diversity and inclusion and welcome everyone for who they are. At Infineon, we offer a working environment characterized by trust, openness, respect and tolerance, and are committed to giving all applicants and employees equal opportunities. We base our recruiting decisions on the applicant’s experience and skills. Please let your recruiter know if there is anything they should pay special attention to.

Apply now See more open positions at Infineon Technologies

Posted 4 days ago

Apply

0 years

0 Lacs

Delhi, India

On-site

About The Company
Tata Communications redefines connectivity with innovation and intelligence. Driving the next level of intelligence powered by Cloud, Mobility, Internet of Things, Collaboration, Security, Media services and Network services, we at Tata Communications are envisaging a New World of Communications.

Job Description
Role Overview: We are an international telecommunications/commtech company providing services including international and domestic voice, MPLS, Internet transit, VoIP, IoT, Mobility, CPaaS, CaaS, managed security, cloud, subsea cable capacity, and other cutting-edge commtech services. We are seeking a dynamic regulatory professional to join our Legal & Regulatory Affairs team. The role is designed to provide end-to-end support across two key focus areas:
Dedicated regulatory advisory and support to the Product Office
Operational regulatory responsibilities, including litigation and compliance tracking

Key Responsibilities:
I. Regulatory Product Support
Act as the primary regulatory liaison to the Product Office, working closely with Product, Network, Legal, and Technology teams.
Provide regulatory risk assessments and compliance inputs during product conceptualization, design, and go-to-market stages.
Advise on applicable licensing requirements, statutory obligations, and service rollout norms.
Maintain a product-wise regulatory tracker and risk matrix.
Provide guidance on OSP (Other Service Provider) regulations and DoT guidelines, and support product and operations teams in adhering to them.
Liaise with the regulator for approvals and clarifications, if required.

II. Regulatory Operations
Assist in handling regulatory and legal proceedings before TRAI, DoT, TDSAT, and other bodies.
Support vetting of commercial documentation from a regulatory standpoint, including LOUs, LOAs, MSAs, customer agreements, etc.
Track changes to regulatory frameworks and draft internal briefing notes or impact memos.
Contribute to regulatory submissions, consultation responses, and DoT/TRAI filings.
Maintain internal documentation for OSP compliance and UL license requirements, and support any audits or inspections.
Provide support for RFPs, customer queries, or onboarding reviews from a regulatory perspective.

Skills & Competencies Required:
Strong understanding of Indian telecom regulations, licensing frameworks, and digital sector compliance requirements
Proven ability to interpret legal requirements and apply them to business scenarios, especially in product or technology contexts
Excellent drafting, documentation, and stakeholder coordination skills
Familiarity with regulatory bodies and procedures (e.g., TRAI, DoT, TDSAT)
Proactive mindset, attention to detail, and ability to balance long-term strategy with day-to-day execution

Posted 4 days ago

Apply