
105 Load Jobs - Page 5

JobPe aggregates job listings so they are easy to find and review, but you apply directly on the original job portal.

9.0 - 12.0 years

11 - 16 Lacs

Pune

Hybrid

So, what’s the role all about?
This position evaluates current and emerging technologies, collaborating with DevOps and other business units within the company to set and ensure implementation of best practices. The Cloud Network Engineer will work with AWS, Azure, GCP and other cloud providers, including inContact's private cloud environments. The position requires strong experience with cloud technologies (AWS, Azure and others) along with proven knowledge of cloud-specific networking, including VPC/VNET design, VPC/VNET peering, VPN Gateways, Cloud VPN, NAT Gateways, VGW, cloud load balancers, Security Groups, Traffic Manager, Direct Peering, Direct Connect/Cloud Interconnect/ExpressRoute and other cloud-related endpoints.

How will you make an impact?
- Research and evaluation of cloud technologies to address current and future needs.
- Establishment of repeatable design strategies and automation.
- CloudFormation updates using JSON/YAML, Resource Manager, Deployment Manager (see the CloudFormation sketch after this listing).
- Design review of fellow architects' and engineers' designs/implementation plans.
- Escalation point for help in deep analysis and problem solving when needed.
- May be required to function as a technical lead on projects of any size as necessary.
- Communicates detailed technical information in both written and verbal form across a wide range of audiences, including business stakeholders, users, developers, project management, and others.
- Collaborates with colleagues, customers, vendors, and other parties to understand and develop architectural solutions.
- Develops a sound understanding of existing systems and processes, their strengths and limitations, and the current and future needs of the environment in which they exist; provides vision on how they may be improved and developed.
- Understands and explains the interactions between systems, applications, and services within the environment, and evaluates the impact of changes or additions to the environment.
- Participates in the evaluation and/or selection of solutions or products, including requirements definition and vendor and product evaluations.
- Acts as a local expert for areas of domain expertise.
- Acts as an internal consultant across multiple business units, representing the Network team.
- Performs other duties as required.

Have you got what it takes?
- 8+ years of work experience within an internetworking environment.
- Experience with cloud technologies: AWS, Azure, GCP.
- Experience dealing with infrastructure as code.
- Scripting experience with JSON/YAML for CloudFormation.
- Expert-level experience with Palo Alto and F5 load balancers.
- Expert-level experience with network switching and routing.
- Extensive knowledge of networking technologies, topologies, and protocols (TCP/IP, BGP, OSPF, SNMP, Multicast, VRRP, HSRP, switching technologies).
- Expert-level understanding of internetworking technologies and tools, including TCP/IP, NetFlow/sFlow, access-control lists, policy routing, firewalls, peering, and DNS.
- Significant OSPF and BGP design, implementation, and troubleshooting experience.
- Experience with MPLS environments.
- Experience with authentication services such as TACACS+, RADIUS, and RSA SecurID.
- Working knowledge of IPv6.
- Intermediate Visio skills.
- Working knowledge of SIP.
- Experience working with management systems and SNMP.
- Excellent interpersonal, oral and written communication skills, along with prior experience in a dynamic, project-oriented team environment.
- A demonstrated history of learning new technologies and adapting them to solve complex problems.
- Strong attention to detail and good organizational skills.

What’s in it for you?
Join an ever-growing, market-disrupting, global company where the teams – comprised of the best of the best – work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NICEr!

Enjoy NICE-FLEX! At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Requisition ID: 7944
Reporting into: Manager, Cloud Operations
Role Type: Individual Contributor
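
As a point of reference for the "CloudFormation updates using JSON/YAML" and infrastructure-as-code items above, here is a minimal sketch in Python using boto3. It is not taken from the posting: the stack name, the small VPC/subnet template, and the CIDR ranges are illustrative assumptions, and the role does not prescribe boto3 or any particular deployment workflow.

```python
# Illustrative sketch only: deploy a tiny VPC CloudFormation stack with boto3.
# Stack name, template body and CIDR ranges are assumptions, not posting details.
import boto3

VPC_TEMPLATE = """
AWSTemplateFormatVersion: '2010-09-09'
Description: Illustrative VPC with a single subnet
Resources:
  DemoVpc:
    Type: AWS::EC2::VPC
    Properties:
      CidrBlock: 10.0.0.0/16
      EnableDnsSupport: true
      EnableDnsHostnames: true
  DemoSubnet:
    Type: AWS::EC2::Subnet
    Properties:
      VpcId: !Ref DemoVpc
      CidrBlock: 10.0.1.0/24
Outputs:
  VpcId:
    Value: !Ref DemoVpc
"""

def deploy_vpc_stack(stack_name: str = "demo-network-stack") -> str:
    """Create the stack (or update it if it already exists) and return the VPC id."""
    cfn = boto3.client("cloudformation")
    try:
        cfn.create_stack(StackName=stack_name, TemplateBody=VPC_TEMPLATE)
        waiter = cfn.get_waiter("stack_create_complete")
    except cfn.exceptions.AlreadyExistsException:
        cfn.update_stack(StackName=stack_name, TemplateBody=VPC_TEMPLATE)
        waiter = cfn.get_waiter("stack_update_complete")
    waiter.wait(StackName=stack_name)
    stack = cfn.describe_stacks(StackName=stack_name)["Stacks"][0]
    return next(o["OutputValue"] for o in stack["Outputs"] if o["OutputKey"] == "VpcId")

if __name__ == "__main__":
    print("VPC id:", deploy_vpc_stack())
```

In practice the template would be reviewed and versioned alongside peering, routing and security-group definitions; the sketch only shows the create/update/wait cycle the listing refers to.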

Posted Date not available

Apply

4.0 - 9.0 years

3 - 6 Lacs

Gurugram, Bengaluru

Work from Office

Job Summary: Supports, develops and maintains a data and analytics platform. Effectively and efficiently processes, stores and makes data available to analysts and other consumers. Works with the Business and IT teams to understand the requirements and best leverage the technologies to enable agile data delivery at scale.

Key Responsibilities:
- Implements and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured).
- Implements methods to continuously monitor and troubleshoot data quality and data integrity issues.
- Implements data governance processes and methods for managing metadata, access and retention of data for internal and external users.
- Develops reliable, efficient, scalable and quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages (see the PySpark sketch after this listing).
- Develops physical data models and implements data storage architectures as per design guidelines.
- Analyzes complex data elements and systems, data flow, dependencies, and relationships in order to contribute to conceptual, physical and logical data models.
- Participates in testing and troubleshooting of data pipelines.
- Develops and operates large-scale data storage and processing solutions using different distributed and cloud-based platforms for storing data (e.g. Data Lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB, others).
- Uses agile development technologies, such as DevOps, Scrum, Kanban and continuous improvement cycles, for data-driven applications.

External Qualifications and Competencies
Competencies:
- System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts.
- Collaborates - Building partnerships and working collaboratively with others to meet shared objectives.
- Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences.
- Customer focus - Building strong customer relationships and delivering customer-centric solutions.
- Decision quality - Making good and timely decisions that keep the organization moving forward.
- Data Extraction - Performs data extract-transform-load (ETL) activities from a variety of sources and transforms them for consumption by various downstream applications and users using appropriate tools and technologies.
- Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements.
- Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product.
- Solution Documentation - Documents information and solutions based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning.
- Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements.
- Data Quality - Identifies, understands and corrects flaws in data to support effective information governance across operational business processes and decision making.
- Problem Solving - Solves problems and may mentor others on effective problem solving by using a systematic analysis process, leveraging industry-standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem recurrence are implemented.
- Values differences - Recognizing the value that different perspectives and cultures bring to an organization.

Education, Licenses, Certifications: College, university, or equivalent degree in a relevant technical discipline, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations.

Experience: 4-5 years of experience. Relevant experience preferred, such as working in temporary student employment, intern, co-op, or other extracurricular team activities. Knowledge of the latest technologies in data engineering is highly preferred and includes:
- Exposure to Big Data open source - Spark, Scala/Java, MapReduce, Hive, HBase, and Kafka, or equivalent college coursework
- SQL query language
- Clustered compute cloud-based implementation experience
- Familiarity developing applications requiring large file movement in a cloud-based environment
- Exposure to Agile software development
- Exposure to building analytical solutions
- Exposure to IoT technology

Additional Responsibilities Unique to this Position:
1) Work closely with the business Product Owner to understand the product vision.
2) Participate in DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into Cummins Digital Core (Azure Data Lake, Snowflake).
3) Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment with DBU project data pipeline design standards.
4) Work under limited supervision to design, develop, test and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and the Data Lake.
5) Responsible for creation of DBU Data & Analytics data engineering documentation and standard operating procedures (SOPs) with guidance and help from senior data engineers.
6) Take part in the evaluation of new data tools and POCs with guidance and help from senior data engineers.
7) Take ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization under limited supervision.
8) Assist in resolving issues that compromise data accuracy and usability.

1. Programming Languages: Proficiency in languages such as Python, Java, and/or Scala.
2. Database Management: Intermediate-level expertise in SQL and NoSQL databases.
3. Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks.
4. Cloud Services: Experience with Azure, Databricks and AWS cloud platforms.
5. ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes.
6. API: Working knowledge of APIs to consume data from ERP and CRM systems.
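
Since the responsibilities above centre on ETL/ELT pipelines built with scripting languages and Spark, here is a minimal PySpark sketch of such a pipeline: extract a table from a transactional system over JDBC, apply a small transform with a basic data-quality filter, and load partitioned Parquet into a data-lake path. The JDBC URL, table name, column names and output path are illustrative assumptions, not details from the posting (which targets Azure Data Lake and Snowflake).

```python
# Illustrative ETL sketch, not an actual production pipeline. Source/target
# names and columns are placeholders chosen for the example.
from pyspark.sql import SparkSession, functions as F

def run_orders_pipeline(jdbc_url: str, output_path: str) -> None:
    spark = SparkSession.builder.appName("orders-etl-sketch").getOrCreate()

    # Extract: read a source table from the transactional system (ERP/CRM).
    orders = (
        spark.read.format("jdbc")
        .option("url", jdbc_url)            # e.g. jdbc:postgresql://host/erp
        .option("dbtable", "sales.orders")  # hypothetical source table
        .option("fetchsize", "10000")
        .load()
    )

    # Transform: de-duplicate, apply a simple data-quality gate, derive columns.
    cleaned = (
        orders.dropDuplicates(["order_id"])
        .filter(F.col("order_amount") >= 0)
        .withColumn("order_date", F.to_date("order_timestamp"))
        .withColumn("load_ts", F.current_timestamp())
    )

    # Load: write partitioned Parquet to the lake path (ADLS, S3, or local).
    cleaned.write.mode("overwrite").partitionBy("order_date").parquet(output_path)
    spark.stop()

if __name__ == "__main__":
    run_orders_pipeline("jdbc:postgresql://localhost/erp", "/tmp/lake/orders")
```

The monitoring, alerting and metadata management the listing asks for would sit around a job like this (for example, row-count checks before the write), but are omitted to keep the sketch short.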

Posted Date not available

Apply

4.0 - 5.0 years

6 - 10 Lacs

Pune

Work from Office

The full role description for this opening is identical to the Data Engineer listing above (same Job Summary, Key Responsibilities, Competencies, Experience, Additional Responsibilities and skills).

Posted Date not available

Apply

1.0 - 3.0 years

3 - 5 Lacs

Pune

Work from Office

What you'll do:
- Understand performance KPIs; develop and execute test scripts for one or more products in parallel (a simple load-test sketch follows this listing).
- Present and interpret findings, report defects and create detailed execution reports.
- Review the non-functional requirements with the various application streams to identify the requirements for load testing.
- Actively learn and use infrastructure and application performance monitoring tools to help analyze and isolate bottlenecks and perform analysis.
- Coordinate performance testing activities between various teams on shared infrastructure and in accordance with schedules, ensuring compliance and procedural requirements are adhered to.
- Work closely with application development, architecture, infrastructure and engineering groups for test and data planning, analysis, and for defining success criteria.
- Provide insight into performance bottlenecks, leveraging a variety of isolation techniques and tools.
- Design modular load test scripts for various scenarios to allow dynamic updates to the test design.
- Plan and execute custom-designed load tests as per requirements from various stakeholders.
- Identify and collect all performance data and create actionable analysis to help resolve performance bottlenecks.
- Develop strong knowledge of monitoring tools like Dynatrace and Splunk.
- Remain current with performance testing methodologies.
- ZS is a global firm; fluency in English is required.

What you'll bring:
- Bachelor's degree in CS or EE or equivalent.
- 1-3 years of quality engineering experience, primarily in performance, scalability and reliability testing, preferably in a cloud-based environment.
- Proficiency in object-oriented programming languages, preferably C#, .NET, Java or Python.
- Good knowledge of RDBMS, preferably SQL Server.
- Good knowledge of web applications and n-tier applications.
- Well versed in performance testing approaches and methodologies.
- Good knowledge of performance testing tools such as NeoLoad, VSTS, LoadRunner, etc.
- Exposure to performance monitoring and diagnostic tools such as SQL Profiler, PerfMon, Dynatrace, Splunk, New Relic, etc.
- Good communication and interpersonal skills.
- Experience with Agile/Scrum methodologies.
- Strong verbal, written, analytical, presentation and communication skills.
- Ability to multi-task and balance competing requirements and work.
- Effective organizational and problem-solving skills.
- Knowledge of cloud technologies such as AWS and big data platforms such as Spark and Hadoop is a plus.

Perks & Benefits: ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel: Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.

To Complete Your Application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE.
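
The listing names commercial tools such as NeoLoad and LoadRunner; purely as an illustration of the underlying idea (drive concurrent requests, collect latencies, compare percentiles against a non-functional requirement), here is a minimal Python sketch. The target URL, request count, concurrency, and p95 budget are assumptions for the example, not values from the posting.

```python
# Minimal load-test sketch: concurrent GETs, latency percentiles, simple pass/fail.
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

import requests  # third-party: pip install requests

TARGET_URL = "https://example.com/health"  # illustrative endpoint
TOTAL_REQUESTS = 200
CONCURRENCY = 20
P95_BUDGET_SECONDS = 0.5                   # assumed non-functional requirement

def timed_request(_: int) -> float:
    """Issue one GET and return its wall-clock latency in seconds."""
    start = time.perf_counter()
    response = requests.get(TARGET_URL, timeout=10)
    response.raise_for_status()
    return time.perf_counter() - start

def run_load_test() -> None:
    wall_start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
        latencies = list(pool.map(timed_request, range(TOTAL_REQUESTS)))
    elapsed = time.perf_counter() - wall_start

    p95 = statistics.quantiles(latencies, n=100)[94]  # 95th percentile
    print(f"requests: {TOTAL_REQUESTS}, throughput: {TOTAL_REQUESTS / elapsed:.1f} req/s")
    print(f"avg: {statistics.mean(latencies):.3f}s  p95: {p95:.3f}s  max: {max(latencies):.3f}s")
    print("PASS" if p95 <= P95_BUDGET_SECONDS else "FAIL: p95 over budget")

if __name__ == "__main__":
    run_load_test()
```

A real test plan would ramp load gradually, vary the scenario mix, and correlate results with server-side metrics from tools like Dynatrace or Splunk; the sketch only covers the client-side measurement loop.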

Posted Date not available

Apply

5.0 - 8.0 years

5 - 6 Lacs

Mundra

Work from Office

• Handling tanker loading/unloading.
• Raw material road tanker/ISO tanker unloading in plants.
• Unloading tankers in the CCOE tank farm area (critical raw material, etc.).
• Repacking material from ISO to ISO.
• Repacking material from IBC to ISO.

Posted Date not available

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click


Download the Mobile App

Instantly access job listings, apply easily, and track applications.
