
8 Python Scripts Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 8.0 years

2 - 5 Lacs

Bengaluru

Work from Office

Key Responsibilities:
- Develop and maintain automation scripts using the PyTest framework.
- Automate Windows application testing using WinAppDriver.
- Write basic Python scripts to support automation and test execution.
- Design and execute manual test cases and scenarios.
- Log, track, and report bugs and defects, ensuring clear documentation and communication with the development team.
- Participate in regular QA activities, including test planning, review, and reporting.
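For candidates new to the stack, the responsibilities above combine PyTest-style test functions with WinAppDriver-controlled UI automation. The sketch below shows the PyTest shape only; the `CalculatorApp` class and `make_app` helper are hypothetical stand-ins for what would, in a real setup, be an Appium/WinAppDriver session against a Windows application.

```python
# Minimal sketch of a PyTest-style test module. In a real WinAppDriver
# setup, make_app() would create an Appium webdriver session pointing at
# a Windows app; here a plain object stands in so the test shape is
# runnable anywhere. All names are illustrative, not from the posting.

class CalculatorApp:
    """Stand-in for a WinAppDriver-controlled application under test."""
    def add(self, a, b):
        return a + b

def make_app():
    # In practice this would be a pytest fixture yielding a remote
    # webdriver session against WinAppDriver's local endpoint.
    return CalculatorApp()

def test_addition():
    app = make_app()
    assert app.add(2, 3) == 5

def test_addition_negative():
    app = make_app()
    assert app.add(-1, 1) == 0
```

PyTest discovers `test_*` functions automatically and reports each plain `assert` failure with introspected values, which is why the posting pairs it with simple scripted checks.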

Posted 6 days ago

Apply

3.0 - 7.0 years

0 Lacs

Goa

On-site

You will contribute to enhancing the way people live and work by intelligently connecting energy systems, buildings, and industries. Smart infrastructure from Siemens plays a pivotal role in creating a more connected and caring world where resources are valued, sustainable energy is delivered reliably and efficiently, and the impact on the world is considered. This infrastructure offers the flexibility society needs to evolve and adapt to changing conditions. The convergence of technology and human ingenuity aims to harmonize with our surroundings and nurture our planet. Siemens' portfolio includes a wide range of grid control and automation solutions, low- and medium-voltage power distribution, building automation, fire safety, security systems, HVAC control, and energy solutions.

Your responsibilities will include:
- Validating and troubleshooting protection relay and distribution automation products.
- Designing test cases using power system simulators such as Omicron, OCC scripting, TMW, TCL scripts, and Python scripts.
- Working with network communication protocols such as Modbus, DNP3, IEC 60870-5-103, and IEC 61850.
- Product qualification and validation for embedded products and systems, using software configuration management tools, defect tracking tools, and peer review processes.
- Test automation using the Control Center framework or Selenium/Robot frameworks.
- Applying an understanding of voltage disturbances, power system reliability, grounding systems, and power system tolerances, along with strong documentation and writing skills.

This position is located in Goa, offering the opportunity to collaborate with teams whose work has a significant impact on entire cities, countries, and the future landscape. Siemens is a global company with a diverse workforce of over 379,000 people working across more than 200 countries. We are committed to fostering equality and encourage applications from candidates of all communities in terms of gender, LGBTQ+ identity, abilities, and ethnicity. At Siemens, employment decisions are based on qualifications, merit, and business requirements. If you are curious, imaginative, and eager to contribute to shaping the future, we invite you to join us on this journey.

To learn more about Smart Infrastructure, visit: https://new.siemens.com/global/en/company/topic-areas/smart-infrastructure.html. Explore career opportunities at Siemens at: www.siemens.com/careers.

Posted 1 week ago

Apply

2.0 - 4.0 years

5 - 7 Lacs

Chennai

Work from Office

Job Description: We are looking for a motivated Junior Python Developer with 2-3 years of experience to join our growing development team. The ideal candidate will have a foundational understanding of Python development and some experience managing and optimizing scripts for data loads. Familiarity with Python web technologies, including FastAPI, will be beneficial. As a Junior Python Developer, you will work alongside senior developers to enhance the performance and scalability of our applications and support our ongoing projects.

Key Responsibilities:
- Develop and maintain Python scripts for automating tasks and supporting internal operations.
- Assist in building Python applications using FastAPI and other web technologies.
- Contribute to the management and optimization of scripts for efficient data loads.
- Collaborate with senior developers and cross-functional teams to implement features and services.
- Participate in code reviews and ensure adherence to best coding practices.
- Assist in debugging and troubleshooting application issues.
- Contribute to testing and validating code to ensure functionality and performance.
- Document development work, processes, and code changes.
- Continuously learn and stay updated with emerging technologies and industry trends.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent experience.
- 1-2 years of professional experience in Python development.
- Basic understanding of Python and experience building Python-based applications.
- Familiarity with FastAPI or similar Python web frameworks.
- Understanding of managing and optimizing Python scripts for data loads is a plus.
- Familiarity with relational and NoSQL databases (e.g., PostgreSQL, MongoDB) is a plus.
- Knowledge of version control tools (e.g., Git).
- Ability to learn quickly, be a team player, and contribute positively to a collaborative environment.
- Strong problem-solving and troubleshooting skills.
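The "scripts for data loads" this role describes typically means processing files too large to hold in memory at once. A minimal, stdlib-only sketch of that pattern, reading a CSV in fixed-size batches; the file contents and column names here are invented for illustration:

```python
# Illustrative data-load sketch (not from the posting): stream a CSV in
# fixed-size batches so memory stays bounded on large files.
import csv
import io

def load_in_batches(fileobj, batch_size=2):
    """Yield lists of row dicts, batch_size rows at a time."""
    reader = csv.DictReader(fileobj)
    batch = []
    for row in reader:
        batch.append(row)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch  # final partial batch

# Hypothetical sample data standing in for a large input file.
sample = io.StringIO("id,name\n1,a\n2,b\n3,c\n")
batches = list(load_in_batches(sample, batch_size=2))
```

In production the same generator shape feeds a database bulk-insert or an API call per batch instead of building a list.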

Posted 2 weeks ago

Apply

10.0 - 15.0 years

0 - 0 Lacs

Navi Mumbai

Work from Office

Experience: 10 years
Location: Onsite - Navi Mumbai
Education: Any Science/B.Sc./B.Tech or Diploma in Computer Science

We are looking for a motivated Storage Engineer to join the Managed Services operations team serving our banking customers. You will provide Level 3 support by responding to incidents, working on service requests, problem management, and infrastructure improvements, and by planning and performing NetApp ONTAP upgrades. Experience supporting a varied clientele is essential, as is a strong understanding of NetApp storage technologies.

Job Requirements: To be successful in this role, you would need the following:
- Experience with NetApp storage: deploying and administering CDOT and 7-Mode, troubleshooting performance issues, performing firmware and operating system upgrades, using storage management tools (OnCommand Unified Manager, Performance Manager, Insight), and managing backup integration with NetApp technologies.
- Proficient understanding of NetApp storage concepts, related management tools, and related storage technologies (Snapshots, controller failover, provisioning, deduplication, cloning, SnapMirror, IntelliSnap), including monitoring and troubleshooting, administration, replication, security hardening, and performance tuning.
- Experience with other NetApp storage technologies such as StorageGRID, E-Series, and SolidFire would be highly regarded.
- Experience with automation tools such as NetApp Workflow Automation and Ansible.
- Experience writing basic PowerShell/Python scripts would be beneficial.
- Strong understanding of storage protocols and technologies (Fibre Channel, CIFS, NFS, iSCSI, and S3 object storage).
- Data migration from third-party storage to NetApp, and host/application-level migration.
- Expertise in Brocade or Cisco SAN switch administration, troubleshooting SAN connectivity, SAN zoning, and SAN switch firmware upgrades.
- Strong knowledge of FC- and IP-based storage replication.
- Knowledge of ITIL methodologies: change, incident, problem, and configuration management.
- Working knowledge of monitoring platforms, WANs, SANs, and backup and disaster recovery platforms.
- Experience in Windows and Linux system administration.
- Strong customer communication and documentation skills.
- Understanding of other components in the infrastructure stack, including compute, virtualization, and networks (NetApp/Cisco FlexPod architecture, Cisco UCS platform, IBM BladeCenter, System x, VMware vSphere, Microsoft operating systems and applications, Commvault).
- Ability to build and maintain strong relationships with internal and external stakeholders.
- Infrastructure design, build, deployment, and disaster recovery testing.
- A positive, proactive, team-oriented attitude with a flexible and innovative approach to work.

Education: Graduate, preferably an engineering degree, with a minimum of 10 years of storage experience. NetApp certification would be an added advantage; ITIL Foundation certification is desirable.

Posted 2 weeks ago

Apply

10.0 - 14.0 years

22 - 37 Lacs

Bengaluru

Work from Office

Principal Responsibilities:
- One of the primary responsibilities is IOC sweeps, blocks, and investigation of hits, and assisting with automating this task. The end goal is for IR to receive high-fidelity true-positive hits, and for the person in this role to assess trends in IOC hits and feed intel to the threat hunt workstream so hunts can be prioritized on those threat actors. While working toward IOC sweep automation, escalates to hunters when hits are determined to be true positives and remediation actions are required, or when advanced analysis is needed.
- Daily CISO report (CTI input): this report is sent daily to our CISO and other senior leadership/workstreams covering daily CTI news and its relevance to KPMG. The person in this role will be responsible for it.
- Assist the U.S. CTI workstream SME with alerts and investigations from CTI tools. We prefer experience with CTI tools such as ZeroFox (brand abuse/leaked credential investigations), Flashpoint (deep and dark web investigations), and DomainTools (domain/web investigations), and with a Threat Intelligence Platform (TIP) such as ThreatQ.
- Assist with the assessment of the firm's top 10 threat actors/malware to prioritize assessments and hunts.
- Research and develop risk-mitigating approaches and drive response and remediation.
- Document processes and procedures in the form of playbooks and reference guides.
- Stay abreast of the latest information security controls, practices, techniques, and capabilities in the marketplace.
- Lead internal skills development for information security personnel on cyber threat intelligence, by mentoring and conducting knowledge-sharing sessions.
- Provide input to business cases and presentations to senior IT leadership on proposed security products and studies.
- Produce operating metrics and key performance indicators.
- Apply knowledge of all phases of the incident response life cycle: analysis, containment, eradication, remediation, and recovery.
- Evaluate external threat intelligence sources related to zero-day attacks, exploit kits, and malware to determine organizational risk.

Qualifications:
- Knowledge of and experience in automating tasks (creating logic apps and PowerShell/Python scripts to automate workflows); this is a highly desirable skill set.
- Experience in security monitoring, security operations, and incident response activities, preferably within a professional services firm or similar environment.
- Strong knowledge of incident response and crisis management; ability to identify both tactical and strategic solutions.
- Knowledge of and background with Snort rules (reading and/or writing them).
- Knowledge of Microsoft KQL (writing queries and creating workbooks is highly desirable).
- Experience with IT process definition and/or improvement.
- Ability to coordinate with, work with, and gain the trust of business stakeholders, technical resources, and third-party vendors.
- Strong verbal and written communication, with the ability to interact effectively with individuals at all levels of responsibility and authority.
- Must be able to prioritize, delegate, and foster the development of high-performance teams to lead and support an environment driven by customer service and teamwork.
- Strong troubleshooting and organizational skills, with the ability to work on multiple projects simultaneously.
- Ability to participate in resource planning processes based on defined organizational plans.
- Experience defining security monitoring rules, monitoring events, assessing risk, responding to incidents, and providing security oversight of the security features of IT tools supported by the IT operations teams.
- Experience developing and using SIEM queries for investigating IOCs within the network.
- Experience conducting analysis based on deep and dark web intelligence.
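The IOC-sweep automation this role centers on reduces, at its simplest, to matching known indicators against telemetry and surfacing only the hits. A minimal, hypothetical sketch of that core loop; the indicator values and log lines are made up, and a real sweep would query a SIEM rather than a Python list:

```python
# Hypothetical IOC-sweep sketch: match known indicators (domains,
# file hashes) against log lines and keep only the hits, so responders
# see candidate true positives first. All data here is invented.

IOCS = {"evil.example.com", "44d88612fea8a8f36de82e1278abb02f"}

def sweep(log_lines, iocs=IOCS):
    """Return (line_number, indicator) pairs where an IOC appears."""
    hits = []
    for lineno, line in enumerate(log_lines, start=1):
        for ioc in iocs:
            if ioc in line.lower():
                hits.append((lineno, ioc))
    return hits

logs = [
    "GET http://evil.example.com/payload.bin 200",
    "GET http://intranet.local/index.html 200",
]
hits = sweep(logs)
```

The production version of this logic lives in a TIP or SIEM query (e.g., KQL), with the hit list feeding trend analysis for the threat hunt workstream as the posting describes.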

Posted 3 weeks ago

Apply

5.0 - 8.0 years

22 - 30 Lacs

Noida, Hyderabad, Bengaluru

Hybrid

Role: Data Engineer
Experience: 5 to 8 years
Location: Bangalore, Noida, and Hyderabad (hybrid; 2 days per week in office required)
Notice period: Immediate to 15 days (immediate joiners preferred)
Note: Candidates must have experience in Python, Kafka Streams, PySpark, and Azure Databricks. We are not looking for candidates who have experience only in PySpark and not in Python.

Job Title: SSE - Kafka, Python, and Azure Databricks (Healthcare Data Project)

Role Overview: We are looking for a highly skilled engineer with expertise in Kafka, Python, and Azure Databricks (preferred) to drive our healthcare data engineering projects. The ideal candidate will have deep experience in real-time data streaming, cloud-based data platforms, and large-scale data processing. This role requires strong technical leadership, problem-solving abilities, and the ability to collaborate with cross-functional teams.

Key Responsibilities:
- Lead the design, development, and implementation of real-time data pipelines using Kafka, Python, and Azure Databricks.
- Architect scalable data streaming and processing solutions to support healthcare data workflows.
- Develop, optimize, and maintain ETL/ELT pipelines for structured and unstructured healthcare data.
- Ensure data integrity, security, and compliance with healthcare regulations (HIPAA, HITRUST, etc.).
- Collaborate with data engineers, analysts, and business stakeholders to understand requirements and translate them into technical solutions.
- Troubleshoot and optimize Kafka streaming applications, Python scripts, and Databricks workflows.
- Mentor junior engineers, conduct code reviews, and ensure best practices in data engineering.
- Stay updated with the latest cloud technologies, big data frameworks, and industry trends.

Required Skills & Qualifications:
- 4+ years of experience in data engineering, with strong proficiency in Kafka and Python.
- Expertise in Kafka Streams, Kafka Connect, and Schema Registry for real-time data processing.
- Experience with Azure Databricks (or willingness to learn and adopt it quickly).
- Hands-on experience with cloud platforms (Azure preferred; AWS or GCP is a plus).
- Proficiency in SQL, NoSQL databases, and data modeling for big data processing.
- Knowledge of containerization (Docker, Kubernetes) and CI/CD pipelines for data applications.
- Experience working with healthcare data (EHR, claims, HL7, FHIR, etc.) is a plus.
- Strong analytical skills, a problem-solving mindset, and the ability to lead complex data projects.
- Excellent communication and stakeholder management skills.

Email: Sam@hiresquad.in
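The real-time pipelines this role describes follow a consume-transform-produce pattern. The sketch below is conceptual only: it models that loop with plain Python iterables so it runs without a Kafka broker, and the topic payloads and field names (`patient_id`, `code`) are invented, not an actual Kafka Streams API:

```python
# Conceptual consume-transform-produce sketch. In a real pipeline the
# inbound records would come from a Kafka consumer and the outbound
# records would go to a producer; plain iterables stand in here.
import json

def transform(records):
    """Filter malformed events and project the fields downstream needs."""
    for raw in records:
        try:
            event = json.loads(raw)
        except json.JSONDecodeError:
            continue  # would be routed to a dead-letter topic in production
        if "patient_id" in event:
            yield {"patient_id": event["patient_id"], "code": event.get("code")}

inbound = ['{"patient_id": "p1", "code": "A10"}', "not json", '{"other": 1}']
outbound = list(transform(inbound))
```

Keeping the transform a pure generator like this is what makes such logic easy to unit-test apart from the broker, which matters when pipelines carry regulated healthcare data.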

Posted 1 month ago

Apply


3 - 7 years

20 - 25 Lacs

Pune

Remote

1. Extract and transform data from Google BigQuery and other relevant data sources.
2. Utilize Python and libraries such as Pandas and NumPy to manipulate, clean, and analyze large datasets.
3. Develop and implement Python scripts to automate data extraction, processing, and analysis for comparison reports.
4. Design and execute queries in BigQuery to retrieve specific data sets required for comparison analysis.
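The comparison reports in steps 3-4 boil down to diffing two result sets on a key. A minimal sketch of that diff logic, with in-memory rows standing in for what would really be two BigQuery query results (fetched via the google-cloud-bigquery client, typically into Pandas DataFrames); keys and values here are illustrative:

```python
# Hypothetical comparison-report sketch: find rows whose values differ
# between two datasets keyed by "id". In production the inputs would be
# rows returned by two BigQuery queries; literals stand in here.

def compare(left, right, key="id"):
    """Return (key, left_row, right_row) tuples for rows that differ."""
    left_by_key = {row[key]: row for row in left}
    right_by_key = {row[key]: row for row in right}
    diffs = []
    for k in sorted(left_by_key.keys() & right_by_key.keys()):
        if left_by_key[k] != right_by_key[k]:
            diffs.append((k, left_by_key[k], right_by_key[k]))
    return diffs

src = [{"id": 1, "amount": 100}, {"id": 2, "amount": 200}]
dst = [{"id": 1, "amount": 100}, {"id": 2, "amount": 250}]
diffs = compare(src, dst)
```

With Pandas, the same comparison is usually an outer `merge` on the key followed by a filter on mismatched columns; the dict version above keeps the idea visible without the dependency.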

Posted 2 months ago

Apply