
62804 Python Jobs - Page 27

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

We are seeking an experienced ETL Data Engineer with expertise in Informatica Intelligent Cloud Services (IICS) and Informatica PowerCenter to support our ongoing and upcoming projects. The ideal candidate will design, develop, and maintain data integration processes using both IICS and PowerCenter. Proficiency in Oracle is essential, including hands-on experience building, optimizing, and managing data solutions on the platform. The candidate should be able to handle tasks independently, demonstrating strong problem-solving skills and initiative in managing data integration projects. This role involves close collaboration with business stakeholders, data architects, and cross-functional teams to deliver effective data solutions that align with business objectives.

Who you are:
- Education: Bachelor's degree in Computer Science/IT or similar
- Mandatory skills: ETL data engineering, IICS, Informatica PowerCenter
- Nice to have: Unix
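Informatica mappings are built graphically, but the underlying extract-transform-load pattern this role owns can be sketched in a few lines of Python. This is a minimal, illustrative sketch only: SQLite stands in for the Oracle source/target, and the table and column names are invented.

```python
import sqlite3

# In-memory SQLite stands in for Oracle; an IICS/PowerCenter mapping would
# express the same extract -> transform -> load steps graphically.
def run_etl(conn: sqlite3.Connection) -> int:
    cur = conn.cursor()
    # Extract: read raw customer rows from the staging table.
    rows = cur.execute("SELECT id, name, email FROM raw_customers").fetchall()
    # Transform: trim whitespace, lower-case emails, drop rows with no email.
    cleaned = [
        (i, name.strip(), email.strip().lower())
        for i, name, email in rows
        if email and email.strip()
    ]
    # Load: insert the cleaned rows into the curated target table.
    cur.executemany("INSERT INTO customers VALUES (?, ?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_customers (id INTEGER, name TEXT, email TEXT);
    CREATE TABLE customers (id INTEGER, name TEXT, email TEXT);
    INSERT INTO raw_customers VALUES
        (1, '  Asha  ', 'ASHA@EXAMPLE.COM '),
        (2, 'Ravi', NULL),
        (3, 'Meena', 'meena@example.com');
""")
loaded = run_etl(conn)
print(loaded)  # 2 rows survive the transform
```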

Posted 6 hours ago

Apply

4.0 - 8.0 years

8 - 12 Lacs

Bengaluru

Work from Office


Job Purpose
Evaluate the data governance framework and Power BI environment. Provide recommendations for enhancing data quality and discoverability, and for optimizing Power BI performance.

Desired skills and experience:
- B.E./B.Tech/MCA/MBA in Finance, Information Systems, Computer Science, or a related field
- 7+ years of experience in data and cloud architecture, working with client stakeholders
- Ability to understand and review PowerShell (PS), SSIS, Batch Script, and C# (.NET 3.0) codebases for data processes
- Ability to assess the complexity of trigger migration across Active Batch (AB), Synapse, ADF, and Azure Databricks (ADB)
- Ability to define usage of Azure SQL DW, SQL DB, and Data Lake (DL) for various workloads, proposing transitions where beneficial
- Ability to analyze data patterns for optimization, including direct raw-to-consumption loading and zone elimination (e.g., stg/app zones)
- Understanding of requirements for external tables (Lakehouse)
- Excellent communication skills, both written and verbal
- Extremely strong organizational and analytical skills with strong attention to detail
- Strong track record of excellent results delivered to internal and external clients
- Able to work independently without the need for close supervision, and collaboratively as part of cross-team efforts
- Experience delivering projects within an agile environment
- Experience in project management and team management

Key responsibilities include:
- Understand and review PowerShell (PS), SSIS, Batch Script, and C# (.NET 3.0) codebases for data processes
- Assess the complexity of trigger migration across Active Batch (AB), Synapse, ADF, and Azure Databricks (ADB)
- Define usage of Azure SQL DW, SQL DB, and Data Lake (DL) for various workloads, proposing transitions where beneficial
- Analyze data patterns for optimization, including direct raw-to-consumption loading and zone elimination (e.g., stg/app zones)
- Understand requirements for external tables (Lakehouse)
- Evaluate and ensure quality of deliverables within project timelines
- Develop a strong understanding of the equity market domain
- Collaborate with domain experts and business stakeholders to understand business rules and logic
- Ensure effective, efficient, and continuous communication (written and verbal) with global stakeholders
- Independently troubleshoot difficult and complex issues in dev, test, UAT, and production environments
- Take responsibility for end-to-end delivery of projects, coordinate between client and internal offshore teams, and manage client queries
- Demonstrate high attention to detail, work in a dynamic environment while maintaining high quality standards, with a natural aptitude for developing good internal working relationships and a flexible work ethic
- Perform quality checks and adhere to the agreed Service Level Agreement (SLA) / Turnaround Time (TAT)

Posted 6 hours ago

Apply

4.0 - 7.0 years

12 - 16 Lacs

Bengaluru

Work from Office


As a Senior Cloud Platform Back-End Engineer with a strong background in AWS tools and services, you will join the Data & AI Solutions - Engineering team in our Healthcare R&D business. Your expertise will enhance the development and continuous improvement of a critical AWS-cloud-based analytics platform supporting our R&D efforts in drug discovery. This role involves implementing the technical roadmap and maintaining existing functionalities. You will adapt to evolving technologies, manage infrastructure and security, design and implement new features, and oversee seamless deployment of updates. Additionally, you will implement strategies for data archival and optimize data lifecycle processes for efficient storage management in compliance with regulations. Join a multicultural team working with agile methodologies and high autonomy. The role requires office presence at our Bangalore location.

Who You Are:
- University degree in Computer Science, Engineering, or a related field
- Proficiency in Python, especially the boto3 library for interacting with AWS services programmatically, and infrastructure as code with AWS CDK and AWS Lambda
- Experience with API development and management: designing, developing, and managing APIs using AWS API Gateway and other relevant API frameworks
- Strong understanding of AWS security best practices, IAM policies, encryption, auditing, and regulatory compliance (e.g., GDPR)
- Experience with application performance monitoring and tracing solutions such as AWS CloudWatch, X-Ray, and OpenTelemetry
- Proficiency in navigating and utilizing various AWS tools and services
- System design skills in a cloud environment
- Experience with SQL and data integration into Snowflake
- Familiarity with Microsoft Entra ID for identity and access management
- Willingness to work in a multinational environment and in cross-functional teams distributed across the US, Europe (mostly Germany), and India
- Sense of accountability and ownership; fast learner
- Fluency in English and excellent communication skills
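As a rough sketch of the kind of Lambda-behind-API-Gateway back end this role describes: the event shape below follows API Gateway's proxy integration, but the payload fields and response contract are invented for illustration. A real handler would call downstream services via boto3 clients granted through the function's IAM role.

```python
import json

# Minimal AWS Lambda handler for an API Gateway proxy integration.
# The "dataset" field and the queuing behaviour are hypothetical.
def handler(event: dict, context=None) -> dict:
    try:
        body = json.loads(event.get("body") or "{}")
        dataset = body.get("dataset")
        if not dataset:
            return {"statusCode": 400,
                    "body": json.dumps({"error": "missing 'dataset'"})}
        # A real implementation would kick off work here via boto3
        # (e.g. write to S3 or enqueue a job) before responding.
        return {"statusCode": 200,
                "body": json.dumps({"dataset": dataset, "status": "queued"})}
    except json.JSONDecodeError:
        return {"statusCode": 400, "body": json.dumps({"error": "bad JSON"})}

# Invoking the handler locally with a sample proxy event:
resp = handler({"body": json.dumps({"dataset": "assay_results"})})
print(resp["statusCode"])  # 200
```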

Posted 6 hours ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Attend to inbound technical calls from customers on breakdown of systems on site. Troubleshoot system issues, errors, alerts, and alarms with a sound understanding of system/IP maintenance, applying logical and technical skills relevant to the Lab Water Solutions product portfolio and significant water filtration knowledge. Inform in-house teams for urgent clinical customer support on site, where priority is given to critical scenarios (such as hospitals and GSA). Handle work orders processed by partners in the field (Field Service Engineers), completing the relevant fields necessary before closing the WO as valid for billing, including unbilled covered tasks carried out on site. Push billable repair quotes for scheduled system repairs, as they have the potential to become orders. Chase billable work orders for payment to close before the billing cycle. Follow up on service contract renewals, since systems maintained periodically in-house under subscription are easier to repair. Within this scope of responsibility: manage the workflow of transactional activity; communicate effectively on process during BCP and on internal resource availability; complete tasks within the agreed SLA timelines, quality metrics, and service objectives; critically review non-GxP activities; and implement changes aligned to change management and strategic requirements.

Who you are: years of experience. Strong communication; technical knowledge of instrumentation and equipment; call handling; customer handling; troubleshooting of instrumentation.
Preferred requirements: Power BI, basic AI, teamwork, problem solving, stakeholder management.

Posted 6 hours ago

Apply

8.0 - 10.0 years

10 - 14 Lacs

Bengaluru

Work from Office


You will coordinate and direct the QA teams on daily activities and handle QA and automation for the Adobe Web platform hosting ~400 healthcare and corporate websites. You will work with vendors, strategic partners, and internal resources to define and implement QA and automation processes, ensure delivery of projects on-quality, on-budget, and on-time, and establish strong, effective working relationships with the Business and App Tech teams across the organization. You will maintain a high degree of technical competence with the latest web technologies for penetration testing, platform security testing and vulnerability testing, and automation and QA processes.

- Automation and QA strategy development: designing and implementing automation and QA strategies that align with project goals and organizational standards
- Team leadership: leading and mentoring a team of automation and QA engineers, providing guidance on best practices and technical solutions
- Test automation framework design: developing and maintaining robust test automation frameworks to support various applications and ensure scalability
- Collaboration with stakeholders: working closely with product owners, developers, and QA teams to identify automation opportunities and define testing requirements
- Test case development: creating, reviewing, and optimizing automated test cases to ensure comprehensive test coverage
- Continuous improvement: analyzing automation processes and results to identify areas for improvement and implementing enhancements
- Tool evaluation and selection: researching and evaluating automation tools and technologies to ensure the right tools are used for specific projects
- Reporting and metrics: generating reports on automation progress, defects, and test coverage to provide insights to stakeholders
- Troubleshooting and support: assisting in troubleshooting issues related to automation scripts and supporting team members as needed

Who you are:
- Overall 8 to 10 years of hands-on testing experience
- 4-6 years of experience using Selenium for web application testing, including writing and maintaining automated test scripts
- 1-2 years of experience with Robot Framework for keyword-driven testing, creating test cases and maintaining automated test scripts
- Proficiency in languages such as Java or Python to develop and enhance automation scripts
- Experience designing automation frameworks (Selenium and TestNG) from scratch
- Experience with Extent Reports for generating detailed, customizable test execution reports that enhance test visibility
- Hands-on experience with Jenkins for automating the build and deployment process and integrating automated tests into the CI/CD pipeline
- Experience with version control tools such as Bitbucket for managing code repositories and collaborating with team members
- Experience testing AEM-based applications, including content management workflows and component testing

Specific information related to the position: flexibility to attend critical meetings remotely across different time zones (Europe, North America, LATAM).
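Automation frameworks like the ones this role describes commonly center on the Page Object Model. The sketch below stubs the Selenium WebDriver interface (`find_element`, `send_keys`, `click` are the real method names) with a fake driver so the structure can be shown without a browser; in a real suite you would pass in `webdriver.Chrome()` and the same page class would work unchanged. Locators and flow are invented.

```python
# Page Object Model sketch with a stubbed driver standing in for Selenium.
class FakeElement:
    def __init__(self, driver):
        self.driver = driver
    def send_keys(self, text):        # records what a real element would receive
        self.driver.typed.append(text)
    def click(self):
        self.driver.clicked = True

class FakeDriver:
    """Stub implementing the slice of the WebDriver API the page uses."""
    def __init__(self):
        self.typed, self.clicked = [], False
    def find_element(self, by, locator):
        return FakeElement(self)

class LoginPage:
    # Locators live in one place, so UI changes touch only the page object.
    USER, PASS, SUBMIT = ("id", "user"), ("id", "pass"), ("id", "submit")
    def __init__(self, driver):
        self.driver = driver
    def login(self, user, password):
        self.driver.find_element(*self.USER).send_keys(user)
        self.driver.find_element(*self.PASS).send_keys(password)
        self.driver.find_element(*self.SUBMIT).click()

driver = FakeDriver()
LoginPage(driver).login("qa_user", "secret")
print(driver.typed, driver.clicked)  # ['qa_user', 'secret'] True
```

Tests then exercise page objects rather than raw locators, which keeps suites maintainable as the ~400 sites evolve.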

Posted 6 hours ago

Apply

8.0 - 13.0 years

9 - 13 Lacs

Bengaluru

Work from Office


As a Sr Data Engineer in the Digital & Data team, you will work hands-on to deliver and maintain the pipelines the business functions require to derive value from their data. You will bring data from a varied landscape of source systems into our cloud-based analytics stack and implement the necessary cleaning and pre-processing steps in close collaboration with our business customers. Furthermore, you will work closely with our teams to ensure that all data assets are governed according to the FAIR principles. To keep the engineering team scalable, you and your peers will create reusable components, libraries, and infrastructure that will accelerate the delivery of future use cases. You will be part of a team dedicated to delivering state-of-the-art solutions enabling data analytics use cases across the Healthcare sector of a leading global science and technology company. As such, you will have the unique opportunity to gain insight into our diverse business functions, allowing you to expand your skills in various technical, scientific, and business domains. Working in a project-based way covering a multitude of data domains and technology stacks, you will be able to significantly develop your skills and experience as a Data Engineer.

Who you are:
- BE/M.Sc./PhD in Computer Science or a related field and 8+ years of work experience in a relevant capacity
- Experience working with cloud environments such as Hadoop, AWS, GCP, and Azure
- Experience enforcing security controls and best practices to protect sensitive data within AWS data pipelines, including encryption, access controls, and auditing mechanisms
- Agile mindset, a spirit of initiative, and desire to work hands-on together with your team
- Interest in solving challenging technical problems and developing the future data architecture that will enable the implementation of innovative data analytics use cases
- Experience leading small to medium-sized teams
- Experience creating architectures for ETL processes for both batch and streaming ingestion
- Knowledge of designing and validating software stacks for GxP-relevant contexts, as well as working with PII data
- Familiarity with the data domains covering the pharma value chain (e.g., research, clinical, regulatory, manufacturing, supply chain, and commercial)
- Strong hands-on experience working with Python, PySpark, and R codebases; proficiency in additional programming languages (e.g., C/C++, Rust, TypeScript, Java) is expected
- Experience working with Apache Spark and the Hadoop ecosystem
- Experience with heterogeneous compute environments and multi-platform setups
- Basic knowledge of statistics and machine learning algorithms is favorable

This is the respective role description: The ability to easily find, access, and analyze data across an organization is key for every modern business to be able to efficiently make decisions, optimize processes, and create new business models. The Data Architect plays a key role in unlocking this potential by defining and implementing a harmonized data architecture for Healthcare.

Posted 6 hours ago

Apply

5.0 - 8.0 years

7 - 11 Lacs

Gurugram

Work from Office


Design, construct, and maintain scalable data management systems using Azure Databricks, ensuring they meet end-user expectations. Supervise the upkeep of existing data infrastructure workflows to ensure continuous service delivery. Create data processing pipelines utilizing Databricks Notebooks, Spark SQL, Python, and other Databricks tools. Oversee and lead the module through planning, estimation, implementation, monitoring, and tracking.

Desired skills and experience:
- 5+ years of experience in software development using Python, PySpark, and their frameworks
- Proven experience as a Data Engineer with experience in the Azure cloud
- Experience implementing solutions using Azure cloud services: Azure Data Factory, Azure Data Lake Gen 2, Azure Databases, Azure Data Fabric, API gateway management, Azure Functions
- Ability to design, build, test, and maintain highly scalable data management systems using Azure Databricks
- Strong SQL skills with RDBMS or NoSQL databases
- Experience developing APIs using FastAPI or similar Python frameworks
- Familiarity with the DevOps lifecycle (Git, Jenkins, etc.) and CI/CD processes
- Good understanding of ETL/ELT processes
- Experience in the financial services industry, financial instruments, asset classes, and market data is a plus
- Ability to assist stakeholders with data-related technical issues and support their data infrastructure needs
- Ability to develop and maintain documentation for data pipeline architecture, development processes, and data governance
- Data warehousing: in-depth knowledge of data warehousing concepts, architecture, and implementation, including experience with various data warehouse platforms
- Extremely strong organizational and analytical skills with strong attention to detail
- Strong track record of excellent results delivered to internal and external clients
- Excellent problem-solving skills, with the ability to work independently or as part of a team
- Strong communication and interpersonal skills, with the ability to effectively engage both technical and non-technical stakeholders
- Able to work independently without the need for close supervision, and collaboratively as part of cross-team efforts

Key responsibilities include:
- Interpret business requirements, either gathered or acquired
- Work with internal resources as well as application vendors
- Design, develop, and maintain Databricks solutions and the relevant data quality rules
- Troubleshoot and resolve data-related issues
- Configure and create data models and data quality rules to meet customer needs
- Handle multiple database platforms hands-on, such as Microsoft SQL Server and Oracle
- Review and analyze data from multiple internal and external sources
- Analyze existing PySpark/Python code and identify areas for optimization
- Write new, optimized SQL queries or Python scripts to improve performance and reduce run time
- Identify opportunities for efficiencies and innovative approaches to completing the scope of work
- Write clean, efficient, and well-documented code that adheres to best practices and Council IT coding standards
- Maintain and operate existing custom code and processes
- Participate in team problem-solving efforts and offer ideas to solve client issues
- Apply query-writing skills, with the ability to understand and implement changes to SQL functions and stored procedures
- Communicate effectively with business and technology partners, peers, and stakeholders
- Deliver results under demanding timelines for real-world business problems
- Work independently and multi-task effectively
- Configure system settings and options and execute unit/integration testing
- Develop end-user release notes and training materials, and deliver training to a broad user base
- Identify and communicate areas for improvement
- Demonstrate high attention to detail, work in a dynamic environment while maintaining high quality standards, with a natural aptitude for developing good internal working relationships and a flexible work ethic
- Perform quality checks and adhere to the agreed Service Level Agreement (SLA) / Turnaround Time (TAT)
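One recurring optimization named in the responsibilities — rewriting row-by-row code as a set-based SQL query — can be sketched as follows. SQLite stands in for Spark SQL or a warehouse engine here, and the table and column names are invented; the point is only the pattern of pushing aggregation into the engine.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE trades (account TEXT, amount REAL);
    INSERT INTO trades VALUES ('A', 100), ('A', 250), ('B', 75), ('B', 25);
""")

# Anti-pattern: pull every row into Python and aggregate in a loop.
totals_slow = {}
for account, amount in conn.execute("SELECT account, amount FROM trades"):
    totals_slow[account] = totals_slow.get(account, 0) + amount

# Optimized: one set-based query; the engine aggregates close to the data,
# which is what makes the same rewrite pay off at Spark/Databricks scale.
totals_fast = dict(conn.execute(
    "SELECT account, SUM(amount) FROM trades GROUP BY account"))

print(totals_fast)  # {'A': 350.0, 'B': 100.0}
```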

Posted 6 hours ago

Apply

8.0 - 13.0 years

13 - 18 Lacs

Bengaluru

Work from Office


We are seeking a Senior Snowflake Developer/Architect who will be responsible for designing, developing, and maintaining scalable data solutions that effectively meet the needs of our organization. The role serves as the primary point of accountability for the technical implementation of the data flows, repositories, and data-centric solutions in your area, translating requirements into efficient implementations. The data repositories, data flows, and data-centric solutions you create will support a wide range of reporting, analytics, decision support, and (generative) AI solutions.

Your Role:
- Implement and manage data modelling techniques, including OLTP, OLAP, and Data Vault 2.0 methodologies
- Write optimized SQL queries for data extraction, transformation, and loading
- Utilize Python for advanced data processing, automation tasks, and system integration
- Act as an advisor with in-depth knowledge of Snowflake architecture, features, and best practices
- Develop and maintain complex data pipelines and ETL processes in Snowflake
- Collaborate with data architects, analysts, and stakeholders to design optimal and scalable data solutions
- Automate dbt jobs and build CI/CD pipelines using Azure DevOps for seamless deployment of data solutions
- Ensure data quality, integrity, and compliance throughout the data lifecycle
- Troubleshoot, optimize, and enhance existing data processes and queries for performance improvements
- Document data models, processes, and workflows clearly for future reference and knowledge sharing
- Build data tests, unit tests, and mock data frameworks

Who You Are:
- Master's degree in Computer Science, Information Technology, or a related field
- At least 3+ years of proven experience as a Snowflake Developer, and a minimum of 8+ years of total experience with data modelling (OLAP and OLTP)
- Extensive hands-on experience writing complex SQL queries and advanced Python, demonstrating proficiency in data manipulation and analysis for large data volumes
- Strong understanding of data warehousing concepts, methodologies, and technologies, with in-depth experience in data modelling techniques (OLTP, OLAP, Data Vault 2.0)
- Experience building data pipelines using dbt (Data Build Tool) for data transformation
- Familiarity with advanced performance-tuning methodologies in Snowflake, including query optimization
- Strong knowledge of CI/CD pipelines, preferably in Azure DevOps
- Excellent problem-solving, analytical, and critical thinking skills
- Strong communication, collaboration, and interpersonal skills
- Knowledge of additional cloud technologies (e.g., AWS, Azure, GCP) is a plus
- Knowledge of Infrastructure as Code (IaC) tools such as Terraform or CloudFormation is a plus
- Experience leading projects or mentoring junior developers is advantageous
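The dbt pipelines this role mentions are defined as SQL models templated with Jinja. The fragment below is a hedged sketch, not a reference implementation: the model, table, and column names are invented, while `config()`, `ref()`, `this`, and `is_incremental()` are standard dbt Jinja that dbt compiles into DDL/DML against the Snowflake warehouse.

```sql
-- models/staging/stg_orders.sql  (hypothetical model and column names)
{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    customer_id,
    lower(trim(status))             as status,
    cast(order_ts as timestamp_ntz) as order_ts
from {{ ref('raw_orders') }}
{% if is_incremental() %}
  -- on incremental runs, only process rows newer than the target already holds
  where order_ts > (select max(order_ts) from {{ this }})
{% endif %}
```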

Posted 6 hours ago

Apply

8.0 - 13.0 years

13 - 18 Lacs

Bengaluru

Work from Office


We are seeking a Senior Snowflake Developer/Architect who will be responsible for designing, developing, and maintaining scalable data solutions that effectively meet the needs of our organization. The role serves as the primary point of accountability for the technical implementation of the data flows, repositories, and data-centric solutions in your area, translating requirements into efficient implementations. The data repositories, data flows, and data-centric solutions you create will support a wide range of reporting, analytics, decision support, and (generative) AI solutions.

Your Role:
- Implement and manage data modelling techniques, including OLTP, OLAP, and Data Vault 2.0 methodologies
- Write optimized SQL queries for data extraction, transformation, and loading
- Utilize Python for advanced data processing, automation tasks, and system integration
- Act as an advisor with in-depth knowledge of Snowflake architecture, features, and best practices
- Develop and maintain complex data pipelines and ETL processes in Snowflake
- Collaborate with data architects, analysts, and stakeholders to design optimal and scalable data solutions
- Automate dbt jobs and build CI/CD pipelines using Azure DevOps for seamless deployment of data solutions
- Ensure data quality, integrity, and compliance throughout the data lifecycle
- Troubleshoot, optimize, and enhance existing data processes and queries for performance improvements
- Document data models, processes, and workflows clearly for future reference and knowledge sharing
- Build data tests, unit tests, and mock data frameworks

Who You Are:
- Bachelor's or Master's degree in Computer Science, Mathematics, or related fields
- At least 8 years of experience as a data warehouse expert, data engineer, or data integration specialist
- In-depth knowledge of Snowflake components, including security and governance
- Proven experience implementing complex data models (e.g., OLTP, OLAP, Data Vault)
- A strong understanding of ETL, including end-to-end data flows from ingestion to data modeling and solution delivery
- Proven industry experience with dbt and Jinja scripts
- Strong proficiency in SQL, with additional knowledge of Python (i.e., pandas and PySpark) being advantageous
- Familiarity with data and analytics solutions such as AWS (especially Glue, Lambda, DMS) is nice to have
- Experience working with Azure DevOps and warehouse automation tools (e.g., Coalesce) is a plus
- Experience with Healthcare R&D is a plus
- Excellent English communication skills, with the ability to effectively engage both R&D scientists and software engineers
- Experience working in virtual and agile teams

Posted 6 hours ago

Apply

5.0 - 7.0 years

3 - 7 Lacs

Bengaluru

Work from Office


Job Purpose
Develop and execute test cases for both UI and API, with a focus on FSI trading workflows. Implement and utilize test management tools (e.g., X-Ray/JIRA).

Desired skills and experience:
- B.E./B.Tech/MCA/MBA in Finance, Information Systems, Computer Science, or a related field
- 5+ years of experience in quality assurance (testing), working with client stakeholders
- Significant experience performing testing in the financial services industry
- Hands-on expertise in UI testing with Puppeteer
- Strong experience in API testing using SoapUI, Maven, and Jenkins in a CI/CD pipeline
- Deep understanding of FSI trading platforms and tools (e.g., Polaris) and fixed income products
- Proven ability to establish QA processes and frameworks in environments with minimal existing structure
- Excellent problem-solving, analytical, and communication skills
- Experience working with agile methodology, Jira, Confluence, etc.
- Excellent communication skills, both written and verbal
- Extremely strong organizational and analytical skills with strong attention to detail
- Strong track record of excellent results delivered to internal and external clients
- Able to work independently without the need for close supervision, and collaboratively as part of cross-team efforts
- Experience delivering projects within an agile environment
- Strong experience in SQL
- Basic knowledge of Python and its libraries, such as Pandas and NumPy

Key responsibilities include:
- Establish and implement comprehensive QA strategies and test plans from scratch
- Address immediate pain points in UI (Puppeteer) and API (SoapUI, Maven, Jenkins) testing, including triage and framework improvement
- Develop and execute test cases for both UI and API, with a focus on fixed income trading workflows
- Drive the creation of regression test suites for critical back-office applications
- Collaborate with development, business analysts, and project managers to ensure quality throughout the SDLC
- Implement and utilize test management tools (e.g., X-Ray/JIRA)
- Provide clear and concise reporting on QA progress and metrics to management
- Bring strong subject matter expertise in the financial services industry, particularly fixed income trading products and workflows
- Ensure effective, efficient, and continuous communication (written and verbal) with global stakeholders
- Independently troubleshoot difficult and complex issues in dev, test, UAT, and production environments
- Take responsibility for end-to-end delivery of projects, coordinate between client and internal offshore teams, and manage client queries
- Demonstrate high attention to detail, work in a dynamic environment while maintaining high quality standards, with a natural aptitude for developing good internal working relationships and a flexible work ethic
- Perform quality checks and adhere to the agreed Service Level Agreement (SLA) / Turnaround Time (TAT)

Posted 6 hours ago

Apply

3.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office


Building models using best-in-class ML technology. Training/fine-tuning models with new or modified training datasets. Selecting features, and building and optimizing classifiers using machine learning techniques. Data mining using state-of-the-art methods. Processing, cleansing, and verifying the integrity of data used for analysis. Enhancing data collection procedures to include information that is relevant for building analytic systems.

Technical skills (mandatory):
- 3 to 8 years of experience as a Machine Learning Researcher or Data Scientist
- Graduate degree in engineering or technology, along with good business skills
- Good applied statistics skills, such as distributions, statistical testing, regression, etc.
- Excellent understanding of machine learning techniques and algorithms, including knowledge of LLMs
- Experience with NLP
- Good scripting and programming skills in Python
- Basic understanding of NoSQL databases, such as MongoDB and Cassandra

Nice to have:
- Exposure to the financial research domain
- Experience with JIRA and Confluence
- Understanding of Scrum and Agile methodologies
- Experience with data visualization tools, such as Grafana, ggplot, etc.

Soft skills:
- Oral and written communication skills
- Good problem-solving and negotiation skills
- Passion, curiosity, and attention to detail
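The "applied statistics skills, such as ... regression" requirement is the kind of thing that can be checked in a few lines. As a small self-contained illustration (data points invented), simple linear regression has a closed-form ordinary-least-squares solution: slope = cov(x, y) / var(x), intercept = mean(y) - slope * mean(x).

```python
# Ordinary least squares for simple linear regression, from the
# closed-form solution rather than a library fit.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    slope = cov / var
    return slope, my - slope * mx

# Points lying exactly on y = 2x + 1 recover the true coefficients.
slope, intercept = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
print(slope, intercept)  # 2.0 1.0
```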

Posted 6 hours ago

Apply

5.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office


The Lead MLOps Engineer will lead technology initiatives aimed at improving business value and outcomes in digital marketing and commercial analytics through the adoption of Artificial Intelligence (AI) enabled solutions, working with cross-functional teams across AI projects to operationalize data science models into deployed, scalable solutions that deliver business value. They should be inquisitive and bring an innovative mindset to work every day: researching, proposing, and implementing MLOps process improvements, solution ideas, and ways of working to be more agile, lean, and productive. The role provides leadership and technical expertise in operationalizing machine learning models, bridging the gap between data science and IT operations. Key responsibilities include designing, implementing, and optimizing MLOps infrastructure, building CI/CD pipelines for ML models, and ensuring the security and scalability of ML systems.

Key Responsibilities:
- Architect and deploy: design and manage scalable ML infrastructure on Azure (AKS), leveraging Infrastructure as Code principles
- Automate and accelerate: build and optimize CI/CD pipelines with GitHub Actions for seamless software, data, and model delivery
- Engineer performance: develop efficient and reliable data pipelines using Python and distributed computing frameworks
- Ensure reliability: implement solutions for deploying and maintaining ML models in production
- Collaborate and innovate: partner with data scientists and engineers to continuously enhance existing MLOps capabilities

Key Competencies:
- Experience: a minimum of 5+ years of experience in software engineering, data science, or a related field, with experience in MLOps
- Education: a bachelor's or master's degree in Computer Science/Engineering
- Soft skills: strong analytical and problem-solving skills, excellent communication and collaboration skills, and the ability to work in a fast-paced environment
- Azure AKS: deep hands-on experience
- IaC and CI/CD: mastery of Terraform/Bicep and GitHub Actions
- Data engineering: advanced Python and Spark for complex pipelines
- ML operations: proven ability in model serving and monitoring
- Problem solving: adept at navigating complex technical challenges and delivering solutions
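A GitHub Actions CI/CD pipeline of the kind this role builds might look like the sketch below. This is an illustrative fragment only: the workflow name, registry, image name, and test layout are invented, while `actions/checkout` and `actions/setup-python` are the standard official actions.

```yaml
# .github/workflows/model-ci.yml  (hypothetical names throughout)
name: model-ci
on:
  push:
    branches: [main]
jobs:
  test-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - name: Run unit and model tests
        run: |
          pip install -r requirements.txt
          pytest tests/
      - name: Build and push model image
        run: |
          docker build -t myregistry.azurecr.io/model-api:${{ github.sha }} .
          docker push myregistry.azurecr.io/model-api:${{ github.sha }}
      # A final step would roll the image out to AKS, e.g. via a
      # helm upgrade or a Kubernetes deploy action.
```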

Posted 6 hours ago

Apply

2.0 - 4.0 years

4 - 7 Lacs

Gurugram

Work from Office


Job Purpose
We are seeking a Data Operations Engineer to improve the reliability and performance of our data pipeline. Successful candidates will work with researchers, data strategists, operations, and engineering teams to ensure the smooth functioning of a pipeline sourced from an enormous, continuously updating catalog of vendor and market data.

Essential skills and experience:
- B.Tech/M.Tech/MCA with 1-3 years of overall experience
- Proficiency in Python as well as common query languages such as SQL
- Experience with the Unix platform and toolset, such as bash, git, and regex
- Excellent English communication, oral and written
- Experience in the financial services industry or data science is a plus
- Critical thinking to dive into complex issues, identify root causes, and suggest or implement solutions
- A positive, team-focused attitude and work ethic

Key responsibilities:
- Support the daily operation and monitoring of the pipeline
- Triage issues in a timely manner, monitoring real-time alerts while also servicing research-driven workflows
- Improve the reliability and operability of pipeline components
- Solve both business and technical problems around data structure, quality, and availability
- Interact with external vendors on behalf of internal clients
- Demonstrate high attention to detail and work in a dynamic environment while maintaining high quality standards

Key technologies: core Python, Linux. Good to have: Perl or C++, PromQL.
Behavioral competencies: good communication (verbal and written); experience managing client stakeholders.
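The alert-triage responsibility pairs naturally with the regex skills the listing asks for. Here is a minimal sketch of severity classification for raw pipeline alerts; the alert formats, patterns, and severity names are all invented for illustration.

```python
import re

# Ordered rules: first matching pattern wins, most severe first.
RULES = [
    ("critical", re.compile(r"\b(FAILED|OutOfMemory|timeout)\b", re.I)),
    ("warning",  re.compile(r"\b(retry|late|stale)\b", re.I)),
]

def triage(alert: str) -> str:
    """Classify a raw alert line by severity; unmatched lines are 'info'."""
    for severity, pattern in RULES:
        if pattern.search(alert):
            return severity
    return "info"

alerts = [
    "job vendor_load FAILED after 3 attempts",
    "feed eod_prices is stale by 20 minutes",
    "job ref_data completed in 41s",
]
print([triage(a) for a in alerts])  # ['critical', 'warning', 'info']
```

In a real deployment the rules would live in config and feed a paging/ticketing system rather than a print statement.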

Posted 6 hours ago

Apply

0.0 - 2.0 years

1 - 4 Lacs

Pune

Work from Office


We're looking for a passionate AI/ML Engineer with 0-2 years of experience who's eager to dive deep into document processing, computer vision, and real-world AI problems. This role is ideal for someone who wants to grow with a startup and experience the complete journey of building and scaling intelligent systems. Key Responsibilities Assist in selecting, adapting, and implementing AI models for document parsing, table detection, layout analysis, and OCR enhancement. Fine-tune and optimize models such as Detectron2, LayoutLM, Donut, YOLO, or similar vision transformers. Apply model compression techniques like quantization, pruning, and ONNX for faster deployment. Contribute to integrating AI pipelines into production-grade backend systems. Collaborate with product and engineering teams to continuously improve output quality. Explore recent research in document intelligence and suggest innovations. Required Skills Proficiency in Python and libraries such as PyTorch or TensorFlow. Understanding of AI model training and evaluation workflows. Hands-on with at least one of: object detection models (YOLO, Detectron2), layout models (LayoutLM, Donut), or image preprocessing. Strong analytical skills and a hunger to learn. Preferred Qualities Prior internship or personal projects in Computer Vision or Document AI. Familiarity with OCR tools like Tesseract, EasyOCR, or Azure/GCP Vision APIs. Exposure to model deployment (ONNX/TensorRT, Docker, API development). Preference will be given to candidates who are enthusiastic about working in a startup and want to experience the journey of building and scaling AI products from the ground up. Why Join Us? Be part of a fast-growing AI startup solving meaningful real-world problems. Work directly with founders, senior engineers, and real customers. Fast-paced learning environment with end-to-end ownership. Flexible work hours and open culture.
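As a rough illustration of the quantization technique mentioned above: symmetric int8 quantization maps float weights into the range [-127, 127] with a single scale factor, trading a little precision for a roughly 4x smaller representation than float32. This stdlib-only sketch shows the idea; real deployments would use PyTorch or ONNX tooling rather than hand-rolled code:

```python
def quantize_int8(weights):
    """Symmetric int8 quantization: map floats into [-127, 127] via one scale."""
    scale = max(abs(w) for w in weights) / 127 or 1.0  # guard all-zero input
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate floats from the quantized integers."""
    return [v * scale for v in q]

weights = [0.02, -1.27, 0.635, 0.0]  # toy "model weights"
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
# q holds small integers; restored approximates the originals, with a
# worst-case error of about scale / 2 per weight.
```

The same idea, applied per-tensor or per-channel and combined with int8 kernels, is what frameworks automate behind calls like dynamic quantization or ONNX export.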

Posted 6 hours ago

Apply

7.0 years

0 Lacs

Greater Chennai Area

Remote


Genesys empowers organizations of all sizes to improve loyalty and business outcomes by creating the best experiences for their customers and employees. Through Genesys Cloud, the AI-powered Experience Orchestration platform, organizations can accelerate growth by delivering empathetic, personalized experiences at scale to drive customer loyalty, workforce engagement, efficiency and operational improvements. We employ more than 6,000 people across the globe who embrace empathy and cultivate collaboration to succeed. And, while we offer great benefits and perks like larger tech companies, our employees have the independence to make a larger impact on the company and take ownership of their work. Join the team and create the future of customer experience together. Job Summary The Genesys Data & Analytics Team The Data & Analytics team is a central team comprised of Data Engineering, Data Platform/Technologies, Data Analytics, Data Science, Data Product, and Data Governance practices. This mighty team serves the enterprise that includes sales, finance, marketing, customer success, product and more. The team serves as a core conduit and partner to operational systems that run the business including Salesforce, Workday and more. The IT Manager of Analytics plays a pivotal role within the Enterprise Data & Analytics organization at Genesys. This role is responsible for leading a team of analysts and driving delivery of impactful analytics solutions that support enterprise functions including sales, finance, marketing, customer success, and product teams. This leader will oversee day-to-day analytics operations, coach and mentor a team of analysts, and collaborate closely with stakeholders to ensure alignment of analytics deliverables with business goals. The ideal candidate brings hands-on analytics expertise, a passion for data storytelling, and a track record of managing successful analytics teams. 
This position offers flexible work arrangements and may be structured as either hybrid or fully remote. Responsibilities Lead and mentor a team of analytics professionals, fostering a collaborative and high-performing culture. Promote & drive best practices in analytics, data visualization, automation, governance, and documentation. Translate business needs into actionable data insights through dashboards, visualizations, and storytelling. Partner with enterprise functions to understand goals, define key metrics, and deliver analytics solutions that inform decision-making. Manage and prioritize the team’s project backlog, ensuring timely and quality delivery of analytics products. Collaborate with data engineering and platform teams to ensure scalable and reliable data pipelines and sources. Contribute to the development and maintenance of a shared analytics framework and reusable assets. Advocate for self-service analytics and data literacy across the business. Ensure compliance with data privacy, governance, and security policies. Requirements 7+ years of relevant experience with a Bachelor's / Master's degree in a natural science (computer science, data science, math, statistics, physics, etc.) Proven ability to lead and inspire analytics teams, delivering results in a fast-paced, cross-functional environment. Strong proficiency in BI and visualization tools (e.g., Looker, Tableau, QuickSight, Power BI). Solid understanding of cloud data platforms and big data ecosystems (e.g., AWS, Snowflake, Databricks). Strong business acumen and the ability to communicate technical concepts clearly to non-technical stakeholders. Experience building and managing stakeholder relationships across multiple departments. Adept at SQL and data modeling principles Experience with statistical scripting languages (Python preferred) Familiarity with Agile methodologies and project management tools (e.g., Jira, Confluence). 
Demonstrates a results-oriented mindset, takes thoughtful risks, and approaches challenges with humility and a hands-on, resourceful attitude Preferred Qualifications Creative, innovative and solution design thinking: You evaluate things holistically and think through the objectives, impacts, best practices, and what will be simple and scalable Excellent critical thinking, problem solving and analytical skills with a keen attention to detail. Skilled at managing cross-functional relationships and communicating with leadership across multiple organizations Strong team player: ability to lead peers in accomplishment of common goals. If a Genesys employee referred you, please use the link they sent you to apply. About Genesys: Genesys empowers more than 8,000 organizations in over 100 countries to improve loyalty and business outcomes by creating the best experiences for their customers and employees. Through Genesys Cloud, the AI-powered Experience Orchestration platform, Genesys delivers the future of CX to organizations of all sizes so they can provide empathetic, personalized experiences at scale. As the trusted platform that is born in the cloud, Genesys Cloud helps organizations accelerate growth by enabling them to differentiate with the right customer experience at the right time, while driving stronger workforce engagement, efficiency and operational improvements. Visit www.genesys.com. Reasonable Accommodations: If you require a reasonable accommodation to complete any part of the application process or are limited in the ability or unable to access or use this online application process and need an alternative method for applying, you or someone you know may reach out to HR@genesys.com. You can expect a response from someone within 24-48 hours. 
To ensure we set you up with the best reasonable accommodation, please provide them the following information: first and last name, country of residence, the job ID(s) or (titles) of the positions you would like to apply, and the specific reasonable accommodation(s) or modification(s) you are requesting. This email is designed to assist job seekers who seek reasonable accommodation for the application process. Messages sent for non-accommodation-related issues, such as following up on an application or submitting a resume, may not receive a response. Genesys is an equal opportunity employer committed to fairness in the workplace. We evaluate qualified applicants without regard to race, color, age, religion, sex, sexual orientation, gender identity or expression, marital status, domestic partner status, national origin, genetics, disability, military and veteran status, and other protected characteristics. Please note that recruiters will never ask for sensitive personal or financial information during the application phase.

Posted 6 hours ago

Apply

5.0 - 10.0 years

10 - 15 Lacs

Pune

Work from Office


Job Overview: The ideal candidate will have strong Python programming skills and experience with web scraping frameworks and libraries like Requests, BeautifulSoup, Selenium, Playwright, or urllib. You will be responsible for building efficient and scalable web scrapers, extracting valuable data, and ensuring data integrity. This role requires a keen eye for problem-solving, the ability to work with complex data structures, and a strong understanding of web technologies like HTML, CSS, DOM, XPath, and regular expressions. Knowledge of JavaScript would be an added advantage. Responsibilities: • As a Web Scraper, your role is to apply your knowledge to fetch data from multiple online sources • Develop highly reliable web scrapers and parsers across various websites • Extract structured/unstructured data and store it in SQL/NoSQL data stores • Work closely with Project/Business/Research teams to provide scraped data for analysis • Maintain the scraping projects delivered to production • Develop frameworks for automating and maintaining a constant flow of data from multiple sources • Work independently with minimum supervision • Develop a deep understanding of the data sources on the web and know exactly how, when, and which data to scrape, parse, and store Required Skills and Experience: • 3 to 5 years of experience as a Web Scraper • Proficient knowledge of the Python language and working knowledge of web crawling/web scraping in Python using Requests, BeautifulSoup or urllib, and Selenium or Playwright • Must possess strong knowledge of basic Linux commands for system navigation, management, and troubleshooting • Must have expertise in proxy usage to ensure secure and efficient network operations • Must have experience with captcha-solving techniques for seamless automation and data extraction • Experience with data parsing - strong knowledge of regular expressions, HTML, CSS, DOM, and XPath 
Knowledge of JavaScript would be a plus • Must be able to access, manipulate, and transform data from a variety of database and flat file sources. MongoDB & MySQL skills are essential. • Must be able to develop reusable code-based scraping products which can be used by others. • Git knowledge is mandatory for version control and collaborative development workflows. • Must have experience handling cloud servers on platforms like AWS, GCP, and LEAPSWITCH for scalable and reliable infrastructure management. • Ability to ask the right questions and deliver the right results in a way that is understandable and usable to your clients. • A track record of digging into tough problems, attacking them from different angles, and bringing innovative approaches to bear is highly desirable. Must be capable of self-teaching new techniques. Behavioural expectations: • Be excited by, and have a positive outlook to, navigating ambiguity • Passion for results and excellence • Team player • Must be able to get the job done by working collaboratively with others • Be inquisitive with an analytical mind; out-of-the-box thinking • Prioritize among competing opportunities, balance consumer needs with business and product priorities, and clearly articulate the rationale behind product decisions • Straightforward and professional • Good communicator • Maintain high energy and motivation • A do-it-yourself orientation, consistent with the company's roll-up-the-sleeves culture • Proactive
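As a minimal sketch of the parsing side of this role — turning raw HTML into structured records — here is a standard-library example. Production scrapers would typically fetch pages with Requests and parse with BeautifulSoup, or drive Selenium/Playwright for dynamic pages, as listed above; the HTML fragment here is made up for illustration:

```python
from html.parser import HTMLParser

# A tiny parser that extracts link text and hrefs from an HTML fragment --
# the same structured-extraction step a BeautifulSoup-based scraper performs.
class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None  # href of the <a> tag we are currently inside

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")

    def handle_data(self, data):
        if self._href is not None and data.strip():
            self.links.append((data.strip(), self._href))

    def handle_endtag(self, tag):
        if tag == "a":
            self._href = None

html = ('<ul><li><a href="/jobs/1">Data Engineer</a></li>'
        '<li><a href="/jobs/2">Web Scraper</a></li></ul>')
parser = LinkExtractor()
parser.feed(html)
print(parser.links)  # [('Data Engineer', '/jobs/1'), ('Web Scraper', '/jobs/2')]
```

From here the scraped tuples would be validated and written to a SQL/NoSQL store, as the responsibilities above describe.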

Posted 6 hours ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


L&T Technology is looking to hire Design Verification Engineers. Job Location: Bangalore Detailed JD is below: Job Description DV Positions: Define and implement IP/SoC verification plans, build verification test benches to enable IP/sub-system/SoC level verification Develop functional tests based on the verification test plan Drive Design Verification to closure based on defined verification metrics on test plan, functional and code coverage Debug, root-cause and resolve functional failures in the design, partnering with the Design team Qualifications and Skills for DV Positions: Bachelor's or Master's degree in Computer Science, Electronics Engineering or equivalent practical experience 8/10+ years of hands-on experience in SystemVerilog/UVM methodology and/or C/C++ based verification 8/10+ years of experience in IP/sub-system and/or SoC level verification based on SystemVerilog UVM/OVM based methodologies Experience in development of UVM based verification environments from scratch Experience in architecting and implementing Design Verification infrastructure and executing the full verification cycle Experience with verification of ARM/RISC-V based CPU sub-systems or SoCs Experience with IP or integration verification along with expertise in protocols like AMBA, PCIe, DDR, USB, Ethernet Experience in EDA tools and scripting (Python, TCL, Perl, Shell) used to build tools and flows for verification environments Experience with revision control systems like Mercurial (Hg), Git or SVN

Posted 6 hours ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Description DV Positions: Define and implement IP/SoC verification plans, build verification test benches to enable IP/sub-system/SoC level verification Develop functional tests based on the verification test plan Drive Design Verification to closure based on defined verification metrics on test plan, functional and code coverage Debug, root-cause and resolve functional failures in the design, partnering with the Design team Qualifications and Skills for DV Positions: Bachelor's or Master's degree in Computer Science, Electronics Engineering or equivalent practical experience 8/10+ years of hands-on experience in SystemVerilog/UVM methodology and/or C/C++ based verification 8/10+ years of experience in IP/sub-system and/or SoC level verification based on SystemVerilog UVM/OVM based methodologies Experience in development of UVM based verification environments from scratch Experience in architecting and implementing Design Verification infrastructure and executing the full verification cycle Experience with verification of ARM/RISC-V based CPU sub-systems or SoCs Experience with IP or integration verification along with expertise in protocols like AMBA, PCIe, DDR, USB, Ethernet Experience in EDA tools and scripting (Python, TCL, Perl, Shell) used to build tools and flows for verification environments Experience with revision control systems like Mercurial (Hg), Git or SVN

Posted 6 hours ago

Apply

9.0 - 14.0 years

35 - 55 Lacs

Noida

Hybrid


Looking for a better opportunity? Join us and make things happen with DMI, an Encora company, now! Encora is seeking a full-time Lead Data Engineer with logistics domain expertise to support our large-scale manufacturing client in digital transformation. The Lead Data Engineer is responsible for the day-to-day leadership and guidance of the local, India-based data team. This role will be the primary interface with the management team of the client and will work cross-functionally with various IT functions to streamline project delivery. Minimum Requirements: • 8+ years of overall experience in IT • 5+ years of current experience on Azure Cloud as a Data Engineer • 3+ years of current hands-on experience on Databricks/Azure Databricks • Proficient in Python/PySpark • Proficient in SQL/T-SQL • Proficient in Data Warehousing Concepts (ETL/ELT, Data Vault Modelling, Dimensional Modelling, SCD, CDC) Primary Skills: Azure Cloud, Databricks, Azure Data Factory, Azure Synapse Analytics, SQL/T-SQL, PySpark, Python + logistics domain expertise Work Location: Noida, India (Candidates who are open to relocation on an immediate basis can also apply) Interested candidates can apply at nidhi.dubey@encora.com along with their updated resume: 1. Total experience: 2. Relevant experience in Azure Cloud: 3. Relevant experience in Azure Databricks: 4. Relevant experience in Azure Synapse: 5. Relevant experience in SQL/T-SQL: 6. Relevant experience in PySpark: 7. Relevant experience in Python: 8. Relevant experience in the logistics domain: 9. Relevant experience in data warehousing: 10. Current CTC: 11. Expected CTC: 12. Official Notice Period (if serving, please specify LWD):

Posted 6 hours ago

Apply

2.0 years

0 Lacs

Gurugram, Haryana, India

Remote


Experience: 2.00+ years Salary: Confidential (based on experience) Shift: (GMT+05:30) Asia/Kolkata (IST) Opportunity Type: Remote Placement Type: Full-time Permanent Position (*Note: This is a requirement for one of Uplers' clients - Forbes Advisor) What do you need for this opportunity? Must-have skills required: GCP, support, Python Forbes Advisor is looking for: Role Summary We are seeking a proactive and detail-oriented Data Support Engineer to monitor production processes, manage incident tickets, and ensure seamless operations in our data platforms. The ideal candidate will have experience in Google Cloud Platform (GCP), Airflow, Python and SQL, with a strong focus on enabling developer productivity and maintaining system reliability. Key Responsibilities: Production Monitoring: Monitor and ensure the smooth execution of production data pipelines and workflows. Identify and promptly address anomalies or failures in the production environment. Perform first-level investigation of issues, leveraging logs and monitoring tools. Incident Management: Create and manage tickets for identified production issues, ensuring accurate documentation of details and impact analysis. Assign tickets to the appropriate development teams and follow up to ensure timely resolution. Communicate incidents within the Data Team. Platform Support: Participate in daily standups and team meetings and contribute to platform improvement initiatives. Contribute to enhancing the platform to streamline development workflows and improve system usability. Required Skills: Bachelor's degree with a minimum of 1 year of experience supporting production pipelines. Proficiency in SQL for debugging tasks. Familiarity with incident management tools like JIRA. Strong communication skills to interact with cross-functional teams and stakeholders. Good to have: Hands-on experience with Google Cloud Platform (GCP) services like BigQuery. 
Strong understanding of Apache Airflow and managing DAGs. Basic understanding of DevOps practices and automating CI/CD pipelines. Python proficiency. Note: This role requires candidates to work UK hours. Saturday and Sunday will be working days; rotational offs will be provided. Qualifications Bachelor's degree (full-time). How to apply for this opportunity? Step 1: Click on Apply! and register or log in on our portal. Step 2: Complete the screening form and upload your updated resume. Step 3: Increase your chances of getting shortlisted and meet the client for the interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal apart from this one. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
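The first-level triage loop described in this listing — opening a ticket for each newly failed pipeline while skipping ones already under investigation — might look roughly like the sketch below. Everything here is hypothetical (pipeline names, statuses, fields); a real team would file these in JIRA rather than an in-memory list:

```python
from dataclasses import dataclass, field
from datetime import datetime

# Minimal stand-in for an incident ticket.
@dataclass
class Incident:
    pipeline: str
    error: str
    opened_at: datetime = field(default_factory=datetime.now)
    assigned_to: str = "unassigned"

def triage_failures(run_statuses, open_incidents):
    """Open one incident per newly failed pipeline, skipping known ones."""
    known = {i.pipeline for i in open_incidents}
    new = [
        Incident(pipeline=name, error=status["error"])
        for name, status in run_statuses.items()
        if status["state"] == "failed" and name not in known
    ]
    return open_incidents + new

# Hypothetical latest run statuses, e.g. as reported by an Airflow DAG run.
runs = {
    "daily_revenue": {"state": "success", "error": ""},
    "vendor_ingest": {"state": "failed", "error": "schema drift in column 7"},
}
incidents = triage_failures(runs, open_incidents=[])
print([i.pipeline for i in incidents])  # ['vendor_ingest']
```

Running the same triage again with the open incident list would open no duplicate ticket for `vendor_ingest`, mirroring the dedup step a support engineer does before assigning tickets out.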

Posted 6 hours ago

Apply

2.0 years

0 Lacs

Gurugram, Haryana, India

On-site


We are looking for a Software Engineer who will be responsible for working on projects that are currently being developed in our company. You will also work closely with clients and cross-functional departments to communicate project status. Location: Gurgaon Salary: As per industry standards Experience: 2-5 years Job Description Responsible for developing, designing, writing, modifying, and debugging software based on specific requirements. Core programming languages: C++, C#, Java, Perl, and Python, with good knowledge of Linux. Hands-on experience with professional software engineering practices and the software development life cycle, including coding standards, code reviews, source control management, and build processes. Analyzing data to effectively coordinate the installation of new systems or the modification of existing systems. Writing issues, epics, and tasks to enable effective software development under an agile methodology Developing code and creating pull requests (Git) Reviewing, commenting, and supporting other developers within the team Skills and Qualifications Automation of processes for continuous development. Ready to solve complex architectural challenges. Designing and coding Strong knowledge of web frameworks (Flask, Django, FastAPI) Fixing bugs and overseeing the development of documents. Must have worked on developing an IVR system Hands-on experience in Perl, Python, REST APIs, MySQL, Redis, MongoDB, Shell, Linux, Docker, Git Excellent communication skills, both written and verbal Educational Requirements: 🎓 UG: B.Tech (CS & IT) - Computers, Other Specialization 🎓 PG: M.Tech - Computers, MCA - Computers, MS If you're passionate about software engineering, thrive in a collaborative environment, and have experience with Dialer system development, we want to hear from you!

Posted 6 hours ago

Apply

2.5 years

0 Lacs

Gurugram, Haryana, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Description This is a DevOps role with Power Platform. We are looking for a Power Platform Developer to join our team. In this role, you will be responsible for designing, developing, and implementing solutions using the Microsoft Power Platform suite, including Power Apps, Power Automate, and Power BI. The ideal candidate should have a strong background in software development, experience with the Microsoft Power Platform, and the ability to create solutions that improve business processes and decision-making. The ideal candidate should: Have 2.5+ years of total experience Develop and maintain applications using Microsoft Power Apps. Automate business processes and workflows using Microsoft Power Automate. Create data visualization and reporting solutions using Power BI. Have working experience with Copilot Studio to build Copilots Collaborate with business stakeholders to gather requirements and deliver custom solutions. Integrate Power Platform solutions with other Microsoft services and third-party applications. Ensure solutions are scalable, maintainable, and secure. Provide technical support and training to users. Stay updated with the latest features and updates in the Microsoft Power Platform. Be able to understand the business process and create process flow diagrams when required. Analyse business problems, design technology solutions to these problems, and manage the solution through the full life cycle. Work in collaboration with project teams to develop, test, integrate, implement, and support solutions using RPA platforms. 
Responsible for ensuring an efficient integration of programming deliverables, timely builds, and overall code quality. Good communication and analytical skills, with the ability to troubleshoot issues independently. The candidate should be ready to work weekend and night shifts. A Power Platform certification is an added advantage. Knowledge of Python or C# is an added advantage. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 6 hours ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Description Summary As a Software Engineer, you will be responsible for designing, building, delivering and maintaining software applications & services, working in the areas of machine, cloud, platform and/or application. You will be responsible for the software lifecycle, including activities such as requirement analysis, documentation/procedures and implementation. GE Healthcare is a leading global medical technology and digital solutions innovator. Our mission is to improve lives in the moments that matter. Unlock your ambition, turn ideas into world-changing realities, and join an organization where every voice makes a difference, and every difference builds a healthier world. Job Description Roles and Responsibilities In This Role, You Will Apply principles of SDLC and methodologies like Lean/Agile/XP, CI, Software and Product Security, Scalability, Documentation Practices, refactoring and Testing Techniques Develop and maintain software applications primarily using C++. Collaborate with system engineers, frontend developers and software developers to implement solutions that are aligned with and extend shared platforms and solutions Compile and build applications on both Linux and Windows systems. Design and implement low-level software components with a strong understanding of design patterns. Break down system designs into class and flow diagrams. Deliver high-quality code with comprehensive unit and automation tests. Collaborate with cross-functional teams to define, design, and ship new features. Troubleshoot, debug, and optimize existing software applications. Understand performance parameters and assess application performance Education Qualification Bachelor's Degree in Computer Science or “STEM” Majors (Science, Technology, Engineering and Math) with a minimum of 3+ years of experience in application development with C++. Desired Characteristics Technical Expertise In-depth knowledge of the latest C++ standards (C++11, C++14, C++17, C++20). 
Familiarity with commonly used C++ libraries such as Boost, STL (Standard Template Library), Qt. Excellent understanding of build methodologies for C++ code for Linux and Windows systems using CMake, Make, and Visual Studio. Experience in writing unit, component, and integration tests using GTest, CppUnit Knowledge of SQL and NoSQL Databases, along with connecting to the databases from C++ applications Additional knowledge of Python including writing clean, efficient, and maintainable Python code with knowledge of basic Python libraries Preferred Qualifications Experience with version control systems, particularly Gitlab Familiarity with Agile development methodologies. Knowledge of continuous integration and continuous deployment (CI/CD) pipelines. Knowledge of Cryptography and Certificate Management is a plus. Business Acumen Has the ability to break down problems and estimate time for development tasks. Understands the technology landscape, up to date on current technology trends and new technology, brings new ideas to the team. Displays understanding of the project's value proposition for the customer. Shows commitment to deliver the best value proposition for the targeted customer. Learns organization vision statement and decision making framework. Able to understand how team and personal goals/objectives contribute to the organization vision Personal/Leadership Attributes Voices opinions and presents clear rationale. Uses data or factual evidence to influence. Learns organization vision statement and decision making framework. Able to understand how team and personal goals/objectives contribute to the organization vision. Completes assigned tasks on time and with high quality. Takes independent responsibility for assigned deliverables. Has the ability to break down problems and estimate time for development tasks. Seeks to understand problems thoroughly before implementing solutions. Asks questions to clarify requirements when ambiguities are present. 
Identifies opportunities for innovation and offers new ideas. Takes the initiative to experiment with new software frameworks Adapts to new environments and changing requirements. Pivots quickly as needed. When coached, responds to need & seeks info from other sources. Write code that meets standards and delivers desired functionality using the technology selected for the project Inclusion and Diversity GE Healthcare is an Equal Opportunity Employer where inclusion matters. Employment decisions are made without regard to race, color, religion, national or ethnic origin, sex, sexual orientation, gender identity or expression, age, disability, protected veteran status or other characteristics protected by law. We expect all employees to live and breathe our behaviors: to act with humility and build trust; lead with transparency; deliver with focus, and drive ownership – always with unyielding integrity. Our total rewards are designed to unlock your ambition by giving you the boost and flexibility you need to turn your ideas into world-changing realities. Our salary and benefits are everything you’d expect from an organization with global strength and scale, and you’ll be surrounded by career opportunities in a culture that fosters care, collaboration and support. Additional Information Relocation Assistance Provided: Yes

Posted 6 hours ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Greetings from TCS! Role: Data Engineer Required Technical Skill Set: Python, SQL, PySpark Desired Experience Range: 5+ years Must-Have · Actively engage in the development, testing, deployment, operation, monitoring and refinement of data services. · Manage incidents/problems, apply fixes and resolve systematic issues; triage issues with stakeholders and identify and implement solutions to restore productivity. · Design, build and implementation experience with data engineering pipelines using SQL, Python, and Databricks (or Snowflake) · Experience with data solutions in the cloud (optional: preferably AWS) as well as on-premises assets like Oracle. · Experience with PySpark is desirable. · Good experience with stored procedures. Responsibility of / Expectations from the Role Deliver ETL data pipelines for business requirements Orchestrate data pipelines to ensure timely delivery of datasets Translate business requirements into technical requirements. Regards Monisha
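As a toy illustration of the ETL flow these responsibilities describe — extract raw rows, transform (clean and aggregate), and load a reporting table — here is a sketch using the standard-library sqlite3 module in place of Databricks or Snowflake. Table and column names are invented for the example:

```python
import sqlite3

# Extract: load raw rows into a staging table (in-memory DB for the demo).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?)",
    [("north", 120.0), ("north", 80.0), ("south", 200.0), (None, 50.0)],
)

# Transform + load: drop rows with a missing region, aggregate per region,
# and materialize the result as a reporting table.
conn.execute(
    "CREATE TABLE sales_by_region AS "
    "SELECT region, SUM(amount) AS total "
    "FROM raw_orders WHERE region IS NOT NULL GROUP BY region"
)

rows = conn.execute(
    "SELECT region, total FROM sales_by_region ORDER BY region"
).fetchall()
print(rows)  # [('north', 200.0), ('south', 200.0)]
```

In a Databricks/PySpark setting the same shape appears as a read into a DataFrame, a filter-and-aggregate transform, and a write to a target table, typically orchestrated so downstream datasets land on schedule.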

Posted 6 hours ago


3.0 years

0 Lacs

Sahibzada Ajit Singh Nagar, Punjab, India

On-site


Job Title: MERN Stack Developer
Location: Mohali, India (On-site)
Experience Required: Minimum 3 years

Job Summary:

We are seeking a highly skilled software engineer with hands-on experience in the MERN stack (MongoDB, Express.js, React.js, Node.js) and strong scripting and automation capabilities in Python. The ideal candidate should be comfortable with server deployments (VPS) and automation tasks, and have a solid understanding of cloud services and DevOps tools.

Key Responsibilities:

  • Develop and maintain scalable web applications using the MERN stack.
  • Write and maintain scripts for automation tasks using Python and relevant frameworks.
  • Manage server deployments in VPS environments, ensuring performance, uptime, and security.
  • Work with Git for version control and collaborative development.
  • Collaborate with the team to build, test, and deploy new features quickly and efficiently.
  • Monitor and improve backend performance.
  • Bonus: Contribute to cloud integration and containerization using Docker, Azure, or Kubernetes.

Required Skills:

  • Strong expertise in Node.js and backend logic.
  • Solid experience with MongoDB and Express.js.
  • Proficiency in Python and automation frameworks/libraries (e.g., Selenium, Requests, Scrapy, Django).
  • Experience with VPS setup, server monitoring, and configuration.
  • Good understanding of Git and version control workflows.
  • Familiarity with REST APIs and webhooks.

Preferred/Bonus Skills:

  • Knowledge of Docker, Azure, Kubernetes, or other cloud technologies.
  • Experience with CI/CD pipelines.
  • Basic Linux server management and shell scripting.

Posted 6 hours ago


Exploring Python Jobs in India

Python has become one of the most popular programming languages in India, with a high demand for skilled professionals across various industries. Job seekers in India have a plethora of opportunities in the field of Python development. Let's delve into the key aspects of the Python job market in India:

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

Average Salary Range

The average salary range for Python professionals in India varies based on experience levels. Entry-level positions can expect a salary between INR 3-6 lakhs per annum, while experienced professionals can earn between INR 8-20 lakhs per annum.

Career Path

In the field of Python development, a typical career path may include roles such as Junior Developer, Developer, Senior Developer, Team Lead, and eventually progressing to roles like Tech Lead or Architect.

Related Skills

In addition to Python proficiency, employers often expect professionals to have skills in areas such as:

  • Data Structures and Algorithms
  • Object-Oriented Programming
  • Web development frameworks (e.g., Django, Flask)
  • Database management (e.g., SQL, NoSQL)
  • Version control systems (e.g., Git)

Interview Questions

  • What is the difference between list and tuple in Python? (basic)
  • Explain the concept of list comprehensions in Python. (basic)
  • What are decorators in Python? (medium)
  • How does memory management work in Python? (medium)
  • Differentiate between __str__ and __repr__ methods in Python. (medium)
  • Explain the Global Interpreter Lock (GIL) in Python. (advanced)
  • How can you handle exceptions in Python? (basic)
  • What is the purpose of the __init__ method in Python? (basic)
  • What is a lambda function in Python? (basic)
  • Explain the use of generators in Python. (medium)
  • What are the different data types available in Python? (basic)
  • Write a Python code to reverse a string. (basic)
  • How would you implement multithreading in Python? (medium)
  • Explain the concept of PEP 8 in Python. (basic)
  • What is the difference between append() and extend() methods in Python lists? (basic)
  • How do you handle circular references in Python? (medium)
  • Explain the use of virtual environments in Python. (basic)
  • Write a Python code to find the factorial of a number using recursion. (medium)
  • What is the purpose of __name__ variable in Python? (medium)
  • How can you create a virtual environment in Python? (basic)
  • Explain the concept of pickling and unpickling in Python. (medium)
  • What is the purpose of the pass statement in Python? (basic)
  • How do you debug a Python program? (medium)
  • Explain the concept of namespaces in Python. (medium)
  • What are the different ways to handle file input and output operations in Python? (medium)
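Several of the basic questions above can be checked directly in the interpreter. The snippet below is a small study aid (not an official answer key) illustrating four of them: list vs. tuple mutability, append() vs. extend(), reversing a string, and a recursive factorial.

```python
# List vs. tuple: lists are mutable, tuples are not.
nums = [1, 2, 3]
nums[0] = 10          # fine: lists support item assignment
point = (1, 2, 3)
try:
    point[0] = 10     # tuples reject item assignment
except TypeError:
    print("tuples are immutable")

# append() adds its argument as a single element;
# extend() adds each element of an iterable.
a = [1, 2]
a.append([3, 4])
b = [1, 2]
b.extend([3, 4])
print(a)              # [1, 2, [3, 4]]
print(b)              # [1, 2, 3, 4]

# Reversing a string with slice notation.
def reverse_string(s: str) -> str:
    return s[::-1]

# Factorial using recursion.
def factorial(n: int) -> int:
    return 1 if n <= 1 else n * factorial(n - 1)

print(reverse_string("python"))  # nohtyp
print(factorial(5))              # 120
```

Working through the remaining questions the same way, with short runnable snippets, is good interview preparation.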

Closing Remark

As you explore Python job opportunities in India, remember to brush up on your skills, prepare for interviews diligently, and apply confidently. The demand for Python professionals is on the rise, and this could be your stepping stone to a rewarding career in the tech industry. Good luck!
