
62804 Python Jobs - Page 26

JobPe aggregates listings for easy access, but you apply directly on the source job portal.

8.0 - 12.0 years

13 - 20 Lacs

Prayagraj, Allahabad

Work from Office

Naukri logo

Key Responsibilities:
- OSI PI System Development: Design, configure, and implement OSI PI System solutions, including PI Asset Framework (AF), PI Data Archive, PI Vision, and PI Integrators.
- Asset Framework (AF) Modeling: Develop hierarchical asset models, templates, and calculations to standardize data across industrial operations.
- Real-time Data Integration: Work with SCADA, DCS, PLC, and IoT systems to integrate real-time and historical data into OSI PI.
- Scripting & Automation: Develop scripts using PowerShell, Python, or PI SDKs (AF SDK, PI Web API, or PI SQL DAS) to automate data processes.
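The scripting side of this role revolves around automating reads from the PI Data Archive. As a minimal, self-contained sketch (plain Python, not any specific PI SDK; the function and sample events are illustrative), here is the kind of interpolated-value read a PI archive performs between recorded events:

```python
from datetime import datetime, timedelta

def interpolate_at(events, ts):
    """Linearly interpolate a value at timestamp ts from sorted
    (timestamp, value) events, mimicking a PI-style 'interpolated' read."""
    for (t0, v0), (t1, v1) in zip(events, events[1:]):
        if t0 <= ts <= t1:
            frac = (ts - t0) / (t1 - t0)  # timedelta / timedelta -> float
            return v0 + frac * (v1 - v0)
    raise ValueError("timestamp outside recorded range")

base = datetime(2024, 1, 1)
events = [(base, 10.0), (base + timedelta(minutes=10), 30.0)]
print(interpolate_at(events, base + timedelta(minutes=5)))  # 20.0
```

In a real deployment the `events` list would come from an archive query (e.g., a PI Web API recorded-values call) rather than being hard-coded.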

Posted 6 hours ago

Apply

10.0 - 17.0 years

16 - 25 Lacs

Hyderabad

Work from Office

Naukri logo

Hiring for a Modeling Engineer with an MNC company at Bengaluru. Experience: 10+ years. Location: Hyderabad.
Responsibilities:
- Perform power system stability (PSS) studies, load flow, short circuit, and transient stability analyses.
- Develop and validate dynamic models for generators, governor-turbine systems, AVRs, and other control systems.
- Conduct system impact studies, grid compliance assessments, and interconnection studies.
- Build and maintain detailed power system models using ETAP, PowerFactory, PSCAD, and PSSE.
- Interpret and apply grid codes and standards (e.g., ENTSO-E, NERC, National Grid).
- Collaborate with cross-functional teams including electrical, controls, and systems engineers.
- Prepare technical reports, documentation, and presentations for internal, customer, and compliance audiences.
- Support project delivery from feasibility through to commissioning.
Required Qualifications:
- Proven experience (typically 10+ years) in power system modelling and simulation.
- Proficiency in DIgSILENT PowerFactory, PSCAD, ETAP, and PSSE.
- Strong understanding of power system dynamics, control systems, and grid integration.
- Experience with governor, turbine, AVR, and generator model development.
- Familiarity with scripting or automation in Python, MATLAB, or similar tools is a plus.
Please send your CV to sindhura@delighthr.com
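Several of the listed studies reduce to per-unit network calculations. As a hedged, minimal illustration (a standard textbook relation, not any employer's method; the base values are hypothetical), a symmetrical three-phase fault current from a Thevenin impedance given in per unit:

```python
import math

def short_circuit_current(s_base_mva, v_base_kv, z_pu):
    """Symmetrical three-phase fault current in kA at a bus whose
    Thevenin impedance is z_pu on the given MVA/kV base."""
    i_base_ka = s_base_mva / (math.sqrt(3) * v_base_kv)  # base current, kA
    return i_base_ka / z_pu

# 100 MVA base, 11 kV bus, Z_th = 0.2 pu
print(round(short_circuit_current(100, 11, 0.2), 1))  # ~26.2 kA
```

Tools like PowerFactory or PSSE perform far richer versions of this (sequence networks, machine dynamics), but the per-unit arithmetic above is the common foundation.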

Posted 6 hours ago

Apply

20.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Responsibilities:
- Lead the program end-to-end, from architecture to tapeout.
- Work with systems teams to define architecture and design specifications.
- Work with design teams to define micro-architecture and design implementation (module level to SoC level).
- Ensure compliance with the company's design process and design flows.
- Ensure design maturity at intermediate milestones, maintaining schedule cognizance and adhering to defined quality metrics throughout the development process.
- Oversee the development of RTL code and ensure its correctness.
- Review the verification scope/plan, including functional, coverage-driven, and power-aware verification.
- Review the emulation plan/scope to help define tapeout-gating items.
- Coordinate with DFT and physical design teams to ensure successful implementation and tapeout.
- Conduct reviews of performance tests to evaluate system stability, scalability, and reliability.
- Address the complexity of multi-die solutions, including inter-die communication, power management, and thermal challenges.
- Design and implement chips targeting datacenter applications, focusing on high performance, low power consumption, and scalability.
- Collaborate with cross-functional teams across sites for smooth execution of the program, including post-silicon debug.
- Provide technical guidance and mentorship to junior engineers.
- Ensure compliance with industry standards and best practices.
What We're Looking For:
- Bachelor's or Master's degree in Electrical Engineering, Computer Science Engineering, or a related field.
- 20+ years of technical experience with 5+ tapeouts as a chip lead.
- Proven experience leading RTL-to-GDSII programs.
- Strong background in RTL design, verification methodologies, and physical design.
- Proficiency in scripting languages (Perl/Python) and EDA tools.
- Solid understanding of system and processor architecture, advanced memory, and I/O technologies.
- Experience with multi-die solutions and addressing their associated complexities.
- Knowledge of designing chips for datacenter applications, including performance optimization and power management.
- Excellent self-motivation, communication, problem-solving, and teamwork skills.

Posted 6 hours ago

Apply

3.0 - 6.0 years

25 - 40 Lacs

Pune

Work from Office

Naukri logo

SDE 2/3 - Backend
Experience: 3-6 years | Salary: Competitive | Preferred Notice Period: Within 30 days | Shift: 10:00 AM to 7:00 PM IST | Opportunity Type: Onsite (Pune) | Placement Type: Permanent
(Note: This is a requirement for one of Uplers' clients.)
Must-have skills: Python OR Node.js
The Souled Store (one of Uplers' clients) is looking for an SDE 2/3 - Backend who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, we want to hear from you.
About Us: The Souled Store is one of India's fastest-growing youth-focused lifestyle brands. Founded in 2013 as a homegrown Indian brand, we have become one of the largest online merchandising platforms, partnering with global licenses such as Disney, Warner Bros., WWE, IPL, and Viacom18. Our offerings extend beyond themed designs like superheroes, movies, TV shows, and cartoons, as we continue to lead the latest youth fashion and lifestyle trends. Expanding beyond apparel, we are growing across categories such as action figures, backpacks, collectibles, footwear, and kidswear. While primarily an online brand, we have successfully ventured into offline retail with 40 stores across India and ambitious plans for further expansion. At The Souled Store, we believe in loving what you do, working smart, and taking ownership. We have built a strong, dynamic team of like-minded individuals who think like leaders and drive growth aggressively. If you resonate with our vision and want to be part of an evolving brand, we would love to have you on board.
Position Overview: We are looking for a highly skilled senior backend developer to join our team. The ideal candidate will bring extensive expertise in backend systems, cloud-native applications, and microservices, along with a strong track record of building scalable systems. If you are passionate about developing robust architectures and driving technical innovation, we'd love to hear from you.
Responsibilities:
- Design, develop, and maintain backend systems and cloud-native applications.
- Architect and implement scalable microservices using Go, Node.js, or Spring Boot.
- Leverage AWS cloud services to build, deploy, and monitor applications.
- Optimise systems for high availability, scalability, and performance.
- Work with Kafka, Redis, and Spark to manage real-time data pipelines and caching mechanisms.
- Design database solutions using MySQL and NoSQL technologies for efficient data storage and retrieval.
- Collaborate with cross-functional teams to integrate payment gateways and ensure seamless transaction processing (experience desirable).
- Contribute to the architectural design of systems to meet e-commerce and high-scale system demands.
- Write and maintain clean, reusable code with Python (desirable but not mandatory).
- Drive best practices for CI/CD pipelines and automated deployments.
- Mentor junior engineers and actively contribute to the team's technical growth.
Required Qualifications:
- 3-6 years of experience in software engineering, with a focus on backend development and microservices architecture.
- Proficiency in one or more of the following: Go, Node.js, Python, or Spring Boot.
- Deep understanding of AWS services (e.g., S3, RDS, Lambda, EC2).
- Proven experience in designing systems for scale and high performance.
- Hands-on experience with Kafka, Redis, Spark, and other distributed technologies.
- Strong knowledge of MySQL and NoSQL databases.
- Experience with system architecture design and implementation.
- Familiarity with e-commerce platforms is highly desirable.
- Experience with payment gateway integration is a plus.
- Strong problem-solving skills and the ability to work in fast-paced environments.
How to apply - easy 3-step process:
1. Click on Apply and register or log in on our portal.
2. Upload your updated resume and complete the screening form.
3. Increase your chances of getting shortlisted and meet the client for the interview!
About Our Client: In 2013, a group of passionate individuals set out to build something different: a brand that wasn't just about fashion but about self-expression, individuality, and a love for pop culture. That's how The Souled Store was born. Today, we're one of India's leading homegrown brands, offering everything from licensed pop culture merch (Marvel, Disney, DC, and way more) to high-quality fashion. At The Souled Store, we believe that loving what you do is the key to doing it well. Our team is driven by creativity, innovation, and a deep passion for building a brand that resonates with India's youth.
About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role is to help our talents find and apply for relevant product and engineering job opportunities and progress in their careers. (Note: There are many more opportunities on the portal.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
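The Redis caching responsibility in this role typically follows a cache-aside pattern: check the cache, fall back to the source of truth on a miss, then populate the cache with a TTL. A minimal sketch, using an in-memory dict as a stand-in for redis-py's `get`/`setex` calls (the loader, key names, and `CacheAside` class are all hypothetical):

```python
import time

class CacheAside:
    """Cache-aside sketch: the dict stands in for Redis; in production
    the lookup/store lines would be redis-py r.get / r.setex calls."""
    def __init__(self, loader, ttl_seconds=60):
        self.loader = loader          # source-of-truth fetch (e.g., MySQL)
        self.ttl = ttl_seconds
        self.store = {}               # key -> (expires_at, value)

    def get(self, key):
        hit = self.store.get(key)
        if hit and hit[0] > time.monotonic():
            return hit[1]                              # cache hit
        value = self.loader(key)                       # miss: hit the DB
        self.store[key] = (time.monotonic() + self.ttl, value)
        return value

calls = []
cache = CacheAside(lambda k: calls.append(k) or f"row:{k}")
print(cache.get("sku-1"), cache.get("sku-1"), len(calls))  # row:sku-1 row:sku-1 1
```

The second `get` is served from the cache, so the loader runs only once; the TTL bounds staleness when the underlying row changes.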

Posted 6 hours ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Description: The JP Economics and Decision Science team is a central science team working across a variety of topics in the Retail business and beyond. We work closely with business leaders to drive change at Amazon. We focus on solving long-term, ambiguous, and challenging problems, while providing advisory support to help solve short-term business pain points. Key topics include pricing, product selection, delivery speed, profitability, and customer experience. We tackle these issues by building novel economic/econometric models, machine learning systems, and high-impact experiments, which we integrate into business, financial, and system-level decision making. We are looking for a Business Intelligence Engineer to work alongside our scientists to build statistical and ML models, working directly with senior leaders to tackle their most critical problems in the Retail business.
Key job responsibilities:
- Drive business change through data insights across a wide range of retail business inputs.
- Explore/analyze data and work with product managers to understand customer behaviors; develop complex data pipelines.
- Work closely with our scientists to design scalable model architectures.
- Identify appropriate metrics and feature sets for the problem, and build/maintain automated pipelines for their extraction.
- Foster a culture of continuous engineering improvement through mentoring and feedback.
Basic Qualifications:
- 5+ years of SQL experience.
- Experience programming to extract, transform, and clean large (multi-TB) data sets.
- Experience with the theory and practice of design of experiments and statistical analysis of results.
- Experience with AWS technologies.
- Experience in scripting for automation (e.g., Python) and advanced SQL skills.
- Experience with the theory and practice of information retrieval, data science, machine learning, and data mining.
- 8+ years of analytical experience.
Preferred Qualifications:
- Experience working directly with business stakeholders to translate between data and business needs.
- Experience managing, analyzing, and communicating results to senior leadership.
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
Company: ADCI - Karnataka - A66 | Job ID: A2993715
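The design-of-experiments qualification usually comes down to significance testing of experiment results. A minimal stdlib-only sketch of a pooled two-proportion z-test for an A/B conversion comparison (the sample counts are hypothetical):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Pooled z-statistic for comparing two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)            # pooled rate
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# control: 200/10,000 (2.0%); treatment: 250/10,000 (2.5%)
z = two_proportion_z(200, 10_000, 250, 10_000)
print(round(z, 2))
```

A z above roughly 1.96 corresponds to a two-sided p below 0.05; in practice a scipy or statsmodels routine would supply the p-value directly.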

Posted 6 hours ago

Apply

0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Linkedin logo

Required Skills:
- Strong working knowledge of modern programming languages, ETL/data integration tools (preferably SnapLogic), and cloud concepts; SSL/TLS, SQL, REST, JDBC, JavaScript, JSON.
- Strong hands-on experience in SnapLogic design/development.
- Good working experience using various snaps for JDBC, SAP, Files, REST, SOAP, etc.
- Good to have: the ability to build complex mappings with JSON path expressions, flat files, and Python scripting.
- Good to have: experience in Groundplex and Cloudplex integrations.
- Should be able to deliver the project by leading a 6-8 member team.
- Should have experience in integration projects with heterogeneous landscapes.
- Experience in one or more RDBMS (Oracle, DB2, SQL Server, PostgreSQL, and Redshift).
- Real-time experience working with OLAP and OLTP database models (dimensional models).
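JSON path expressions in an integration mapper resolve a dotted path against a parsed document. A rough stdlib approximation of that idea (the `json_path` helper and sample payload are illustrative, not SnapLogic's actual expression engine):

```python
import json
from functools import reduce

def json_path(doc, path):
    """Resolve a dotted path like 'order.items.0.sku' against parsed JSON,
    treating numeric segments as list indices."""
    def step(node, key):
        return node[int(key)] if isinstance(node, list) else node[key]
    return reduce(step, path.split("."), doc)

payload = json.loads('{"order": {"items": [{"sku": "A-42", "qty": 2}]}}')
print(json_path(payload, "order.items.0.sku"))  # A-42
```

Real JSON-path dialects add wildcards, filters, and slices, but the core is this walk from the document root one segment at a time.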

Posted 6 hours ago

Apply

1.0 - 3.0 years

6 - 12 Lacs

Pune

Work from Office

Naukri logo

Role & Responsibilities
Job Overview: We are looking for a highly motivated Junior Data Engineer with a passion for web scraping and web crawling to join our team. The ideal candidate will have strong Python programming skills and experience with web scraping frameworks and libraries such as Requests, BeautifulSoup, Selenium, Playwright, or urllib. You will be responsible for building efficient and scalable web scrapers, extracting valuable data, and ensuring data integrity. This role requires a keen eye for problem solving, the ability to work with complex data structures, and a strong understanding of web technologies such as HTML, CSS, the DOM, XPath, and regular expressions. Knowledge of JavaScript would be an added advantage.
Responsibilities:
- Apply your knowledge to fetch data from multiple online sources.
- Develop highly reliable web scrapers and parsers across various websites.
- Extract structured/unstructured data and store it in SQL/NoSQL data stores.
- Work closely with project/business/research teams to provide scraped data for analysis.
- Maintain the scraping projects delivered to production.
- Develop frameworks for automating and maintaining a constant flow of data from multiple sources.
- Work independently with minimum supervision.
- Develop a deep understanding of the data sources on the web and know exactly how, when, and which data to scrape, parse, and store.
Required Skills and Experience:
- 1 to 2 years of experience as a web scraper.
- Proficient knowledge of Python and working knowledge of web crawling/web scraping in Python using Requests, BeautifulSoup or urllib, and Selenium or Playwright.
- Strong knowledge of basic Linux commands for system navigation, management, and troubleshooting.
- Expertise in proxy usage to ensure secure and efficient network operations.
- Experience with captcha-solving techniques for seamless automation and data extraction.
- Experience with data parsing; strong knowledge of regular expressions, HTML, CSS, the DOM, and XPath. Knowledge of JavaScript would be a plus.
Preferred Candidate Profile:
- Must be able to access, manipulate, and transform data from a variety of database and flat-file sources. MongoDB and MySQL skills are essential.
- Must be able to develop reusable code-based scraping products which can be used by others.
- Git knowledge is mandatory for version control and collaborative development workflows.
- Experience handling cloud servers on platforms like AWS, GCP, and Leapswitch for scalable and reliable infrastructure management.
- Ability to ask the right questions and deliver the right results in a way that is understandable and usable to your clients.
- A track record of digging into tough problems, attacking them from different angles, and bringing innovative approaches to bear is highly desirable. Must be capable of self-teaching new techniques.
Behavioural Expectations:
- Be excited by, and keep a positive outlook while navigating, ambiguity.
- Passion for results and excellence.
- Team player: must be able to get the job done by working collaboratively with others.
- Inquisitive and analytical mind; out-of-the-box thinking.
- Prioritize among competing opportunities, balance consumer needs with business and product priorities, and clearly articulate the rationale behind product decisions.
- Straightforward and professional.
- Good communicator.
- Maintains high energy and motivates others.
- A do-it-yourself orientation, consistent with the company's roll-up-the-sleeves culture.
- Proactive.
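The parsing skills this role asks for can be exercised with nothing but the standard library. A minimal sketch using stdlib `html.parser` to pull links out of markup (the sample HTML and `LinkExtractor` name are hypothetical; in a real scraper the page would come from a urllib or Requests fetch):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect (href, text) pairs from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links, self._href = [], None

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")   # remember the open <a>

    def handle_data(self, data):
        if self._href:                             # text inside the <a>
            self.links.append((self._href, data.strip()))
            self._href = None

html = '<ul><li><a href="/jobs/1">Data Engineer</a></li></ul>'
parser = LinkExtractor()
parser.feed(html)
print(parser.links)  # [('/jobs/1', 'Data Engineer')]
```

Libraries like BeautifulSoup or lxml wrap this event-driven parsing in a friendlier tree API, and Selenium/Playwright take over when the page is rendered by JavaScript.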

Posted 6 hours ago

Apply

3.0 - 8.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Naukri logo

The FICC Quant Developer should possess a robust understanding of fixed income, particularly pricing calculations. The ideal candidate should be proficient in Python or other relevant programming languages and have experience in developing fixed income pricing models or calculation engines.
Key Responsibilities:
- Develop fixed income pricing/valuation and analytics models using statistical techniques.
- Interact with the trading and client senior technology teams to analyze and understand their requirements.
- Back-test models.
- Work on implementing models on client calculation platforms; understand data-quality nuances and design rules around them.
- Handle ad hoc requests for data analysis or building peripheral models.
Experience:
- Postgraduate in Economics/Financial Engineering/Maths/Physics, with at least 8+ years of work experience.
- Proficient in econometrics, with prior experience in quantitative modeling, specifically in fixed income.
- Good understanding of fixed income as an asset class, pricing and valuation, and other analytics.
- Coding skills: Python/C#/C++ etc. (advanced level).
- Good knowledge of databases, preferably third-party providers like Bloomberg.
- Advanced Excel/VBA and quantitative skills: ability to work with and analyze large sets of data and information.
- Excellent communication and interpersonal skills.
- High level of independent thinking and approach.
Behavioral Competencies:
- Good communication (verbal and written).
- Experience in managing client stakeholders.
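At its core, a fixed income pricing engine discounts each cash flow at the periodic yield. A minimal sketch of a coupon bond pricer on a clean, on-coupon date (the standard textbook formula, not any client's model; day-count and accrual conventions are deliberately omitted):

```python
def bond_price(face, coupon_rate, ytm, years, freq=2):
    """Price a coupon bond by discounting each coupon and the redemption
    amount at the periodic yield (ytm/freq), semi-annual by default."""
    n = years * freq                     # number of coupon periods
    c = face * coupon_rate / freq        # periodic coupon payment
    y = ytm / freq                       # periodic yield
    pv_coupons = sum(c / (1 + y) ** t for t in range(1, n + 1))
    pv_face = face / (1 + y) ** n
    return pv_coupons + pv_face

# sanity check: when coupon == yield, a bond prices at par
print(round(bond_price(100, 0.05, 0.05, 10), 6))  # 100.0
```

Production engines layer on day counts, settlement accrual, curve-based discounting, and spread measures, but each reduces to this present-value skeleton.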

Posted 6 hours ago

Apply

4.0 - 9.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

We're building the technological foundation for our company's Semantic Layer, a common data language powered by Anzo / Altair Graph Studio. As a Senior Software Engineer, you'll play a critical role in setting up and managing this platform on AWS EKS, enabling scalable, secure, and governed access to knowledge graphs, parallel processing engines, and ontologies across multiple domains, including highly sensitive ones like clinical trials. You'll help design and implement a multi-tenant, cost-aware, access-controlled infrastructure that supports internal data product teams in securely building and using connected knowledge graphs.
Key Responsibilities:
- Implement a Semantic Layer on Anzo / Altair Graph Studio and Anzo Graph Lakehouse in a Kubernetes or ECS environment (EKS / ECS).
- Develop and manage Infrastructure as Code using Terraform and configuration management via Ansible.
- Integrate platform authentication and authorization with Microsoft Entra ID (Azure AD).
- Design and implement multi-tenant infrastructure patterns that ensure domain-level isolation and secure data access.
- Build mechanisms for cost attribution and usage visibility per domain and use-case team.
- Implement fine-grained access control, data governance, and monitoring for domains with varying sensitivity (e.g., clinical trials).
- Automate deployment pipelines and environment provisioning for dev, test, and production environments.
- Collaborate with platform architects, domain engineers, and data governance teams to curate and standardize ontologies.
Minimum Requirements:
- 4-9 years of experience in software/platform engineering, DevOps, or cloud infrastructure roles.
- Proficiency in Python for automation, tooling, or API integration.
- Hands-on experience with AWS EKS / ECS and associated services (IAM, S3, CloudWatch, etc.).
- Strong skills in Terraform / Ansible / IaC for infrastructure provisioning and configuration.
- Familiarity with RBAC, OIDC, and Microsoft Entra ID integration for enterprise IAM.
- Understanding of Kubernetes multi-tenancy and security best practices.
- Experience building secure and scalable platforms supporting multiple teams or domains.
Preferred Qualifications:
- Experience deploying or managing Anzo, Altair Graph Studio, or other knowledge graph / semantic layer tools.
- Familiarity with RDF, SPARQL, or ontologies in an enterprise context.
- Knowledge of data governance, metadata management, or compliance frameworks.
- Exposure to cost management tools like AWS Cost Explorer / Kubecost or custom chargeback systems.
Why Join Us:
- Be part of a cutting-edge initiative shaping enterprise-wide data access and semantics.
- Work in a cross-functional, highly collaborative team focused on responsible innovation.
- Influence the architecture and strategy of a foundational platform from the ground up.

Posted 6 hours ago

Apply

5.0 - 10.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Naukri logo

As a Senior DevOps Engineer with expertise in Azure and AWS, you will play a pivotal role in driving the deployment, automation, and continuous improvement of cloud-based platforms supporting R&D efforts in drug discovery and development. You will bridge software development, IT operations, and quality assurance, ensuring a reliable, scalable, and automated infrastructure that accelerates deployment cycles and minimizes downtime.
Key responsibilities include:
- Spearheading the deployment of new releases, ensuring seamless updates that enhance platform performance and capabilities.
- Managing and maintaining Azure AD (Entra ID) for identity and access management, including authentication flows for both internal and third-party services.
- Advocating and implementing DevOps best practices, including CI/CD, automation, and observability across our platforms.
- Collaborating with cross-functional teams to influence architectural decisions and align infrastructure with business goals.
- Ensuring compliance and security within a regulated environment (GxP preferred) by implementing role-based access control (RBAC), secrets management, and monitoring frameworks.
You will join a multicultural, agile team with high autonomy in decision-making. This role requires on-site presence at our Bangalore location.
Your Skills & Experience
Must-Have:
- 5+ years of experience in DevOps, cloud infrastructure, and automation.
- Strong expertise in Azure DevOps, including project configurations, repositories, pipelines, and environments.
- Proficiency in Azure AD (Entra ID), including app registration, authentication flows (OBO, client credentials), and access control.
- Experience with AWS services such as VPC, IAM, STS, EKS, RDS, EC2, ECS, Route 53, CloudWatch, CloudTrail, Secrets Manager, S3, API Gateway, Lambda, and MWAA.
- Hands-on experience with containerization and Kubernetes, including Docker, Helm, and ArgoCD (GitOps).
- Strong scripting and automation skills in Python for workflow automation and integration.
- Proficiency in Infrastructure as Code (IaC) using Terraform and AWS CloudFormation.
- Experience with configuration management tools such as Ansible for automating service provisioning and management.
- Strong experience in observability, including logging, monitoring, and tracing with tools like Prometheus, Grafana, the ELK Stack, or AWS-native solutions.
- Excellent understanding of security best practices in DevOps, including RBAC, secrets management, and compliance frameworks.
- Ability to work in multinational teams across the US, Europe (primarily Germany), and India.
- Strong communication skills in English for both technical and non-technical stakeholders.
Good to Have:
- Experience in a regulated environment (GxP, pharma, or life sciences).
- Knowledge of Next.js, Storybook, Tailwind, and TypeScript for front-end web development.
- Experience with PostgreSQL and OpenSearch/Elasticsearch for data management.
- Familiarity with R and SAS for data analysis and statistical modeling.
- Understanding of AWS billing practices and cost optimization tools.
Why Join Us?
- Work in a high-impact role contributing to cutting-edge R&D in drug discovery and development.
- Be part of a multicultural, agile team with high autonomy in decision-making.
- Exposure to a diverse tech stack combining Azure, AWS, Kubernetes, Python, and CI/CD tools.
- Opportunities for career growth and skill development in cloud computing, security, and automation.
If you are curious, ambitious, and eager to take on this challenge, apply today and be at the forefront of shaping the future of Data & Analytics in Healthcare IT.
Your role: You will be responsible for the target IT systems for incident, change, and release management; communicating with customers, service providers, and other stakeholders; and delivering system and support services while ensuring that they meet business requirements.
Perform system management tasks. The employee is expected to act independently to meet quality levels for system user satisfaction, system owner interactions/stakeholder management, and relevant GxP requirements.
Qualifications:
- Bachelor's or master's degree with 5+ years of experience and proficiency within IT; 3+ years of working in the pharmaceutical or another regulated industry.
- Strong experience in change and release management, along with incident and problem management.
- Strong compliance understanding and documentation experience (preferably knowledge of GxP).
- Global stakeholder management experience; preferably knowledge of Agile ways of working.
- Experienced in the use of office desktop applications and tools; knowledge of compliance, validation, and documentation.
- High degree of flexibility and good interpersonal skills; excellent written and verbal communication skills in English.
- Design, develop, test, and document business requirements.
- Knowledgeable in Linux, AWS, and Windows Server; understanding of IT landscapes and keen to work on diverse IT systems.
- Autonomous review and clarification of business requirements before implementation; creation and documentation of technical design/entity relationships, including alignment with stakeholders.
- Autonomous coordination of test activities with business and IT stakeholders.
- Good acumen for business functions with a service delivery mindset; flexible for any IT task spanning new technology, support, and development.
- Understanding of the basic global infrastructure of the AWS Cloud and knowledge of the core AWS services, including compute, networking, storage, and databases.
- Familiarity with AWS architectural principles, such as the AWS Well-Architected Framework, which includes security, reliability, performance efficiency, operational excellence, and cost optimization.
- Understanding of the basic functions and use cases of primary AWS services such as Amazon EC2, Amazon S3, Amazon RDS, and Amazon VPC.
- Knowledge of the AWS shared responsibility model and basic security and compliance aspects of the AWS platform, including AWS Identity and Access Management (IAM), Amazon CloudWatch, and AWS Trusted Advisor.
- Basic understanding of deploying applications in the AWS Cloud and managing operational services.
- Awareness of technical assistance resources provided by AWS, including AWS support plans and the AWS Knowledge Center.
- Familiarity with Unix servers (a must), Oracle PL/SQL, and Windows servers.
- Understanding of billing practices and how to interpret billing documents; knowledge of the different pricing models for various AWS services and how to use cost management tools.
This position offers the chance to work in a global environment and enjoy the atmosphere of an open-minded international team.

Posted 6 hours ago

Apply

2.0 - 4.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Linkedin logo

dunnhumby is the global leader in Customer Data Science, empowering businesses everywhere to compete and thrive in the modern data-driven economy. We always put the Customer First. Our mission: to enable businesses to grow and reimagine themselves by becoming advocates and champions for their Customers. With deep heritage and expertise in retail – one of the world’s most competitive markets, with a deluge of multi-dimensional data – dunnhumby today enables businesses all over the world, across industries, to be Customer First. dunnhumby employs nearly 2,500 experts in offices throughout Europe, Asia, Africa, and the Americas working for transformative, iconic brands such as Tesco, Coca-Cola, Meijer, Procter & Gamble and Metro. Most companies try to meet expectations, dunnhumby exists to defy them. Using big data, deep expertise and AI-driven platforms to decode the 21st century human experience – then redefine it in meaningful and surprising ways that put customers first. Across digital, mobile and retail. For brands like Tesco, Coca-Cola, Procter & Gamble and PepsiCo. We’re looking for an Applied Data Scientist who expects more from their career. It’s a chance to apply your expertise to distil complex problems into compelling insights using the best of machine learning and human creativity to deliver effective and impactful solutions for clients. Joining our advanced data science team, you’ll investigate, develop, implement and deploy a range of complex applications and components while working alongside super-smart colleagues challenging and rewriting the rules, not just following them. 
What We Expect From You 2- 4 years of experience required Degree in Statistics, Maths, Physics, Economics or similar field Programming skills (Python and SQL are a must have) Analytical Techniques and Technology Logical thinking and problem solving Strong communication skills Experience with and passion for connecting your work directly to the customer experience, making a real and tangible impact Statistical Modelling and experience of applying data science into client problems. What You Can Expect From Us We won’t just meet your expectations. We’ll defy them. So you’ll enjoy the comprehensive rewards package you’d expect from a leading technology company. But also, a degree of personal flexibility you might not expect. Plus, thoughtful perks, like flexible working hours and your birthday off. You’ll also benefit from an investment in cutting-edge technology that reflects our global ambition. But with a nimble, small-business feel that gives you the freedom to play, experiment and learn. And we don’t just talk about diversity and inclusion. We live it every day – with thriving networks including dh Gender Equality Network, dh Proud, dh Family, dh One and dh Thrive as the living proof. Everyone’s invited. Our approach to Flexible Working At dunnhumby, we value and respect difference and are committed to building an inclusive culture by creating an environment where you can balance a successful career with your commitments and interests outside of work. We believe that you will do your best at work if you have a work / life balance. Some roles lend themselves to flexible options more than others, so if this is important to you please raise this with your recruiter, as we are open to discussing agile working opportunities during the hiring process. For further information about how we collect and use your personal information please see our Privacy Notice which can be found (here) What You Can Expect From Us We won’t just meet your expectations. We’ll defy them. 
Our diversity networks also include dh Enabled. We want everyone to have the opportunity to shine and perform at their best throughout our recruitment process. Please let us know how we can make this process work best for you.

Posted 6 hours ago

Apply

6.0 - 11.0 years

4 - 8 Lacs

Bengaluru

Work from Office


As a Senior Automation Analyst, you will have the unique opportunity to influence and drive decisions with an advanced logical mindset, a process-oriented approach, and a background in leveraging technologies to drive results. Provides primary support for assigned systems and applications. Works independently with limited support. Prior experience in the Healthcare domain will be an added advantage. Excellent interpersonal and competent analytical skills. Strong team-player attitude to deliver necessary requirements on time, and the ability to work through obstacles under challenging circumstances. Should be able to work in shifts as required by the project. Be self-motivated and eager to engage in high-impact, challenging engineering problems while continuously raising the standard of quality in our products and services. Willing to adopt new technologies and skills in automation.

Key Responsibilities: Analyze business processes and assess their automation potential. Expertise in end-to-end development of bots and implementing the full life cycle of process automation solutions: identify automation opportunities, design solutions, then develop, test, and deploy using RPA, chatbots, IDP, test automation, etc. Expertise in designing and implementing intelligent automation solutions using Machine Learning, Deep Learning, Python, etc. Ability to build reusable components which can be shared and used across different projects to save effort. Responsible for setting a strategic direction for automation and managing day-to-day operations, establishing a high-performing service delivery organization. Collaborates with others on the project to brainstorm and find efficient ways to tackle technical, infrastructure, security, or development problems. Plans and accompanies the smooth transition from project to stable operations from a technical perspective. Demonstrates technical leadership with incident handling and troubleshooting. Ensures adequate technical project documentation.
Develops the automation portfolio, gathering requirements from the functions / cross-functions. Actively manages and monitors the overall project schedule and milestones. Coordinates tasks and communication both within and outside of the project team, ensuring communication with all relevant stakeholders. Pro-actively assesses and mitigates project risks. Ability to familiarize himself / herself independently with entirely new subjects and thus provide comprehensive and innovative solutions.

Minimum Qualifications: Minimum of 6+ years of experience working with UiPath. Minimum 6 years of experience with automation technologies such as UiPath's RPA, Automation Hub, Process Mining, Insights, Document Understanding, AI Center, Action Center, the RASA framework, AI/ML, NLP, Microsoft's Power Automate, etc. Experience in UiPath's test automation solutions. Experience of development in agile methodology and following best practices. Experience in project tracking and defect management tools such as JIRA, HP ALM, etc. Experience in one of the standard scripting languages such as Python, .NET, PL/SQL, etc. Experience in managing and maintaining applications hosted on cloud. Debugging and troubleshooting skills, with an enthusiastic attitude to support and resolve problems. Working knowledge of information security and best practices. Service delivery management mindset. Minimum of a Bachelor's degree in information systems, or similar applicable experience. Excellent written and verbal communication skills.

Preferred Qualifications: Experience in building dashboards and aggregating metrics. Experience building GenAI & RAG solutions using Azure OpenAI. Flexibility to adapt to new technologies and drive initiatives to integrate these technologies with the existing platforms/technologies available. Exposure to working with RASA or other chatbot technologies. Exposure to large-scale systems and application architectures. Experience in creating and managing technical documents.
Experience in GxP processes will be an added advantage. Linux and Windows scripting (Bash and PowerShell). Working knowledge of network topology and communications protocols, e.g. TCP/IP, Telnet, SMTP, FTP. General DevOps automation: build, deployment, and configuration of multiple environments.
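The responsibilities above call for reusable automation components that can be shared across projects. Purely as an illustration (not part of this posting's UiPath stack), a minimal retry wrapper of the kind often reused across Python automation scripts might look like this:

```python
import time
from functools import wraps

def retry(attempts=3, delay=0.1, exceptions=(Exception,)):
    """Reusable retry wrapper for flaky automation steps (e.g. UI or API calls)."""
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            last_error = None
            for _ in range(attempts):
                try:
                    return func(*args, **kwargs)
                except exceptions as err:
                    last_error = err
                    time.sleep(delay)
            raise last_error
        return wrapper
    return decorator

# Demo: a step that fails twice before succeeding (invented for illustration).
calls = {"n": 0}

@retry(attempts=3, delay=0)
def flaky_step():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

result = flaky_step()
```

Packaging cross-cutting behaviour like retries as a decorator is one common way to make a component reusable across unrelated bots and scripts.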

Posted 6 hours ago

Apply

8.0 - 10.0 years

11 - 15 Lacs

Gurugram

Work from Office


Design, construct, and maintain scalable data management systems using Azure Databricks, ensuring they meet end-user expectations. Supervise the upkeep of existing data infrastructure workflows to ensure continuous service delivery. Create data processing pipelines utilizing Databricks Notebooks, Spark SQL, Python and other Databricks tools. Oversee and lead the module through planning, estimation, implementation, monitoring and tracking.

Desired Skills and Experience: 8+ years of experience in data engineering, with expertise in Azure Databricks, MSSQL, LakeFlow, Python and supporting Azure technologies. Design, build, test, and maintain highly scalable data management systems using Azure Databricks. Create data processing pipelines utilizing Databricks Notebooks and Spark SQL. Integrate Azure Databricks with other Azure services like Azure Data Lake Storage and Azure SQL Data Warehouse. Design and implement robust ETL pipelines using ADF and Databricks, ensuring data quality and integrity. Collaborate with data architects to implement effective data models and schemas within the Databricks environment. Develop and optimize PySpark/Python code for data processing tasks. Assist stakeholders with data-related technical issues and support their data infrastructure needs. Develop and maintain documentation for data pipeline architecture, development processes, and data governance. Data Warehousing: in-depth knowledge of data warehousing concepts, architecture, and implementation, including experience with various data warehouse platforms. Extremely strong organizational and analytical skills with strong attention to detail. Strong track record of excellent results delivered to internal and external clients. Excellent problem-solving skills, with the ability to work independently or as part of a team. Strong communication and interpersonal skills, with the ability to effectively engage with both technical and non-technical stakeholders.
Able to work independently without the need for close supervision, and collaboratively as part of cross-team efforts.

Key Responsibilities: Interpret business requirements, either gathered or acquired. Work with internal resources as well as application vendors. Design, develop, and maintain Databricks solutions and relevant data quality rules. Troubleshoot and resolve data-related issues. Configure and create data models and data quality rules to meet the needs of the customers. Hands-on experience handling multiple database platforms, like Microsoft SQL Server, Oracle, etc. Review and analyze data from multiple internal and external sources. Analyze existing PySpark/Python code and identify areas for optimization. Write new optimized SQL queries or Python scripts to improve performance and reduce run time. Identify opportunities for efficiencies and innovative approaches to completing the scope of work. Write clean, efficient, and well-documented code that adheres to best practices and Council IT coding standards. Maintain and operate existing custom code processes. Participate in team problem-solving efforts and offer ideas to solve client issues. Query-writing skills with the ability to understand and implement changes to SQL functions and stored procedures. Effectively communicate with business and technology partners, peers and stakeholders. Ability to deliver results under demanding timelines to real-world business problems. Ability to work independently and multi-task effectively. Configure system settings and options and execute unit/integration testing. Develop end-user release notes and training materials, and deliver training to a broad user base. Identify and communicate areas for improvement. Demonstrate high attention to detail, work in a dynamic environment whilst maintaining high quality standards, a natural aptitude for developing good internal working relationships, and a flexible work ethic.
Responsible for Quality Checks and adhering to the agreed Service Level Agreement (SLA) / Turn Around Time (TAT)
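The role above centres on data quality rules applied inside pipelines. On Databricks these would typically be expressed in PySpark or Spark SQL; the standard-library sketch below (column names and rows invented for illustration) just shows the shape of two common rules:

```python
def check_not_null(rows, column):
    """Data quality rule: return indices of rows where `column` is missing/None."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]

def check_unique(rows, column):
    """Data quality rule: return indices of rows whose key duplicates an earlier one."""
    seen, dupes = set(), []
    for i, row in enumerate(rows):
        value = row.get(column)
        if value in seen:
            dupes.append(i)
        seen.add(value)
    return dupes

# Invented sample batch: one null price, one duplicated id.
rows = [
    {"id": 1, "price": 10.5},
    {"id": 2, "price": None},
    {"id": 2, "price": 7.0},
]
null_violations = check_not_null(rows, "price")
dupe_violations = check_unique(rows, "id")
```

In practice each rule's violations would be logged or quarantined rather than silently dropped, so data quality is measurable over time.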

Posted 6 hours ago

Apply

10.0 - 15.0 years

11 - 15 Lacs

Gurugram

Work from Office


Design, construct, and maintain scalable data management systems using Azure Databricks, ensuring they meet end-user expectations. Supervise the upkeep of existing data infrastructure workflows to ensure continuous service delivery. Create data processing pipelines utilizing Databricks Notebooks, Spark SQL, Python and other Databricks tools. Oversee and lead the module through planning, estimation, implementation, monitoring and tracking.

Desired Skills and Experience: 10+ years of experience in software development using Python, PySpark and their frameworks. Proven experience as a Data Engineer with experience in the Azure cloud. Experience implementing solutions using Azure cloud services: Azure Data Factory, Azure Data Lake Storage Gen2, Azure databases, Azure Data Fabric, API gateway management, Azure Functions. Design, build, test, and maintain highly scalable data management systems using Azure Databricks. Strong SQL skills with RDBMS or NoSQL databases. Experience with developing APIs using FastAPI or similar frameworks in Python. Familiarity with the DevOps lifecycle (git, Jenkins, etc.) and CI/CD processes. Good understanding of ETL/ELT processes. Experience in the financial services industry, financial instruments, asset classes and market data is a plus. Assist stakeholders with data-related technical issues and support their data infrastructure needs. Develop and maintain documentation for data pipeline architecture, development processes, and data governance. Data Warehousing: in-depth knowledge of data warehousing concepts, architecture, and implementation, including experience with various data warehouse platforms. Extremely strong organizational and analytical skills with strong attention to detail. Strong track record of excellent results delivered to internal and external clients. Excellent problem-solving skills, with the ability to work independently or as part of a team.
Strong communication and interpersonal skills, with the ability to effectively engage with both technical and non-technical stakeholders. Able to work independently without the need for close supervision, and collaboratively as part of cross-team efforts.

Key responsibilities include: Interpret business requirements, either gathered or acquired. Work with internal resources as well as application vendors. Design, develop, and maintain Databricks solutions and relevant data quality rules. Troubleshoot and resolve data-related issues. Configure and create data models and data quality rules to meet the needs of the customers. Hands-on experience handling multiple database platforms, like Microsoft SQL Server, Oracle, etc. Review and analyze data from multiple internal and external sources. Analyze existing PySpark/Python code and identify areas for optimization. Write new optimized SQL queries or Python scripts to improve performance and reduce run time. Identify opportunities for efficiencies and innovative approaches to completing the scope of work. Write clean, efficient, and well-documented code that adheres to best practices and Council IT coding standards. Maintain and operate existing custom code processes. Participate in team problem-solving efforts and offer ideas to solve client issues. Query-writing skills with the ability to understand and implement changes to SQL functions and stored procedures. Effectively communicate with business and technology partners, peers and stakeholders. Ability to deliver results under demanding timelines to real-world business problems. Ability to work independently and multi-task effectively. Configure system settings and options and execute unit/integration testing. Develop end-user release notes and training materials, and deliver training to a broad user base. Identify and communicate areas for improvement.
Demonstrate high attention to detail, work in a dynamic environment whilst maintaining high quality standards, a natural aptitude for developing good internal working relationships, and a flexible work ethic. Responsible for quality checks and adhering to the agreed Service Level Agreement (SLA) / Turn Around Time (TAT).
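One recurring duty above is analysing existing PySpark/Python code and rewriting it to reduce run time. A classic instance of that kind of optimization, sketched here in plain Python with invented trade data, is replacing a nested row-wise scan with a hash-indexed join:

```python
# Slow pattern often found in legacy row-wise code: O(n*m) nested scan.
def enrich_slow(trades, securities):
    out = []
    for trade in trades:
        for sec in securities:
            if sec["id"] == trade["sec_id"]:
                out.append({**trade, "name": sec["name"]})
    return out

# Optimized: build a hash index once, then join in O(n + m).
def enrich_fast(trades, securities):
    index = {sec["id"]: sec["name"] for sec in securities}
    return [{**t, "name": index[t["sec_id"]]} for t in trades if t["sec_id"] in index]

# Invented sample data for illustration only.
securities = [{"id": "A", "name": "Bond A"}, {"id": "B", "name": "Bond B"}]
trades = [{"sec_id": "A", "qty": 100}, {"sec_id": "B", "qty": 50}]
```

The same idea carries over to Spark or SQL: replacing per-row lookups with a proper join lets the engine use hash or sort-merge strategies instead of repeated scans.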

Posted 6 hours ago

Apply

5.0 - 10.0 years

6 - 10 Lacs

Bengaluru

Work from Office


As a Senior DevOps Engineer with a strong background in Azure, you will join the Data & AI Solutions - Engineering team in our Healthcare R&D business. Your expertise will enhance cloud-based platforms in our D&A landscape using AWS Cloud Services and Azure AD, supporting our R&D efforts in drug discovery and development. You will bridge software development, quality assurance, and IT operations, ensuring our platform is reliable, scalable, and automated. Your expertise will help accelerate deployment cycles and minimize downtime. Key responsibilities include deploying new releases, maintaining Azure AD for identity management, and collaborating with cross-functional teams to advocate for DevOps best practices and guide architectural decisions. Join a multicultural team working in agile methodologies with high autonomy. The role requires office presence at our Bangalore location.

Who You Are: University degree in Computer Science, Engineering, or a related field. 5+ years of experience applying DevOps in solution development & delivery. Proficiency in Azure DevOps, including project configurations, repositories, pipelines, and environments. Proficiency in Azure AD, including app registrations and authentication flows (OBO, client credentials). Good understanding of AWS services and cloud system design. Strong experience in observability practices, including logging, monitoring, and tracing using tools like Prometheus, Grafana, the ELK Stack, or AWS-native solutions. Proficiency in Infrastructure as Code (IaC) using Terraform and AWS CloudFormation for automated and repeatable cloud infrastructure deployments. Experience with configuration management tools such as Ansible for automating service provisioning, configuration and management.
Knowledge of security best practices in DevOps, including secrets management, role-based access control (RBAC), and compliance frameworks. Strong scripting and automation skills, using Python to develop automations and integration workflows. Willingness to work in a multinational environment and cross-functional teams distributed between the US, Europe (mostly Germany) and India. Sense of accountability and ownership; fast learner. Fluency in English & excellent communication skills for technical and non-technical stakeholders.
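The posting asks for familiarity with Azure AD authentication flows such as client credentials. As a hedged sketch, the request that flow sends to the Microsoft identity platform v2.0 token endpoint can be assembled like this; the tenant, client ID and secret below are placeholders, and the request is only constructed, never sent:

```python
from urllib.parse import urlencode

def client_credentials_request(tenant_id, client_id, client_secret, scope):
    """Build (but do not send) an Azure AD client-credentials token request.
    All identifier values passed in are placeholders for illustration."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": scope,
    })
    return url, body

url, body = client_credentials_request(
    "contoso-tenant", "app-id", "app-secret", "https://graph.microsoft.com/.default"
)
```

In a real service the secret would come from a secrets manager rather than source code, and the returned access token would be cached until close to its expiry.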

Posted 6 hours ago

Apply

5.0 - 7.0 years

8 - 13 Lacs

Bengaluru

Work from Office


Job Purpose: Design and develop automation frameworks and scripts. Design and execute test cases / scenarios based on the requirements. Design and execute test scripts based on the requirements. Implement QA best practices.

Desired Skills and Experience: Candidates should have a B.E./B.Tech/MCA/MBA in Finance, Information Systems, Computer Science or a related field. 5+ years of experience in Quality Assurance (testing) working with client stakeholders. Significant experience performing testing in the financial services industry. Hands-on expertise in UI testing with Puppeteer. Strong experience in API testing using SOAPUI, Maven, and Jenkins in a CI/CD pipeline. Strong experience in designing and developing automated testing. Design and develop test automation frameworks. Understand, analyse and develop complex test data sets for automated testing. Configuring an Automated Test Environment (ATE). Participating in automated environment setup with an Integrated Development Environment (IDE). Identify and select the test cases for automation and/or create, enhance, debug and execute automation test cases. Deep understanding of FSI trading platforms and tools (e.g., Polaris) and fixed income products. Proven ability to establish QA processes and frameworks in environments with minimal existing structure. Excellent problem-solving, analytical, and communication skills. Experience working with agile methodology, Jira, Confluence, etc. Excellent communication skills, both written and verbal. Extremely strong organizational and analytical skills with strong attention to detail. Strong track record of excellent results delivered to internal and external clients. Able to work independently without the need for close supervision, and also collaboratively as part of cross-team efforts. Experience with delivering projects within an agile environment.

Key responsibilities include: Establish and implement comprehensive QA strategies and test plans from scratch.
Address immediate pain points in UI (Puppeteer) and API (SOAPUI, Maven, Jenkins) testing, including triage and framework improvement. Strong experience in SQL. Strong experience working with Python and its libraries like Pandas, NumPy, etc. Develop and execute test cases for both UI and API, with a focus on fixed income trading workflows. Drive the creation of regression test suites for critical back-office applications. Collaborate with development, business analysts, and project managers to ensure quality throughout the SDLC. Implement and utilize test management tools (e.g., X-Ray/JIRA). Provide clear and concise reporting on QA progress and metrics to management. Bring strong subject matter expertise in the financial services industry, particularly fixed income trading products and workflows. Ensure effective, efficient, and continuous communication (written and verbal) with global stakeholders. Independently troubleshoot difficult and complex issues in dev, test, UAT and production environments. Responsible for end-to-end delivery of projects, coordination between the client and internal offshore teams, and managing client queries. Demonstrate high attention to detail, work in a dynamic environment whilst maintaining high quality standards, a natural aptitude for developing good internal working relationships, and a flexible work ethic. Responsible for quality checks and adhering to the agreed Service Level Agreement (SLA) / Turn Around Time (TAT).
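API test cases of the kind described above usually boil down to asserting that each response matches an agreed contract. A minimal Python sketch of such a check, with a hypothetical fixed-income quote payload invented purely for illustration:

```python
def validate_response(payload, required_fields):
    """Minimal API-test style check: every required field is present with the
    expected type. Returns a list of human-readable violations."""
    errors = []
    for field, expected_type in required_fields.items():
        if field not in payload:
            errors.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected_type):
            errors.append(f"wrong type for {field}")
    return errors

# Hypothetical quote payload and schema, not a real vendor format.
payload = {"isin": "XS0000000000", "price": 101.25, "side": "BID"}
schema = {"isin": str, "price": float, "side": str}
errors = validate_response(payload, schema)
```

In a real suite each such check would be a pytest test case wired into the Jenkins pipeline, so contract regressions fail the build rather than reaching UAT.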

Posted 6 hours ago

Apply

5.0 - 10.0 years

14 - 18 Lacs

Bengaluru

Work from Office


As a Senior IT Systems & DevOps Engineer, you will be responsible for IT systems, incident, change, and release management. You will ensure the seamless deployment of new releases, maintaining system stability, security, and compliance in a regulated environment. You will collaborate with cross-functional teams, manage stakeholder communications, and drive automation and optimization across cloud-based platforms supporting drug discovery and development. Key Responsibilities: Oversee incident, change, and release management for IT systems, ensuring compliance with regulatory standards. Manage Azure AD (Entra ID) for identity and access management, including authentication flows for internal and third-party services. Implement and advocate DevOps best practices, including CI/CD, automation, and observability across platforms. Collaborate with cross-functional teams to influence architectural decisions and align infrastructure with business goals. Ensure compliance and security within a regulated (GxP) environment, implementing RBAC, secrets management, and monitoring frameworks. Design, develop, test, and document business requirements related to IT systems and infrastructure. Coordinate and perform system management tasks, ensuring alignment with quality and compliance standards. Autonomous review and clarification of business requirements, creation of technical designs, and stakeholder alignment. Manage and optimize cloud infrastructure (AWS & Azure), including cost management and performance tuning. Deploy and manage containerized applications using Docker, Kubernetes, Helm, and ArgoCD (GitOps). Implement Infrastructure as Code (IaC) using Terraform and AWS CloudFormation. Automate workflow and integration using Python and configuration management tools like Ansible. Ensure observability, including logging, monitoring, and tracing with tools like Prometheus, Grafana, ELK Stack, and AWS-native solutions. 
Participate in compliance activities, including audits, patch management, and cybersecurity initiatives. Provide technical guidance and support for IT systems, assisting users and resolving incidents efficiently. Your Skills & Experience: Must-Have: 5+ years of experience in IT systems management, DevOps, cloud infrastructure, and automation. Strong expertise in Change, Release, Incident, and Problem Management. Hands-on experience with Azure DevOps (project configurations, repositories, pipelines, environments). Strong knowledge of AWS services (VPC, IAM, STS, EKS, RDS, EC2, ECS, Route53, CloudWatch, CloudTrail, Secrets Manager, S3, API Gateway, Lambda, MWAA). Experience with Linux, Windows servers, and Oracle PL/SQL. Strong understanding of IT landscapes and a willingness to work on diverse IT systems. Hands-on experience with containerization and Kubernetes (Docker, Helm, ArgoCD). Proficiency in Python scripting for automation and workflow integration. Experience with Infrastructure as Code (IaC) using Terraform and AWS CloudFormation. Strong experience in observability, security, and compliance frameworks, including RBAC, secrets management, and monitoring tools. Global stakeholder management experience with excellent English communication skills (written & oral). Good to Have: Experience in a regulated industry (GxP, Pharma, Life Sciences). Familiarity with Agile ways of working. Knowledge of Next.js, Storybook, Tailwind, TypeScript for front-end development. Experience with PostgreSQL, OpenSearch/Elasticsearch for data management. Familiarity with R and SAS for data analysis and statistical modeling. Understanding of AWS billing practices and cost optimization tools. Why Join Us? Work in a high-impact role contributing to cutting-edge R&D in drug discovery and development. Be part of a multicultural, agile team with high autonomy in decision-making. Exposure to a diverse tech stack combining Azure, AWS, Kubernetes, Python, and CI/CD tools.
Opportunities for career growth and skill development in cloud computing, security, and automation. Work in a collaborative and innovative environment with global teams in the US, Europe, and India
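Secrets management in a regulated (GxP) environment, as described above, also means keeping secrets out of logs. One small, commonly used Python pattern is redacting secret-looking values before lines are emitted; the regex below is illustrative, not an exhaustive policy:

```python
import re

# Redact values of secret-looking key=value pairs before log lines leave the
# system. Pattern and sample line are invented for illustration only.
SECRET_PATTERN = re.compile(r"(password|token|secret)=\S+", re.IGNORECASE)

def redact(line):
    """Replace the value part of password=/token=/secret= pairs with ***."""
    return SECRET_PATTERN.sub(lambda m: m.group(0).split("=")[0] + "=***", line)

clean = redact("connecting with password=hunter2 to db")
```

In production this would typically be installed as a logging filter on the root logger, so every handler sees only redacted records.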

Posted 6 hours ago

Apply

3.0 - 7.0 years

4 - 8 Lacs

Bengaluru

Work from Office


As a member of the Data and Technology practice, you will be working on advanced AI/ML engagements tailored for the investment banking sector. This includes developing and maintaining data pipelines, ensuring data quality, and enabling data-driven insights. Your core responsibility will be to build and manage scalable data infrastructure that supports our proof-of-concept initiatives (POCs) and full-scale solutions for our clients. You will work closely with data scientists, DevOps engineers, and clients to understand their data requirements, translate them into technical tasks, and develop robust data solutions.

Your primary duties will encompass: Develop, optimize, and maintain scalable and reliable data pipelines using tools such as Python, SQL, and Spark. Integrate data from various sources including APIs, databases, and cloud storage solutions such as Azure, Snowflake, and Databricks. Implement data quality checks and ensure the accuracy and consistency of data. Manage and optimize data storage solutions, ensuring high performance and availability. Work closely with data scientists and DevOps engineers to ensure seamless integration of data pipelines and support machine learning model deployment. Monitor and optimize the performance of data workflows to handle large volumes of data efficiently. Create detailed documentation of data processes. Implement security best practices and ensure compliance with industry standards.

Experience / Skills: 5+ years of relevant experience, including: Experience in a data engineering role, preferably within the financial services industry. Strong experience with data pipeline tools and frameworks such as Python, SQL, and Spark. Proficiency in cloud platforms, particularly Azure, Snowflake, and Databricks. Experience with data integration from various sources including APIs and databases. Strong understanding of data warehousing concepts and practices. Excellent problem-solving skills and attention to detail.
Strong communication skills, both written and oral, with business and technical aptitude. Additional desired skills: Familiarity with big data technologies and frameworks. Experience with financial datasets and an understanding of investment banking metrics. Knowledge of visualization tools (e.g., Power BI). Education: Bachelor's or Master's in science or engineering disciplines such as Computer Science, Engineering, Mathematics, Physics, etc.
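The duties above combine integration from varied sources with data quality checks. A tiny extract/transform sketch in plain Python (tickers and prices invented; a real pipeline would read from an API or cloud store and run on Spark) shows one way bad rows are quarantined instead of failing a whole batch:

```python
def extract(raw_lines):
    """Parse CSV-like source rows; a stdlib stand-in for an API or cloud source."""
    return [dict(zip(("ticker", "close"), line.split(","))) for line in raw_lines]

def transform(rows):
    """Cast types and drop rows that fail a basic quality check."""
    out = []
    for row in rows:
        try:
            out.append({"ticker": row["ticker"], "close": float(row["close"])})
        except (KeyError, ValueError):
            continue  # quarantine bad rows instead of failing the whole batch
    return out

raw = ["AAPL,189.30", "MSFT,not-a-number", "GOOG,141.10"]
clean = transform(extract(raw))
```

In production the skipped rows would be written to a dead-letter location and counted, so data quality is visible in monitoring rather than silently lost.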

Posted 6 hours ago

Apply

4.0 - 6.0 years

2 - 6 Lacs

Gurugram

Work from Office


As a key member of the DTS team, you will primarily collaborate closely with a leading global hedge fund on data engagements. Partner with the data strategy and sourcing team on data requirements, working directly on the processes that develop the inputs to our models. Migrate processes from MATLAB to Databricks, moving to a more modern approach for updating them.

Desired Skills and Experience Essential skills: 4-6 years of experience with data analytics. Skilled in Databricks using SQL. Working knowledge of Snowflake and Python. Hands-on experience with large datasets and data structures using SQL. Experience working with financial and/or alternative data products. Excellent analytical and strong problem-solving skills. Exposure to S&P Capital IQ. Exposure to data models on Databricks. Education: B.E./B.Tech in Computer Science or a related field.

Key Responsibilities: Write data processes in Databricks using SQL. Develop ELT processes for data preparation. Apply SQL expertise to understand data sources and data structures. Document the developed data processes. Assist with related data tasks for model inputs within the Databricks environment. Take data from S&P Capital IQ, prep it, and get it ready for the model.

Key Metrics: SQL, Databricks, Snowflake, S&P Capital IQ, data structures. Behavioral Competencies: Good communication (verbal and written). Experience in managing client stakeholders.
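MATLAB-to-Databricks migrations like the one above often start with small numeric routines. As an invented example (not from the posting), a trailing moving average, a one-liner over a matrix in MATLAB, ports to Python roughly like this before being re-expressed as a windowed Spark SQL query:

```python
def moving_average(values, window):
    """Trailing moving average: partial windows at the start use the rows
    available so far. Function name and data are illustrative."""
    out = []
    for i in range(len(values)):
        start = max(0, i - window + 1)
        chunk = values[start:i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

smoothed = moving_average([2.0, 4.0, 6.0, 8.0], window=2)
```

On Databricks the same logic would typically become an `AVG(...) OVER (ORDER BY ... ROWS BETWEEN n PRECEDING AND CURRENT ROW)` window expression, which is easier to validate against the ported Python version.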

Posted 6 hours ago

Apply

4.0 - 6.0 years

6 - 11 Lacs

Gurugram

Work from Office


Job Purpose: As a key member of the DTS team, you will primarily collaborate closely with a leading global hedge fund on data engagements. Contribute to a variety of development initiatives focusing on cloud migration, automation, and application development, delivering scalable, efficient, and secure solutions and implementing DevOps best practices in multi-cloud environments, with a strong emphasis on Google Cloud Platform (GCP).

Desired Skills and Experience Essential skills: Bachelor's degree in Computer Science, Engineering, or a related field. 3+ years of experience in software development using C#, MSSQL, Python, and GCP/BigQuery. Strong problem-solving skills and attention to detail. Excellent communication and teamwork abilities. Experience in code reviews and maintaining code quality. Ability to mentor and guide junior developers.

Key Responsibilities: Design and Development: contribute to the design and development of innovative software solutions that meet business requirements. Application Development: develop and maintain applications using specified technologies such as C#, MSSQL, Python, and GCP/BigQuery. Code Reviews: participate in code reviews to ensure high-quality code and adherence to best practices. Troubleshooting: troubleshoot and resolve technical issues promptly to ensure smooth operation of applications. Collaboration: collaborate with cross-functional teams to integrate software solutions and achieve project goals. Mentorship: provide technical guidance and mentorship to junior team members, fostering their growth and development.

Key Metrics: C#, MSSQL, Python, GCP/BigQuery, exposure to DevOps. Behavioral Competencies: Good communication (verbal and written). Experience in managing client stakeholders.

Posted 6 hours ago

Apply

4.0 - 6.0 years

4 - 8 Lacs

Gurugram

Work from Office


As a key member of the DTS team, you will primarily collaborate closely with a leading global hedge fund on data engagements. Partner with the data strategy and sourcing team on data requirements to migrate scripts from MATLAB to Python. Also work on re-creating data visualizations using Tableau/Power BI.

Desired Skills and Experience Essential skills: 4-6 years of experience with data analytics. Skilled in Python, PySpark, and MATLAB. Working knowledge of Snowflake and SQL. Hands-on experience generating dashboards using Tableau/Power BI. Experience working with financial and/or alternative data products. Excellent analytical and strong problem-solving skills. Working knowledge of data science concepts, regression, statistics and the associated Python libraries. Interest in quantitative equity investing and data analysis. Familiarity with version control systems such as Git. Education: B.E./B.Tech in Computer Science or a related field.

Key Responsibilities: Re-write and enhance the existing analytics process and code, porting it from MATLAB to Python. Build a GUI to allow users to provide parameters for generating these reports. Store the data in Snowflake tables and write queries using PySpark to extract, manipulate, and upload data as needed. Re-create the existing dashboards in Tableau and Power BI. Collaborate with the firm's research and IT teams to ensure data quality and security. Engage with technical and non-technical clients as an SME on data asset offerings.

Key Metrics: Python, SQL, MATLAB, Snowflake, Pandas/PySpark, Tableau, Power BI, data science. Behavioral Competencies: Good communication (verbal and written). Experience in managing client stakeholders.
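Since the role pairs MATLAB-to-Python porting with regression knowledge, one representative (and invented) porting task is MATLAB's `polyfit(x, y, 1)`, which in Python usually becomes `numpy.polyfit` or, written out from the closed-form least-squares formulas:

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for y = slope*x + intercept,
    a dependency-free stand-in for MATLAB's polyfit(x, y, 1)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

# Invented data lying exactly on y = 2x + 1.
slope, intercept = fit_line([1, 2, 3, 4], [3, 5, 7, 9])
```

Validating the ported function against the original MATLAB output on the same inputs is the usual acceptance test for this kind of migration.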

Posted 6 hours ago


2.0 - 5.0 years

3 - 7 Lacs

Gurugram

Work from Office


We're seeking a skilled Software Engineer with expertise in C++ and Python; experience working with Large Language Models (LLMs) is a plus.

Desired Skills and Experience

Essential skills
  • Minimum of a bachelor's degree in a technical or quantitative field with a strong academic background
  • Demonstrated ability to implement data engineering pipelines and real-time applications in C++ (Python is a plus)
  • Proficiency with C++ tools such as the STL and object-oriented programming in C++ is a must
  • Experience with Linux/Unix shell scripting languages and Git is a must
  • Experience with Python tools such as Jupyter notebooks and coding standards such as PEP 8 is a plus
  • Strong problem-solving skills and understanding of data structures and algorithms
  • Experience with large-scale data processing and pipeline development
  • Understanding of various LLM frameworks and experience with prompt engineering using Python or other scripting languages

Nice to have
  • Knowledge of natural language processing (NLP) concepts; familiarity with integrating and leveraging LLM APIs for various applications

Key Responsibilities
  • Design, develop, and maintain projects using C++ and Python, along with operational support
  • Transform a wide range of structured and unstructured data into standardized outputs for quantitative analysis and financial engineering
  • Participate in code reviews, enforce coding standards, and contribute to the improvement of the codebase
  • Develop utility tools to further automate the software development, testing, and deployment workflow
  • Collaborate with internal and external cross-functional teams

Key Metrics
  • C++

Behavioral Competencies
  • Good communication (verbal and written), critical thinking, and attention to detail
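The "prompt engineering using Python" item above can, at its simplest, be ordinary string templating around whatever LLM API a project uses. A minimal sketch, assuming nothing about any specific provider (the template wording and field names are entirely hypothetical):

```python
# Hedged sketch: prompt engineering as constrained string templating.
# No real LLM API is assumed; the rendered prompt would be passed to
# whichever client library a project actually uses.

PROMPT_TEMPLATE = (
    "You are a data normalization assistant.\n"
    "Map the raw field name {raw!r} to one of: {allowed}.\n"
    "Answer with the field name only."
)

def build_prompt(raw_field, allowed_fields):
    """Render a constrained prompt for standardizing vendor field names."""
    return PROMPT_TEMPLATE.format(
        raw=raw_field,
        allowed=", ".join(allowed_fields),  # enumerate the valid answers
    )

prompt = build_prompt("px_last", ["close", "open", "volume"])
```

Constraining the answer space in the template (rather than post-processing free-form output) is a common way to make LLM responses easier to validate downstream.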

Posted 6 hours ago


2.0 - 4.0 years

2 - 6 Lacs

Gurugram

Work from Office


As a key member of the DTS team, you will collaborate closely with a leading global hedge fund on data engagements. You will partner with the data strategy and sourcing team on data requirements to design data pipelines and delivery structures.

Desired Skills and Experience

Essential skills
  • B.Tech/M.Tech/MCA with 2-4 years of overall experience
  • Skilled in Python and SQL
  • Experience with data modeling, data warehousing, and building data pipelines
  • Experience working with FTP, API, S3, and other distribution channels to source data
  • Experience working with financial and/or alternative data products
  • Experience working with cloud-native tools for data processing and distribution
  • Experience with Snowflake and Airflow

Key Responsibilities
  • Engage with vendors and technical teams to systematically ingest, evaluate, and create valuable data assets
  • Collaborate with the core engineering team to create central capabilities to process, manage, and distribute data assets at scale
  • Apply robust data quality rules to systematically validate data deliveries and guarantee the integrity of financial datasets
  • Engage with technical and non-technical clients as SME on data asset offerings

Key Metrics
  • Python, SQL, Snowflake
  • Data engineering and pipelines

Behavioral Competencies
  • Good communication (verbal and written)
  • Experience in managing client stakeholders
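The "data quality rules" responsibility above usually starts with simple, systematic checks applied to every delivery before it is published. A minimal sketch, assuming rows arrive as dicts (all names and the sample delivery are hypothetical):

```python
# Hedged sketch: rule-based data-quality validation for an ingested
# vendor delivery. Real pipelines would add type, range, and duplicate
# checks; this shows only a completeness rule.

def validate_rows(rows, required_fields):
    """Split rows into (valid, rejected) by a simple completeness rule."""
    valid, rejected = [], []
    for row in rows:
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            rejected.append({"row": row, "missing": missing})  # keep context
        else:
            valid.append(row)
    return valid, rejected

delivery = [
    {"ticker": "ABC", "date": "2024-01-02", "close": 101.5},
    {"ticker": "XYZ", "date": "", "close": 99.0},
]
valid, rejected = validate_rows(delivery, ["ticker", "date", "close"])
# valid holds 1 row; rejected holds 1 row flagged for its empty "date"
```

Recording *why* a row was rejected (not just dropping it) is what lets a team report delivery quality back to the vendor systematically.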

Posted 6 hours ago


6.0 - 8.0 years

12 - 17 Lacs

Gurugram

Work from Office


As a key member of the DTS team, you will collaborate closely with a leading global hedge fund on data engagements. You will partner with the data strategy and sourcing team on data requirements to design data pipelines and delivery structures.

Desired Skills and Experience

Essential skills
  • A bachelor's degree in computer science, engineering, mathematics, or statistics
  • 6-8 years of experience in a Data Engineering role, with a proven track record of delivering insightful, value-added dashboards
  • Experience writing advanced SQL queries and Python, and a deep understanding of relational databases
  • Experience working within an Azure environment
  • Experience with Tableau; Holland Mountain ATLAS is a plus
  • Experience with master data management and data governance is a plus
  • Ability to prioritize multiple projects simultaneously, problem solve, and think outside the box

Key Responsibilities
  • Develop, test, and release data packages for Tableau dashboards to support all business functions, including investments, investor relations, marketing, and operations
  • Support ad hoc requests, including writing queries and extracting data from a data warehouse
  • Assist with the management and maintenance of an Azure environment
  • Maintain a data dictionary, including documentation of database structures, ETL processes, and reporting dependencies

Key Metrics
  • Python, SQL
  • Data engineering, Azure, and ATLAS

Behavioral Competencies
  • Good communication (verbal and written)
  • Experience in managing client stakeholders
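The "write queries and extract data from a data warehouse" work above is typically aggregate reporting SQL. A hedged illustration using Python's built-in sqlite3 as a stand-in warehouse (the table, columns, and figures are invented for the example):

```python
# Hedged sketch: an aggregate reporting query of the kind a dashboard
# data package might use, run against an in-memory SQLite database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE investments (fund TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO investments VALUES (?, ?)",
    [("Fund A", 100.0), ("Fund A", 50.0), ("Fund B", 200.0)],
)

# Totals per fund, largest first -- the shape a dashboard extract takes.
report = conn.execute(
    "SELECT fund, SUM(amount) AS total "
    "FROM investments GROUP BY fund ORDER BY total DESC"
).fetchall()
# report -> [('Fund B', 200.0), ('Fund A', 150.0)]
```

Against a real warehouse the connection and dialect change (e.g. an Azure SQL driver), but the query pattern is the same.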

Posted 6 hours ago


5.0 - 8.0 years

3 - 7 Lacs

Bengaluru

Work from Office


Key Responsibilities
  • Design, develop, and implement Robotic Process Automation (RPA) solutions using the ServiceNow platform
  • Collaborate with business analysts and stakeholders to understand business requirements and translate them into technical specifications
  • Develop and configure ServiceNow workflows, scripts, and integrations to automate business processes
  • Perform testing, debugging, and troubleshooting of RPA solutions to ensure optimal performance and reliability
  • Maintain and update existing RPA solutions to adapt to changing business needs and requirements
  • Provide technical support and training to end users and other team members
  • Document RPA solutions, including design specifications, test plans, and user guides
  • Stay up to date with the latest RPA and ServiceNow trends, technologies, and best practices

Candidate Profile
  • Bachelor's degree in Computer Science, Information Technology, or a related field
  • Proven experience as an RPA Developer, with a focus on the ServiceNow platform
  • Strong knowledge of ServiceNow development, including workflows, scripting, and integrations
  • Proficiency in programming languages such as JavaScript, Python, or similar
  • Experience with RPA tools and technologies (e.g., UiPath, Automation Anywhere, Blue Prism) is a plus
  • Excellent problem-solving skills and attention to detail
  • Strong communication and collaboration skills
  • Ability to work independently and as part of a team

Specific Skills
  • ServiceNow Certified System Administrator (CSA) or ServiceNow Certified Application Developer (CAD) certification
  • Experience with Agile/Scrum methodologies

Posted 6 hours ago


Exploring Python Jobs in India

Python has become one of the most popular programming languages in India, with a high demand for skilled professionals across various industries. Job seekers in India have a plethora of opportunities in the field of Python development. Let's delve into the key aspects of the Python job market in India:

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

Average Salary Range

The average salary range for Python professionals in India varies based on experience levels. Entry-level positions can expect a salary between INR 3-6 lakhs per annum, while experienced professionals can earn between INR 8-20 lakhs per annum.

Career Path

In the field of Python development, a typical career path may include roles such as Junior Developer, Developer, Senior Developer, Team Lead, and eventually progressing to roles like Tech Lead or Architect.

Related Skills

In addition to Python proficiency, employers often expect professionals to have skills in areas such as:

  • Data Structures and Algorithms
  • Object-Oriented Programming
  • Web development frameworks (e.g., Django, Flask)
  • Database management (e.g., SQL, NoSQL)
  • Version control systems (e.g., Git)

Interview Questions

  • What is the difference between list and tuple in Python? (basic)
  • Explain the concept of list comprehensions in Python. (basic)
  • What are decorators in Python? (medium)
  • How does memory management work in Python? (medium)
  • Differentiate between __str__ and __repr__ methods in Python. (medium)
  • Explain the Global Interpreter Lock (GIL) in Python. (advanced)
  • How can you handle exceptions in Python? (basic)
  • What is the purpose of the __init__ method in Python? (basic)
  • What is a lambda function in Python? (basic)
  • Explain the use of generators in Python. (medium)
  • What are the different data types available in Python? (basic)
  • Write a Python code to reverse a string. (basic)
  • How would you implement multithreading in Python? (medium)
  • Explain the concept of PEP 8 in Python. (basic)
  • What is the difference between append() and extend() methods in Python lists? (basic)
  • How do you handle circular references in Python? (medium)
  • Explain the use of virtual environments in Python. (basic)
  • Write a Python code to find the factorial of a number using recursion. (medium)
  • What is the purpose of __name__ variable in Python? (medium)
  • How can you create a virtual environment in Python? (basic)
  • Explain the concept of pickling and unpickling in Python. (medium)
  • What is the purpose of the pass statement in Python? (basic)
  • How do you debug a Python program? (medium)
  • Explain the concept of namespaces in Python. (medium)
  • What are the different ways to handle file input and output operations in Python? (medium)
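Two of the questions above (list vs. tuple, and decorators) lend themselves to short worked answers. A hedged sketch using only the standard library; the example function names are our own:

```python
# 1) list vs tuple: lists are mutable, tuples are not (and a tuple of
#    hashable elements is itself hashable, so it can be a dict key).
items = [1, 2, 3]
items.append(4)          # fine: lists support in-place modification
point = (1, 2)
try:
    point[0] = 9         # tuples reject item assignment
    mutable = True
except TypeError:
    mutable = False      # reached: tuples are immutable

# 2) decorators: a callable that wraps another callable.
import functools

def log_calls(func):
    @functools.wraps(func)       # preserve the wrapped function's metadata
    def wrapper(*args, **kwargs):
        wrapper.calls += 1       # count invocations as a side effect
        return func(*args, **kwargs)
    wrapper.calls = 0
    return wrapper

@log_calls
def add(a, b):
    return a + b

result = add(2, 3)        # -> 5; add.calls is now 1
```

In an interview, mentioning `functools.wraps` (so the wrapped function keeps its `__name__` and docstring) usually distinguishes a practiced answer from a memorized one.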

Closing Remark

As you explore Python job opportunities in India, remember to brush up on your skills, prepare for interviews diligently, and apply confidently. The demand for Python professionals is on the rise, and this could be your stepping stone to a rewarding career in the tech industry. Good luck!
