Home
Jobs

585 Pandas Jobs - Page 19

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

3.0 - 7.0 years

2 - 6 Lacs

Bengaluru

Work from Office

Naukri logo

Drive projects and initiatives to improve and expand existing DLP platform capabilities. Implement and manage DLP solutions, including data labeling technologies. Work directly with stakeholders to define requirements and develop security solutions. Ensure the security of endpoint devices, including data protection and behavior analysis. Deploy and manage endpoint DLP solutions, such as those offered by Proofpoint. Manage email security policies and configurations, including DLP for email. Detect and prevent data loss through email channels, such as by identifying sensitive content and user behavior. Gain deep knowledge of the Proofpoint platform, including its various modules (e.g., Endpoint DLP, Email DLP). Be proficient in using the Proofpoint console to manage alerts, investigate incidents, and analyze user behavior.
Primary Skills: DLP, Endpoint Security, Email Security
Secondary Skills: Data Labeling, Behavior Analysis, Incident Response

Posted 3 weeks ago

Apply

7.0 - 9.0 years

7 - 11 Lacs

Kochi, Thiruvananthapuram

Work from Office

Naukri logo

Role Summary: We are looking for an experienced Python Lead Developer with over 7 years of Python programming experience, strong skills in Pandas for data manipulation, and expertise in Celery for distributed task management. This position is ideal for someone who enjoys leading development projects, mentoring teams, and ensuring high standards in software quality and performance.
Key Responsibilities: Lead Python development projects, ensuring code quality and efficiency. Guide and mentor team members, conduct code reviews, and foster collaboration. Utilize Pandas for data analysis and transformation tasks. Implement and maintain asynchronous tasks using Celery. Optimize application performance and scalability.
Required Skills: Core: Proficient in Python, Pandas, and Celery. Additional: Knowledge of relational databases, RESTful API integration, and Git. Soft Skills: Strong leadership, problem-solving, and communication abilities.
Keywords: Pandas, Celery, RESTful API integration, Git, data analysis, Python
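For context on the Pandas-plus-Celery combination this listing centres on, here is a minimal illustrative sketch; the broker URL, task name, and column names are hypothetical and not taken from the posting.

```python
# Minimal sketch of a Pandas aggregation run as a Celery task.
# Broker URL, task name, and column names are hypothetical placeholders.
import pandas as pd
from celery import Celery

app = Celery("jobs", broker="redis://localhost:6379/0")

@app.task
def summarize_orders(csv_path: str) -> dict:
    """Load a CSV and return per-region revenue totals."""
    df = pd.read_csv(csv_path)
    totals = df.groupby("region")["revenue"].sum()
    return totals.to_dict()
```

A worker would pick this up asynchronously via `summarize_orders.delay("orders.csv")`, keeping heavy data work off the request path.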

Posted 3 weeks ago

Apply

5.0 - 10.0 years

14 - 24 Lacs

Hyderabad, Bengaluru

Hybrid

Naukri logo

Experience Range: 5 - 10+ years. Work Location: Bangalore. Must Have Skills: Pandas, hypothesis testing, A/B testing, feature engineering, statistical analysis, Machine Learning, NumPy, Python, SQL. Good To Have Skills: Databricks.
Job Description: Develop, implement, and optimize machine learning models for predictive analytics and decision-making. Work with structured and unstructured data to extract meaningful insights and patterns. Utilize Python and standard data science libraries such as NumPy, Pandas, SciPy, Scikit-Learn, TensorFlow, PyTorch, and Matplotlib for data analysis and model building. Design and develop data pipelines for efficient processing and analysis. Conduct exploratory data analysis (EDA) to identify trends and anomalies. Collaborate with cross-functional teams to integrate data-driven solutions into business strategies. Use data visualization and storytelling techniques to communicate complex findings to non-technical stakeholders. Stay updated with the latest advancements in machine learning and AI technologies.
Required Qualifications: 5 years of hands-on experience in data science and machine learning. Strong proficiency in Python and relevant data science packages. Experience with machine learning frameworks such as TensorFlow, Keras, or PyTorch. Knowledge of SQL and database management for data extraction and manipulation. Expertise in statistical analysis, hypothesis testing, and feature engineering. Understanding of Marketing Mix Modeling is a plus. Experience with data visualization packages such as Matplotlib, Seaborn, or Plotly. Strong problem-solving skills and the ability to work with large datasets. Excellent communication skills with a knack for storytelling using data.
Preferred Qualifications: Experience with cloud platforms such as AWS, Azure, or GCP. Knowledge of big data technologies like Hadoop, Spark, or Databricks. Exposure to NLP, computer vision, or deep learning techniques. Understanding of A/B testing and experimental design.
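As an illustration of the hypothesis-testing and A/B-testing skills listed above, here is a small hedged sketch on synthetic conversion data; the group labels and conversion rates are invented.

```python
# Toy A/B test: compare conversion rates of two groups with a Welch t-test.
# All data below is synthetic; column names are hypothetical.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(42)
df = pd.DataFrame({
    "group": ["A"] * 1000 + ["B"] * 1000,
    "converted": np.concatenate([
        rng.binomial(1, 0.10, 1000),   # control: ~10% conversion
        rng.binomial(1, 0.12, 1000),   # variant: ~12% conversion
    ]),
})

a = df.loc[df["group"] == "A", "converted"]
b = df.loc[df["group"] == "B", "converted"]

# Two-sample t-test, not assuming equal variances
t_stat, p_value = stats.ttest_ind(a, b, equal_var=False)
print(f"A: {a.mean():.3f}  B: {b.mean():.3f}  p-value: {p_value:.4f}")
```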

Posted 3 weeks ago

Apply

7.0 - 12.0 years

20 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Naukri logo

Role & Responsibilities: We are seeking a skilled Python Developer to join our dynamic team. The ideal candidate will be responsible for developing, maintaining, and improving backend services and applications, ensuring high performance and responsiveness to requests from the front end. If you have a passion for problem-solving, clean code, and scalable applications, we want to hear from you!
Key Responsibilities: Develop, test, and maintain robust and scalable Python-based applications. Collaborate with front-end developers, product managers, and other stakeholders to integrate user-facing elements with server-side logic. Build and maintain APIs, services, and data pipelines. Write clean, reusable, and efficient code following best practices. Participate in code reviews and provide constructive feedback. Troubleshoot and debug applications, optimizing performance. Integrate data storage solutions (SQL and NoSQL databases). Stay up to date with emerging trends and technologies in software development.
Required Skills & Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent practical experience). Proficient in Python 3.x with a strong understanding of Pythonic principles. Experience with Python frameworks like Django, Flask, or FastAPI. Strong knowledge of RESTful API design and development. Good knowledge of Python, DS, Pandas, scikit-learn, NumPy, TensorFlow, Keras, etc.; AI/ML experience is mandatory. Experience working with relational databases (e.g., PostgreSQL, MySQL) and/or NoSQL databases (e.g., MongoDB). Familiarity with version control tools like Git. Good understanding of software testing, debugging, and performance optimization. Solid problem-solving skills and attention to detail.
Preferred (Bonus) Skills: Experience with Docker, Kubernetes, or other containerization technologies. Familiarity with cloud platforms (AWS, Azure, GCP). Knowledge of asynchronous programming (e.g., asyncio). Understanding of CI/CD pipelines. Experience in data science, machine learning, or DevOps is a plus.
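To illustrate the kind of Python backend endpoint this posting describes, a minimal sketch; FastAPI and Pandas are named in the listing, while the route, data file, and columns are hypothetical.

```python
# Minimal FastAPI endpoint that serves descriptive statistics from a CSV.
# The file path and column names are hypothetical placeholders.
import pandas as pd
from fastapi import FastAPI

app = FastAPI()

@app.get("/stats/{column}")
def column_stats(column: str) -> dict:
    """Return basic descriptive statistics for one column of a CSV dataset."""
    df = pd.read_csv("data/sales.csv")  # hypothetical data source
    if column not in df.columns:
        return {"error": f"unknown column '{column}'"}
    series = df[column]
    return {
        "count": int(series.count()),
        "mean": float(series.mean()),
        "min": float(series.min()),
        "max": float(series.max()),
    }
```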

Posted 3 weeks ago

Apply

2.0 - 5.0 years

7 - 15 Lacs

Hyderabad

Work from Office

Naukri logo

We are looking for a Data Engineer with a tech stack of Python, Pandas, Postgres, Java, and Apache Flink. Must have experience with Python and Java in data ingestion pipelines built on Apache Flink.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Gurugram

Work from Office

Naukri logo

Department: ISS DELIVERY - DEVELOPMENT - GURGAON, Level 3
About your team: The Investment Solutions Services (ISS) delivery team provides systems development, implementation, and support services for FIL's global Investment Management businesses across the asset management lifecycle. We support Fund Managers, Research Analysts, Traders, and Investment Services Operations in all of FIL's international locations, including London, Hong Kong, and Tokyo.
About your role: You will join as a Senior Test Analyst in the QA chapter and be responsible for executing testing activities for all applications under IM Technology based out of India. Here is what a typical day in the job will look like: Understand business needs and analyse requirements and user stories to carry out different testing activities. Collaborate with developers and BAs to understand new features, bug fixes, and changes in the codebase. Create and execute functional as well as automated test cases on different test environments to validate functionality. Log defects in the defect tracker and work with PMs and developers to prioritise and resolve them. Develop and maintain automation scripts, preferably using the Python stack. Apply a deep understanding of databases, both relational and non-relational. Document test cases, results, and any other issues encountered during testing. Attend team meetings and stand-ups to discuss progress, risks, and any issues that affect project deliveries. Stay updated with new tools, techniques, and industry trends.
About you: Seasoned Software Test Analyst with more than 5 years of hands-on experience. Hands-on experience in web and backend automation using open-source tools (Playwright, pytest, Selenium, requests, Rest Assured, NumPy, Pandas). Proficiency in writing and understanding complex DB queries in various databases (Oracle, Snowflake). Good understanding of cloud (AWS, Azure). Finance/investment domain experience is preferable. Strong logical reasoning and problem-solving skills. Preferred programming languages: Python and Java. Familiarity with CI/CD tools (e.g., Jenkins, GitLab CI) for automating deployment and testing workflows.
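As a rough illustration of the pytest-plus-Pandas automation this role mentions, a minimal sketch; the API URL and the expected schema are hypothetical, not taken from the posting.

```python
# Toy pytest checks that pull API data into a DataFrame and validate it.
# BASE_URL, endpoint, and column names are hypothetical placeholders.
import pandas as pd
import requests

BASE_URL = "https://example.test/api"  # hypothetical test environment

def fetch_positions() -> pd.DataFrame:
    resp = requests.get(f"{BASE_URL}/positions", timeout=10)
    resp.raise_for_status()
    return pd.DataFrame(resp.json())

def test_positions_schema():
    df = fetch_positions()
    # Every position row should carry these columns with no missing values
    for col in ("fund_id", "quantity", "price"):
        assert col in df.columns
        assert df[col].notna().all()

def test_no_negative_quantities():
    df = fetch_positions()
    assert (df["quantity"] >= 0).all()
```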

Posted 3 weeks ago

Apply

0.0 - 1.0 years

1 - 2 Lacs

Bengaluru

Remote

Naukri logo

Are you an exceptionally skilled and highly motivated individual with a deep passion for quantitative finance, data science, end-to-end automation, cutting-edge AI tools, and creating impactful educational content ? Do you possess an unparalleled command of Python, advanced mathematics, statistical modeling, and the ability to effectively communicate complex concepts for online learning? If so, we have an extraordinary opportunity for you! We are seeking a Highly Paid Quant Analyst Intern for a demanding yet incredibly rewarding 6-month remote internship in India. This role is designed for ambitious individuals who are ready to dive deep into real-world quantitative challenges, drive efficiency through comprehensive automation, leverage the power of new AI tools (like Claude, ChatGPT, Perplexity, Gemini), and crucially, develop high-quality quantitative education content for our online platforms. Key Responsibilities: Quantitative Research & Model Development: Apply your expertise to research, develop, and refine quantitative models, trading strategies, and analytical tools. Quantitative Education Content Creation: Design, develop, and refine engaging and accurate quantitative education content (e.g., lessons, exercises, case studies, coding tutorials) for Gotraddy's online courses and training programs. Data Science & Analytics: Perform in-depth analysis of financial datasets, identify patterns, and extract actionable insights to inform and enhance our course materials and practical exercises. End-to-End Workflow Automation: Design, build, and maintain robust automated pipelines and tools for data processing, model testing, and the creation of interactive learning simulations and demonstrations, leveraging tools like n8n and Make . AI Tool Integration & Prompt Engineering: Proactively explore and effectively utilize new AI tools (e.g., Claude, ChatGPT, Perplexity, Gemini ) through advanced prompt engineering to assist in research, content generation, problem-solving, and efficiency improvements for educational material. Mathematical & Statistical Application: Leverage your understanding of advanced mathematical concepts (e.g., probability, stochastic calculus, optimization) to create clear, practical examples and solve complex problems for educational purposes. Content Validation & Improvement: Rigorously test and validate existing and new quantitative content, ensuring accuracy, relevance, pedagogical effectiveness, and a seamless online learning experience. Collaboration & Innovation: Work closely with our experienced educators and content creators to translate complex quantitative concepts into accessible and engaging learning experiences. What We're Looking For: Education: Currently pursuing or recently completed a Bachelor's, Master's, or Ph.D. in a highly quantitative field such as Quantitative Finance, Mathematics, Statistics, Computer Science, Data Science, or a related discipline. Exceptional Python Skills: Expert-level proficiency in Python, including extensive experience with libraries like NumPy, Pandas, SciPy, Scikit-learn , and ideally, specialized quantitative finance libraries. Strong Data Science Acumen: Proven ability in data manipulation, statistical analysis, machine learning (regression, classification, time series), and data visualization. Solid Mathematical Foundation: Deep understanding of linear algebra, multivariate calculus, probability theory, stochastic processes, and numerical methods. 
Automation Prowess: Demonstrated experience in automating complex workflows and processes using Python scripting and/or workflow automation tools like n8n or Make. Proficiency in Prompt Engineering: Demonstrated ability to effectively use and extract valuable insights from large language models and other AI tools (e.g., Claude, ChatGPT, Perplexity, Gemini). Content Creation Aptitude: Strong ability to articulate complex quantitative concepts clearly, concisely, and engagingly for an online learning audience. Experience with educational content development is a significant plus. Highly Motivated: A proactive, self-starter attitude with a strong desire to learn, contribute, and elevate educational content. Problem-Solving: Excellent analytical and problem-solving skills, with a keen eye for detail. Communication: Strong verbal and written communication skills. Location: Ability to work remotely from India. Desired Skills: While not strictly mandatory, candidates possessing the following skills will be highly regarded: Advanced Python Development: Experience with more complex Python frameworks or building robust applications. In-depth Data Science Techniques: Exposure to advanced topics like deep learning for financial applications, causal inference, or advanced time series analysis. Mathematical Sophistication: Knowledge of advanced optimization, numerical methods, or stochastic calculus applied to finance. AI Tool Power User: Proven track record of leveraging AI tools (Claude, ChatGPT, Perplexity, Gemini) for complex problem-solving, code generation, or sophisticated content ideation. Workflow Automation Mastery: Experience in designing and implementing complex, multi-step automated workflows using tools like n8n or Make for diverse applications. Educational Content Design: Prior experience in designing curricula, writing lessons, or creating interactive learning modules for quantitative subjects. Creative Writing & Pedagogy: Ability to distill complex technical information into clear, engaging, and creatively presented educational materials suitable for diverse learning styles. What We Offer: Exceptional Compensation: This is a highly competitive and very well-paid internship , acknowledging your top-tier skills and potential contribution to our firm. Impactful Contribution: Your work will directly enhance the learning experience for aspiring quantitative professionals globally, shaping the future of quantitative education. Deep Learning & Growth: An unparalleled opportunity to deepen your quantitative skills and master the integration of cutting-edge AI, workflow automation, and educational content creation. Mentorship: Direct mentorship from seasoned quantitative professionals and educators who are also exploring AI and automation frontiers. Dynamic Remote Environment: A collaborative, intellectually stimulating, and fast-paced remote work environment that fosters innovation in quantitative education, AI application, and content development. Flexibility: The convenience of a remote role, allowing you to work from anywhere in India. Career Pathway: Strong potential for a full-time conversion offer upon successful completion of the internship, contributing to our core education and research initiatives. Internship Duration: 6 Months Location: Remote (India)
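For illustration of the Python/Pandas/NumPy quant work described above, a toy sketch of a moving-average crossover backtest on synthetic prices; it is not a strategy from the posting.

```python
# Toy backtest: go long when the fast moving average is above the slow one.
# Prices are synthetic; the windows and strategy are illustrative only.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
prices = pd.Series(100 * np.exp(np.cumsum(rng.normal(0, 0.01, 500))))

fast = prices.rolling(10).mean()
slow = prices.rolling(50).mean()
position = (fast > slow).astype(int).shift(1).fillna(0)  # trade on yesterday's signal

returns = prices.pct_change().fillna(0)
strategy_returns = position * returns
print("Cumulative return:", (1 + strategy_returns).prod() - 1)
```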

Posted 3 weeks ago

Apply

0.0 - 1.0 years

1 - 4 Lacs

Bengaluru

Work from Office

Naukri logo

# Only apply if you have completed a Master's in Statistics # 0 to 1 years of experience
About the Company: izmocars (www.izmoltd.com / www.izmocars.com) is a leading Interactive Media and Internet Solutions company with a strong presence in the Automotive industry. With Interactive Media studios in Long Beach, CA & Brussels, Europe, we are the largest producers of automotive content in the world. Our products include Interactive Media Solutions for Automotive, Web Platform, CRM, Online Marketing and Virtual Reality Platform for the Automotive industry.
About FrogData: FrogData, a leading AI-driven analytics platform, provides deep insights and decision intelligence to automotive dealerships. Our solutions leverage advanced statistical models, machine learning, and big data analytics to optimize business performance. We are looking for passionate MSc Statistics freshers who can contribute to our data science and analytics team.
What You Will Do: Work with large datasets to identify trends, patterns, and insights relevant to automotive dealerships. Develop statistical models, forecasting techniques, and predictive analytics for various business functions. Apply regression analysis, hypothesis testing, time series analysis, and other statistical techniques to solve business problems. Assist in data preprocessing, cleaning, and transformation to ensure high-quality analytics. Generate reports and dashboards using statistical tools and visualization platforms like Python.
Required Qualifications & Skills: MSc in Statistics from a Tier 1 / reputed university. Strong foundation in statistical concepts, probability, and data analysis techniques. Knowledge of R and Python (Pandas, NumPy, Scikit-learn, statsmodels) for data manipulation and statistical modelling. Familiarity with SQL for querying and managing databases. Basic understanding of machine learning concepts is a plus. Strong analytical and problem-solving skills with keen attention to detail. Good communication skills to present data-driven insights effectively.
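As a small illustration of the regression and hypothesis-testing work described above, a hedged sketch using statsmodels on synthetic data; the variable names are invented.

```python
# Ordinary least squares on synthetic data: sales ~ ad_spend.
# The data and variable names are invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
df = pd.DataFrame({"ad_spend": rng.uniform(10, 100, 200)})
df["sales"] = 5 + 0.8 * df["ad_spend"] + rng.normal(0, 5, 200)

X = sm.add_constant(df[["ad_spend"]])       # adds an intercept column
model = sm.OLS(df["sales"], X).fit()

print(model.params)    # estimated intercept and slope
print(model.pvalues)   # p-values for the hypothesis that each coefficient is zero
```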

Posted 3 weeks ago

Apply

5.0 - 10.0 years

15 - 22 Lacs

Chennai

Work from Office

Naukri logo

- RAG pipeline architectures
- Fine/prompt/instruction tuning of LLMs
- Machine learning frameworks (like Keras or PyTorch) and libraries (like scikit-learn)
- Data wrangling, data cleaning, data preprocessing, and data

Posted 3 weeks ago

Apply

3.0 - 5.0 years

10 - 20 Lacs

Bengaluru

Remote

Naukri logo

Role & Responsibilities: We are seeking a highly skilled QA Automation Engineer with 3-5 years of experience in software development, SDET, or QA automation, with a strong focus on backend systems, APIs, or complex data pipelines. This role demands deep expertise in Python programming and a strong understanding of automation tools, CI/CD processes, and API testing, particularly within evolving AI-driven environments. Design, implement, and maintain robust automation frameworks using Python (Pytest or similar). Conduct thorough testing of backend systems, data pipelines, and RESTful APIs. Build test cases and scripts to support automated validation of AI-integrated services. Use tools like Postman or the requests library for API testing. Manage source control with Git and integrate tests into CI/CD pipelines (Jenkins, GitLab CI, GitHub Actions). Analyze and document defects with clarity, and collaborate with developers for resolution. Continuously adapt and learn new testing approaches for modern AI/ML-powered systems.
Must-Have Skills: 3-5 years of experience in software development, QA automation, or SDET roles. Strong hands-on programming skills in Python. Proven experience with automation frameworks like Pytest. Solid experience in REST API testing and tools like Postman or requests. Familiarity with CI/CD tools such as Jenkins, GitLab CI, or GitHub Actions. Excellent communication skills with the ability to articulate technical concepts clearly. Fast learner with the ability to adapt quickly to evolving AI/ML technologies.
Preferred Qualifications: Experience with cloud platforms (AWS, GCP, Azure), especially AI/ML services. Exposure to testing AI-based features or systems, especially in the telecom domain. Knowledge of UI automation tools like Selenium or Playwright (nice to have). Familiarity with structured testing of AI service pipelines and performance benchmarking.
What We Offer: Remote work flexibility. Competitive compensation based on skill and experience. Opportunity to work on cutting-edge AI-powered platforms. A collaborative, learning-first culture with global project exposure.
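To illustrate the Pytest and requests-based API testing named in this listing, a minimal sketch; the base URL, endpoints, and response fields are hypothetical.

```python
# Minimal Pytest + requests API checks against a hypothetical service.
import pytest
import requests

BASE_URL = "https://api.example.test"  # hypothetical service under test

@pytest.fixture
def session():
    # Reuse one HTTP session across tests
    with requests.Session() as s:
        s.headers.update({"Accept": "application/json"})
        yield s

def test_health_endpoint(session):
    resp = session.get(f"{BASE_URL}/health", timeout=5)
    assert resp.status_code == 200
    assert resp.json().get("status") == "ok"

@pytest.mark.parametrize("item_id", [1, 2, 3])
def test_get_item_returns_expected_fields(session, item_id):
    resp = session.get(f"{BASE_URL}/items/{item_id}", timeout=5)
    assert resp.status_code == 200
    body = resp.json()
    assert {"id", "name"} <= body.keys()
```

Tests like these slot directly into the CI/CD pipelines (Jenkins, GitLab CI, GitHub Actions) the posting mentions.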

Posted 3 weeks ago

Apply

2.0 - 4.0 years

3 - 6 Lacs

Surat, Gujarat

Work from Office

Naukri logo

Job Summary: Our team is in search of a Senior Python Developer with a strong focus on Django to lead our application development efforts. The ideal candidate will have a track record of building and optimizing high-scale applications and will take a proactive role in improving the performance and robustness of our systems. Key Responsibilities: - Design, build, and maintain efficient, reusable, and reliable Python code with a focus on Django framework. - Lead the development of server-side logic, ensuring high performance and responsiveness to requests from the front-end. - Integrate user-facing elements with server-side logic developed by front-end developers with strong knowledge of JavaScript frameworks. - Work with data storage solutions, including databases, key-value stores, blob stores, etc. - Provide technical leadership to troubleshoot, debug, and upgrade existing systems. - Collaborate with internal teams to identify system requirements, design architecture, and propose solutions. - Maintain code quality and organization, ensuring scalability and security of the application. - Mentor junior developers and encourage the adoption of software development best practices. Qualifications: - Minimum of 3 years of experience working with Python and the Django framework. - Solid understanding of the Django ORM and Django Rest Framework. - Experience with front-end technologies, JavaScript, and frameworks like React, Angular, or Vue.js is a plus. - Proficient understanding of Git version control tool. - Strong unit test and debugging skills.

Posted 3 weeks ago

Apply

6.0 - 11.0 years

5 - 9 Lacs

Gurugram

Work from Office

Naukri logo

POSITION SUMMARY: We are looking for a highly skilled Python Backend Developer with 6-12 years of backend development experience. The candidate will drive projects independently while ensuring high code quality and efficiency. The role requires expertise in Python frameworks (Django, Flask), cloud platforms (AWS), and database management (Snowflake), with a strong emphasis on software best practices, problem-solving, and stakeholder collaboration.
EXPERIENCE AND REQUIRED SKILL SETS:
- 6-12 years of backend development experience with Python.
- Understanding of cloud platforms, particularly AWS.
- Proficiency in using Snowflake for database management and optimization.
- Experience working with data-intensive applications.
- Demonstrated ability to build dynamic and static reports using Python libraries such as Pandas, Matplotlib, or Plotly.
- Strong understanding of RESTful APIs and microservices architecture.
- Proficiency with Python frameworks like Django, Flask, or Tornado, including basic skills required to develop and maintain applications using these frameworks.
- Knowledge of both relational and non-relational databases.
- Proficiency with version control systems, especially Git.
- Backend Development: Design, develop, and maintain scalable and resilient backend services using Python, ensuring optimal performance and reliability.
- Data-Intensive Applications: Develop and manage data-intensive applications, ensuring efficient data processing and handling.
- Report Generation: Create dynamic and static reports utilizing common Python libraries (e.g., Pandas, Matplotlib, Plotly) to deliver actionable insights.
- Python Frameworks: Utilize frameworks such as Django, Flask, or Tornado to build and maintain robust backend systems, ensuring best practices in application architecture.
- Cloud Platforms: Deploy and manage applications on cloud development platforms such as AWS and Beacon, leveraging their full capabilities to support our solutions.
- Database Management: Architect, implement, and optimize database solutions using Snowflake to ensure data integrity and performance.
- Stakeholder Collaboration: Engage directly with Tech Owners and Business Owners to gather requirements, provide progress updates, and ensure alignment with business objectives.
- Ownership & Initiative: Take full ownership of projects, driving them from conception through to completion with minimal supervision.
- Software Best Practices: Implement and uphold software development best practices, including version control, automated testing, code reviews, and CI/CD pipelines.
- GenAI Tools Utilization: Utilize GenAI tools such as GitHub Copilot to enhance coding efficiency, streamline workflows, and maintain high code quality.
- Problem-Solving: Proactively identify, troubleshoot, and resolve technical issues, ensuring timely delivery of solutions.

Posted 3 weeks ago

Apply

4.0 - 7.0 years

4 - 7 Lacs

Pune

Work from Office

Naukri logo

Designation: Senior Backend Developer (Python, Django, AWS) - Onsite in Pune!
Are you a Python & Django pro who loves solving complex problems and building scalable applications? Do you thrive in a fast-paced environment where your work makes an impact? If yes, we'd love to have you on board!
What's in it for you:
- Work on cutting-edge tech with Python, Django & AWS
- Build and optimize high-performance backend systems
- Collaborate with a passionate team of developers
- Be part of a growing SaaS company making waves in hyperlocal tech
Who we're looking for:
- Experience: 4-7 years in backend development
- Strong skills in Python, Django, AWS, RESTful APIs
- Hands-on with databases (PostgreSQL/MySQL), Celery, Redis
- Comfortable with CI/CD, Git, and asynchronous programming
- Onsite role in Pune (male candidates preferred)
- Location: Yerwada, Pune
Key Responsibilities:
- Develop, test, and maintain robust and scalable backend applications using Python and Django
- Design and implement APIs (RESTful services) to ensure seamless integration with front-end systems
- Optimize database performance and manage PostgreSQL/MySQL databases
- Work with AWS services to deploy and scale applications in a cloud environment
- Utilize Celery for task management and Redis for caching and message brokering
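For context on the Celery-plus-Redis pieces this posting lists, a minimal hedged sketch; the broker/backend URLs and the task body are placeholders, not the employer's code.

```python
# Celery app using Redis as both message broker and result backend,
# with a retrying task. URLs and the task body are hypothetical.
from celery import Celery

app = Celery(
    "backend",
    broker="redis://localhost:6379/0",   # Redis as message broker
    backend="redis://localhost:6379/1",  # Redis as result store
)

@app.task(bind=True, max_retries=3)
def send_order_confirmation(self, order_id: int) -> str:
    try:
        # ... call the mail service here (placeholder) ...
        return f"confirmation queued for order {order_id}"
    except Exception as exc:
        # Retry up to 3 times, waiting 30 seconds between attempts
        raise self.retry(exc=exc, countdown=30)
```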

Posted 3 weeks ago

Apply

4.0 - 5.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Naukri logo

About the Role - We are seeking a highly skilled and experienced Senior Data Scientist to join our data science team. - As a Senior Data Scientist, you will play a critical role in driving data-driven decision making across the organization by developing and implementing advanced analytical solutions. - You will leverage your expertise in data science, machine learning, and statistical analysis to uncover insights, build predictive models, and solve complex business challenges. Key Responsibilities - Develop and implement statistical and machine learning models (e.g., regression, classification, clustering, time series analysis) to address business problems. - Analyze large and complex datasets to identify trends, patterns, and anomalies. - Develop predictive models for forecasting, churn prediction, customer segmentation, and other business outcomes. - Conduct A/B testing and other experiments to optimize business decisions. - Communicate data insights effectively through visualizations, dashboards, and presentations. - Develop and maintain interactive data dashboards and reports. - Present findings and recommendations to stakeholders in a clear and concise manner. - Work with data engineers to design and implement data pipelines and data warehousing solutions. - Ensure data quality and integrity throughout the data lifecycle. - Develop and maintain data pipelines for data ingestion, transformation, and loading. - Stay up-to-date with the latest advancements in data science, machine learning, and artificial intelligence. - Research and evaluate new technologies and tools to improve data analysis and modeling capabilities. - Explore and implement new data science techniques and methodologies. - Collaborate effectively with data engineers, business analysts, product managers, and other stakeholders. - Communicate technical information clearly and concisely to both technical and non-technical audiences. Qualifications Essential - 4+ years of experience as a Data Scientist or in a related data science role. - Strong proficiency in statistical analysis, machine learning algorithms, and data mining techniques. - Experience with programming languages like Python (with libraries like scikit-learn, pandas, NumPy) or R. - Experience with data visualization tools (e.g., Tableau, Power BI). - Experience with data warehousing and data lake technologies. - Excellent analytical, problem-solving, and communication skills. - Master's degree in Statistics, Mathematics, Computer Science, or a related field.

Posted 3 weeks ago

Apply

4.0 - 7.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Naukri logo

Role: Python Developer. Location: Bangalore. Experience: 4 - 7 Yrs. Employment Type: Full Time. Working Mode: Regular. Notice Period: Immediate - 15 Days.
About the Role: We are seeking a skilled Python Developer to join our dynamic team and contribute to the development of innovative data-driven solutions. The ideal candidate will have a strong foundation in Python programming, data analysis, and data processing techniques. This role will involve working with various data sources, including Redis, MongoDB, SQL, and Linux, to extract, transform, and analyze data for valuable insights. You will also be responsible for developing and maintaining efficient and scalable data pipelines and visualizations using tools like matplotlib and seaborn. Additionally, experience with web development frameworks such as Flask, FastAPI, or Django, as well as microservices architecture, will be a significant advantage.
Key Responsibilities:
- Design, develop, and maintain efficient and scalable data pipelines to extract, transform, and load (ETL) data from various sources, including Redis, MongoDB, SQL, and Linux.
- Conduct in-depth data analysis and processing using Python libraries and tools to uncover valuable insights and trends.
- Develop and maintain data visualizations using matplotlib, seaborn, or other relevant tools to effectively communicate findings to stakeholders.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Develop and maintain web applications using Python frameworks like Flask, FastAPI, or Django, adhering to best practices and coding standards.
- Design and implement microservices architecture to build scalable and modular systems.
- Troubleshoot and resolve technical issues related to data pipelines, applications, and infrastructure.
- Stay updated with the latest trends and technologies in the data engineering and Python development landscape.
Required Skills and Qualifications:
- Strong proficiency in Python programming, including object-oriented programming and functional programming concepts.
- Experience with data analysis and processing libraries such as pandas, NumPy, and scikit-learn.
- Familiarity with data storage and retrieval technologies, including Redis, MongoDB, SQL, and Linux.
- Knowledge of data visualization tools like matplotlib and seaborn.
- Experience with web development frameworks such as Flask, FastAPI, or Django.
- Understanding of microservices architecture and principles.
- Excellent problem-solving and analytical skills.
- Ability to work independently and as part of a team.
- Strong communication and interpersonal skills.
Preferred Skills and Qualifications:
- Experience with cloud platforms (AWS, GCP, Azure).
- Knowledge of containerization technologies (Docker, Kubernetes).
- Familiarity with data warehousing and data lake concepts.
- Experience with machine learning and deep learning frameworks (TensorFlow, PyTorch).

Posted 3 weeks ago

Apply

1.0 - 3.0 years

2 - 5 Lacs

Bengaluru

Work from Office

Naukri logo

Responsibilities : - Gather data from various sources (databases, spreadsheets, APIs, etc.). - Identify, clean, and transform data to ensure accuracy, consistency, and integrity. - Develop and maintain data pipelines and processes for efficient data handling. - Conduct exploratory data analysis to identify trends, patterns, correlations, and anomalies. - Apply statistical techniques and data visualization tools to analyze datasets. - Interpret data and provide meaningful insights and recommendations. - Develop and maintain reports, dashboards, and other data visualizations to communicate findings effectively. - Work closely with stakeholders from different departments to understand their data needs and business questions. - Present data findings and insights in a clear and concise manner to both technical and non-technical audiences. - Participate in discussions and contribute to data-driven decision-making processes. - Document data sources, methodologies, and analysis processes. - Utilize data analysis tools and software such as Excel, SQL, and data visualization platforms (e.g., Tableau, Power BI). - Learn and adapt to new data analysis tools and technologies as needed. - May involve basic scripting or programming for data manipulation (e.g., Python). - Identify opportunities to improve data collection, analysis, and reporting processes. - Stay updated on the latest trends and best practices in data analysis. - Contribute to the development of data governance and quality standards. Qualifications - Bachelor's degree in a quantitative field such as Statistics, Mathematics, Economics, Computer Science, or a related discipline. - 1-3 years of professional experience in a data analysis role. - Strong understanding of statistical concepts and data analysis techniques. - Proficiency in SQL for querying and manipulating data from databases. - Excellent skills in Microsoft Excel, including advanced formulas and data manipulation techniques. - Experience with at least one data visualization tool such as Tableau, Power BI, or similar. - Ability to interpret data, identify patterns, and draw meaningful conclusions. - Strong analytical and problem-solving skills. - Excellent communication and presentation skills, with the ability to explain technical findings to non-technical audiences. - Strong attention to detail and a commitment to data accuracy. - Ability to work independently and as part of a team. Preferred Skills - Experience with programming languages such as Python (especially libraries like Pandas, NumPy, Matplotlib, Seaborn). - Familiarity with cloud-based data platforms (e.g., AWS, Azure, GCP). - Experience with data warehousing concepts. - Knowledge of statistical software packages (e.g., R, SPSS). - Experience with different data modeling techniques. - Exposure to machine learning concepts. - Experience working with specific industry data (e.g., marketing, sales, finance).
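As an illustration of the Pandas-based exploratory analysis such a role typically involves, a small sketch; the file name and column names are hypothetical.

```python
# Quick exploratory pass over a hypothetical sales dataset.
import pandas as pd

df = pd.read_csv("sales.csv", parse_dates=["order_date"])

# Basic data-quality checks
print(df.isna().sum())         # missing values per column
print(df.duplicated().sum())   # duplicate rows

# Monthly revenue trend and top categories
monthly = df.set_index("order_date")["revenue"].resample("M").sum()
top_categories = (
    df.groupby("category")["revenue"].sum().sort_values(ascending=False).head(5)
)
print(monthly.tail())
print(top_categories)
```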

Posted 3 weeks ago

Apply

5.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Naukri logo

About The Role : As a Python Developer, you will play a critical role in our software development and data engineering initiatives. You will work closely with data engineers, architects, and other developers to build and maintain our applications and data pipelines. Your expertise in Python development, API design, and cloud technologies will be essential to your success. Responsibilities : - Design, develop, and maintain applications using the latest Python frameworks and technologies (Django, Flask, FastAPI). - Utilize Python libraries and tools (Pandas, NumPy, SQLAlchemy) for data manipulation and analysis. - Develop and maintain RESTful APIs, ensuring security, authentication, and authorization (OAuth, JWT). - Deploy, manage, and scale applications on AWS services (EC2, S3, RDS, Lambda). - Utilize infrastructure-as-code tools (Terraform, CloudFormation) for infrastructure management (good to have). - Design and develop database solutions using PL/SQL (Packages, Functions, Ref cursors). - Implement data normalization and Oracle performance optimization techniques. - Design and develop data warehouse solutions, including data marts and ODS concepts. - Implement low-level design of warehouse solutions. - Work with Kubernetes for container orchestration, deploying, managing, and scaling applications on Kubernetes clusters. - Utilize the SnapLogic cloud-native integration platform for designing and implementing integration pipelines. Required Skills : - Expertise in Python frameworks (Django, Flask, FastAPI). - Proficiency in Python libraries (Pandas, NumPy, SQLAlchemy). - Strong experience in designing, developing, and maintaining RESTful APIs. - Familiarity with API security, authentication, and authorization mechanisms (OAuth, JWT). - Good experience and hands-on knowledge of PL/SQL (Packages/Functions/Ref cursors). - Knowledge of data normalization and Oracle performance optimization techniques. - Experience in development & low-level design of warehouse solutions. - Familiarity with Data Warehouse, Datamart and ODS concepts. - Proficiency in AWS services (EC2, S3, RDS, Lambda). Good to Have Skills : Kubernetes : - Hands-on experience with Kubernetes for container orchestration. Infrastructure as Code : - Experience with infrastructure-as-code tools (Terraform, CloudFormation). Integration Platforms : - Experience with the SnapLogic cloud-native integration platform. Experience : - 5 to 8 years of experience as a Python Developer. Location : - Bangalore or Gurgaon.

Posted 3 weeks ago

Apply

4.0 - 6.0 years

3 - 6 Lacs

Bengaluru

Work from Office

Naukri logo

Welcome to Awign Expert, a division of Awign - India's largest work-as-a-service platform. At Awign Expert, we connect skilled professionals with exciting contractual or project-based work opportunities offered by top companies. Our mission is to empower professionals by matching them with projects that align with their skills, interests, and experience. At Awign Expert, we understand the challenges faced by independent professionals in finding meaningful work and managing administrative tasks. That's why we act as a dedicated HR office for our Experts, handling the entire onboarding process, providing continuous feedback, resolving conflicts, and ensuring seamless payroll management. Our goal is to create a hassle-free environment for our Experts, allowing them to focus solely on their work without the burden of administrative complexities. By partnering with Awign Expert, professionals gain access to a vast network of renowned companies and projects across various industries. Duration: 6 Months. Location: Bangalore. Timings: General IST. Notice Period: Within 15 days or immediate joiner. Experience: 4 - 6 Years. About the Role: We are seeking a Backend Developer with strong Python expertise to join our dynamic team. You will contribute to high-impact projects involving quantitative models, risk frameworks, and trade optimization strategies. This role requires hands-on experience in developing scalable systems, building robust databases, and implementing complex algorithms. Location Requirement: This role is based in Bengaluru. Candidates must be open to face-to-face interviews and working on-site. Key Responsibilities: Quantitative Solutions Development: - Design and implement quantitative models for portfolio analysis and alpha generation. - Develop robust risk models to optimize trade execution and performance. Back-Testing & Data Management: - Build comprehensive back-testing infrastructure for strategy validation and performance monitoring. - Design and maintain databases with automated updates, anomaly detection, and job monitoring. Custom Data Collection & Analytics: - Develop advanced web scrapers for collecting datasets for research and analysis. - Create analytics frameworks and interactive dashboards to visualize portfolio performance and research insights. Required Skills: Education: - Bachelor's Degree in Computer Science or related field. Technical Proficiency: - 4-6 years of professional software development experience. - Strong programming skills in Python with a focus on object-oriented design and algorithms. - Proficiency in database technologies (SQL, PostgreSQL). - Understanding of system architecture, design patterns, and scalability. Preferred Skills: - Master's Degree in Computer Science or related field. - Experience in software engineering best practices (coding standards, code reviews, testing, source control). - Familiarity with cloud services like AWS, Azure, or GCP. - Exposure to fin-tech, quantitative models, or data-driven applications. - Knowledge of data visualization tools (Tableau, Power BI) and Python libraries (Pandas, NumPy, Matplotlib).

Posted 3 weeks ago

Apply

4.0 - 5.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Naukri logo

About The Role : As a Banking Data Analyst, you will play a critical role in our clients' data-driven decision-making processes. You will work closely with marketing teams, product managers, and other stakeholders to understand their data needs and provide relevant analyses. Your expertise in SQL, marketing analytics, and banking domain knowledge will be essential to your success. Responsibilities : - Perform in-depth data analysis using SQL to extract and analyze banking data. - Develop and maintain reports and dashboards to track key performance indicators (KPIs). - Identify trends and patterns in banking data to provide actionable insights. - Analyze marketing campaign performance and provide recommendations for optimization. - Segment customer data to identify target audiences for marketing initiatives. - Develop and implement customer lifetime value (CLTV) models. - Apply knowledge of banking products and services to analyze data and provide relevant insights. - Focus on data related to deposits and home lending (good to have). - Visualize data using tools like Jupyter Notebook to communicate insights effectively. - Present findings to stakeholders in a clear and concise manner. - Ensure data accuracy and integrity. - Identify and resolve data quality issues. - Utilize Python for data analysis and automation tasks (good to have). Required Skills : - Very strong SQL skills for data extraction and analysis. - Proven experience in marketing analytics, including campaign analysis and customer segmentation. - Experience in the banking industry, with a focus on deposits and home lending (good to have). - Experience with Jupyter Notebook for data visualization and reporting. Good to Have Skills : Python Programming : - Experience with Python for data analysis and automation.

Posted 3 weeks ago

Apply

0.0 years

4 - 6 Lacs

Hyderabad

Work from Office

Naukri logo

AccioJob is conducting a Walk-In Hiring Drive with Gaian Solutions for the AI/ML Intern position. To apply, register and select your slot here: https://go.acciojob.com/WQFGp2 We will not consider your application if you do not register and select a slot via the above link. Required Skills: Python, SQL, ML libraries (scikit-learn, pandas, TensorFlow, etc.) Eligibility: Degree: BTech/MTech/BCA/MCA. Branches: All branches. Year of Graduation: 2023, 2024, 2025. Work Details: Work Location: Hyderabad (Work From Office). Stipend: 20-25k per month (for 3 months). CTC: 4.5 LPA - 6 LPA. Evaluation Process: Round 1: Offline Assessment at AccioJob Skill Centre in Pune - 5th Floor, Office No - 7, Survey Number: 116, Road: H No 3/1/1, Aria Tower, Bhosale Farm, Baner, 411007. Further Rounds (for shortlisted candidates only): 2 Technical Interview Rounds. Important Note: Bring your laptop & earphones for the test. Register here: https://go.acciojob.com/WQFGp2

Posted 3 weeks ago

Apply

4.0 - 7.0 years

12 - 17 Lacs

Gurugram

Remote

Naukri logo

Role Characteristics: The Analytics team provides analytical support to multiple stakeholders (Product, Engineering, Business Development, Ad Operations) by developing scalable analytical solutions, identifying problems, coming up with KPIs and monitoring those to measure the impact/success of product improvements/changes, and streamlining processes. This will be an exciting and challenging role that will enable you to work with large data sets, expose you to cutting-edge analytical techniques, work with the latest AWS analytics infrastructure (Redshift, S3, Athena), and gain experience in the usage of location data to drive businesses. Working in a dynamic start-up environment will give you significant opportunities for growth within the organization. A successful applicant will be passionate about technology and developing a deep understanding of human behavior in the real world. They would also have excellent communication skills, be able to synthesize and present complex information, and be a fast learner.
You Will: Perform root cause analysis with minimum guidance to figure out reasons for sudden changes/abnormalities in metrics. Understand the objective/business context of various tasks and seek clarity by collaborating with different stakeholders (like Product and Engineering). Derive insights and put them together to build a story to solve a given problem. Suggest ways for process improvements in terms of script optimization and automating repetitive tasks. Create and automate reports and dashboards through Python to track certain metrics based on given requirements.
Technical Skills (Must have): B.Tech degree in Computer Science, Statistics, Mathematics, Economics or related fields. 4-6 years of experience in working with data and conducting statistical and/or numerical analysis. Ability to write SQL code. Scripting/automation using Python. Hands-on experience in a data visualisation tool like Looker/Tableau/Quicksight. Basic to advanced level understanding of statistics.
Other Skills (Must have): Be willing and able to quickly learn about new businesses, database technologies and analysis techniques. Strong oral and written communication. Understanding of patterns/trends and the ability to draw insights from those.
Preferred Qualifications (Nice to have): Experience working with large datasets. Experience with AWS analytics infrastructure (Redshift, S3, Athena, Boto3). Hands-on experience with AWS services like Lambda, Step Functions, Glue, EMR, plus exposure to PySpark.
What we offer: At GroundTruth, we want our employees to be comfortable with their benefits so they can focus on doing the work they love. Parental leave - Maternity and Paternity. Flexible Time Offs (Earned Leaves, Sick Leaves, Birthday Leave, Bereavement Leave & Company Holidays). In-Office Daily Catered Lunch. Fully stocked snacks/beverages. Health cover for any hospitalization.
Covers both nuclear family and parents Tele-med for free doctor consultation, discounts on health checkups and medicines Wellness/Gym Reimbursement Pet Expense Reimbursement Childcare Expenses and reimbursements Employee assistance program Employee referral program Education reimbursement program Skill development program Cell phone reimbursement (Mobile Subsidy program) Internet reimbursement Birthday treat reimbursement Employee Provident Fund Scheme offering different tax saving options such as VPF and employee and employer contribution up to 12% Basic Creche reimbursement Co-working space reimbursement NPS employer match Meal card for tax benefit Special benefits on salary account We are an equal opportunity employer and value diversity, inclusion and equity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

Posted 3 weeks ago

Apply

3.0 - 5.0 years

3 - 7 Lacs

Mumbai

Work from Office

Naukri logo

Job Summary: We are seeking a highly analytical and detail-oriented Data Specialist with deep expertise in SQL, Python, statistics, and automation. The ideal candidate will be responsible for designing robust data pipelines, analyzing large datasets, driving insights through statistical methods, and automating workflows to enhance data accessibility and business decision-making. Key Responsibilities - Write and optimize complex SQL queries for data extraction, transformation, and reporting. - Develop and maintain Python scripts for data analysis, ETL processes, and automation tasks. - Conduct statistical analysis to identify trends, anomalies, and actionable insights. - Build and manage automated dashboards and data pipelines using tools such as Airflow, Pandas, or Apache Spark. - Collaborate with cross-functional teams (product, engineering, business) to understand data needs and deliver scalable solutions. - Implement data quality checks and validation procedures to ensure accuracy and consistency. - Support machine learning model deployment and performance tracking (if applicable). - Document data flows, models, and processes for internal knowledge sharing. Key Requirements - Strong proficiency in SQL (joins, CTEs, window functions, performance tuning). - Solid experience with Python (data manipulation using Pandas, NumPy, scripting, and automation). - Applied knowledge of statistics (hypothesis testing, regression, probability, distributions). - Experience with data automation tools (Airflow, dbt, or equivalent). - Familiarity with data visualization tools (Tableau, Power BI, or Plotly) is a plus. - Understanding of data warehousing concepts (e.g., Snowflake, BigQuery, Redshift). - Strong problem-solving skills and the ability to work independently. Preferred Qualifications - Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a related field. - Exposure to cloud platforms like AWS, GCP, or Azure.
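To illustrate the SQL skills called out above (CTEs and window functions), a small self-contained sketch run against an in-memory SQLite database with invented data; it assumes SQLite 3.25+ for window-function support.

```python
# CTE plus a running-total window function over hypothetical order data.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (region TEXT, order_date TEXT, revenue REAL);
    INSERT INTO orders VALUES
        ('North', '2024-01-05', 120.0),
        ('North', '2024-01-06', 80.0),
        ('South', '2024-01-05', 200.0),
        ('South', '2024-01-07', 50.0);
""")

query = """
WITH daily AS (
    SELECT region, order_date, SUM(revenue) AS day_revenue
    FROM orders
    GROUP BY region, order_date
)
SELECT region,
       order_date,
       day_revenue,
       SUM(day_revenue) OVER (
           PARTITION BY region ORDER BY order_date
       ) AS running_total
FROM daily;
"""
for row in conn.execute(query):
    print(row)
```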

Posted 3 weeks ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Chennai

Work from Office

Naukri logo

Key Responsibilities : - Develop, deploy, and maintain scalable web applications using Python (Flask/Django). - Design and implement RESTful APIs with strong security and authentication mechanisms. - Work with MongoDB and other database management systems to store and query data efficiently. - Support and productize Machine Learning models, including feature engineering, training, tuning, and scoring. - Understand and apply distributed computing concepts to build high-performance systems. - Handle web hosting and deployment of applications, ensuring uptime and performance. - Collaborate with stakeholders to translate business requirements into technical solutions. - Communicate effectively with both technical and non-technical team members. - Take ownership of projects, troubleshoot production issues, and implement solutions proactively. Required Skills & Qualifications : - 3-5 years of experience in Python development, primarily with Flask (Django experience is a plus). - Solid knowledge of distributed systems and web architectures. - Hands-on experience with Machine Learning workflows and model deployment. - Experience with MongoDB and other database technologies. - Strong knowledge of RESTful API development and security best practices. - Excellent problem-solving skills and the ability to work independently. - Strong communication skills to clearly explain technical concepts to a diverse audience. Nice to Have : - Bachelor's or Master's degree in Computer Science, IT, or a related field. - Experience with data manipulation using Pandas, Spark, and handling large datasets. - Familiarity with Django framework in addition to Flask.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

About Us: Omni's team is passionate about Commerce and Digital Transformation. We've been successfully delivering Commerce solutions for clients across North America, Europe, Asia, and Australia. The team has experience executing and delivering projects in B2B and B2C solutions.
JOB DESCRIPTION: We are seeking a high-impact AI/ML Engineer to lead the design, development, and deployment of machine learning and AI solutions across vision, audio, and language modalities. You'll be part of a fast-paced, outcome-oriented AI & Analytics team, working alongside data scientists, engineers, and product leaders to transform business use cases into real-time, scalable AI systems. This role demands strong technical leadership, a product mindset, and hands-on expertise in Computer Vision, Audio Intelligence, and Deep Learning.
Key Responsibilities: Architect, develop, and deploy ML models for multimodal problems, including vision (image/video), audio (speech/sound), and NLP tasks. Own the complete ML lifecycle: data ingestion, model development, experimentation, evaluation, deployment, and monitoring. Leverage transfer learning, foundation models, or self-supervised approaches where suitable. Design and implement scalable training pipelines and inference APIs using frameworks like PyTorch or TensorFlow. Collaborate with MLOps, data engineering, and DevOps to productionize models using Docker, Kubernetes, or serverless infrastructure. Continuously monitor model performance and implement retraining workflows to ensure accuracy over time. Stay ahead of the curve on cutting-edge AI research (e.g., generative AI, video understanding, audio embeddings) and incorporate innovations into production systems. Write clean, well-documented, and reusable code to support agile experimentation and long-term platform sustainability.
Requirements: Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Data Science, or a related field. 5-8+ years of experience in AI/ML Engineering, with at least 3 years in applied deep learning.
Technical Skills: Languages: Expert in Python; good knowledge of R or Java is a plus. ML/DL Frameworks: Proficient with PyTorch, TensorFlow, Scikit-learn, ONNX. Computer Vision: Image classification, object detection, OCR, segmentation, tracking (YOLO, Detectron2, OpenCV, MediaPipe). Audio AI: Speech recognition (ASR), sound classification, audio embedding models (Wav2Vec2, Whisper, etc.). Data Engineering: Strong with Pandas, NumPy, SQL, and preprocessing pipelines for structured and unstructured data. NLP/LLMs: Working knowledge of Transformers, BERT/LLaMA, and the Hugging Face ecosystem is preferred. Cloud & MLOps: Experience with AWS/GCP/Azure, MLflow, SageMaker, Vertex AI, or Azure ML. Deployment & Infrastructure: Experience with Docker, Kubernetes, REST APIs, serverless ML inference. CI/CD & Version Control: Git, DVC, ML pipelines, Jenkins, Airflow, etc.
Soft Skills & Competencies: Strong analytical and systems thinking; able to break down business problems into ML components. Excellent communication skills; able to explain models, results, and decisions to non-technical stakeholders. Proven ability to work cross-functionally with designers, engineers, product managers, and analysts. Demonstrated bias for action, rapid experimentation, and iterative delivery of impact.

Posted 3 weeks ago

Apply

3.0 - 6.0 years

5 - 8 Lacs

Bengaluru

Work from Office

Naukri logo

Reference: 250008SV

Responsibilities - MLOps Engineer
You will be responsible for the deployment and maintenance of the group data science platform infrastructure, on which data science pipelines are deployed and scaled. To achieve this, you will collaborate with Data Scientists and Data Engineers from various business lines and the Global Technology Service infrastructure team (GTS).

Roles:
- Implement techniques and processes for supporting the development and scaling of data science pipelines.
- Industrialize inference, retraining, and monitoring of data science pipelines, ensuring their maintainability and compliance.
- Provide platform support to end-users.
- Be attentive to the needs and requirements expressed by the end-users.
- Anticipate needs and necessary developments for the platform.
- Work closely with Data Scientists, Data Engineers, and business stakeholders.
- Stay updated and demonstrate a keen interest in the MLOps domain.

Environment:
- Cloud / on-premise: Azure
- Python, Kubernetes
- Integrated vendor solutions: Dataiku, Snowflake
- DB: PostgreSQL
- Distributed computing: Spark
- Big Data: Hadoop, S3/Scality, MapR
- Data science: Scikit-learn, Transformers, MLflow, Kedro
- DevOps, CI/CD: JFrog, Harbor, GitHub Actions, Jenkins
- Monitoring: Elasticsearch/Kibana, Grafana, Zabbix
- Agile ceremonies: PI planning, Sprint, Sprint Review, Refinement, Retrospectives
- ITIL framework

Profile required

Technical Skills:
- Python: FastAPI, SQLAlchemy, NumPy, Pandas, Scikit-learn, Transformers (a minimal model-serving sketch follows this listing)
- Kubernetes, Docker
- Pytest
- CI/CD: Jenkins, Ansible, GitHub Actions, Harbor, Docker

Soft Skills:
- Client Focus: Demonstrate strong listening skills, understanding, and anticipation of user needs.
- Team Spirit: Organize collaboration and workshops to find the best solutions; share expertise with colleagues to find the most suitable solutions.
- Innovation: Propose innovative ideas, solutions, or strategies, and think outside the box; prefer simplicity over complexity.
- Responsibility: Take ownership, keep commitments, and respect deadlines.

Why join us
We are committed to creating a diverse environment and are proud to be an equal opportunity employer. All qualified applicants receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.

Business insight
At Societe Generale, we are convinced that people are drivers of change, and that the world of tomorrow will be shaped by all their initiatives, from the smallest to the most ambitious. Whether you're joining us for a period of months, years, or your entire career, together we can have a positive impact on the future. Creating, daring, innovating, and taking action are part of our DNA. If you too want to be directly involved, grow in a stimulating and caring environment, feel useful on a daily basis, and develop or strengthen your expertise, you will feel right at home with us!
Still hesitating? You should know that our employees can dedicate several days per year to solidarity actions during their working hours, including sponsoring people struggling with their orientation or professional integration, participating in the financial education of young apprentices, and sharing their skills with charities. There are many ways to get involved. We are committed to supporting and accelerating our Group's ESG strategy by implementing ESG principles in all our activities and policies. These are translated into our business activity (ESG assessment, reporting, project management, or IT activities), our work environment, and our responsible practices for environmental protection.

Diversity and Inclusion
We are an equal opportunities employer and we are proud to make diversity a strength for our company. Societe Generale is committed to recognizing and promoting all talents, regardless of their beliefs, age, disability, parental status, ethnic origin, nationality, gender identity, sexual orientation, membership of a political, religious, trade union or minority organisation, or any other characteristic that could be subject to discrimination.
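As a minimal sketch of the FastAPI plus scikit-learn stack named in the technical skills, the snippet below trains a toy model at startup and exposes a prediction route. The route, feature names, and toy dataset are illustrative; a real platform would load a versioned artifact (for example from an MLflow registry) instead of training in-process.

```python
# Minimal sketch: serving a scikit-learn model behind a FastAPI endpoint
# (model, feature names, and route are illustrative placeholders).
from fastapi import FastAPI
from pydantic import BaseModel
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

app = FastAPI()

# Train a toy model at import time; a real pipeline would load a stored, versioned model.
X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000).fit(X, y)

class Features(BaseModel):
    sepal_length: float
    sepal_width: float
    petal_length: float
    petal_width: float

@app.post("/predict")
def predict(f: Features):
    row = [[f.sepal_length, f.sepal_width, f.petal_length, f.petal_width]]
    return {"predicted_class": int(model.predict(row)[0])}

# Run locally with: uvicorn inference_api:app --reload
# (assuming this file is saved as inference_api.py)
```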

Posted 3 weeks ago

Apply

Exploring Pandas Jobs in India

The job market for pandas professionals in India is on the rise as more companies are recognizing the importance of data analysis and manipulation in making informed business decisions. Pandas, a popular Python library for data manipulation and analysis, is a valuable skill sought after by many organizations across various industries in India.

Top Hiring Locations in India

Here are 5 major cities in India actively hiring for pandas roles:
1. Bangalore
2. Mumbai
3. Delhi
4. Hyderabad
5. Pune

Average Salary Range

The average salary range for pandas professionals in India varies based on experience levels. Entry-level positions can expect a salary ranging from ₹4-6 lakhs per annum, while experienced professionals can earn upwards of ₹12-18 lakhs per annum.

Career Path

Career progression in the pandas domain typically starts with roles such as Junior Data Analyst or Junior Data Scientist, advances to Senior Data Analyst or Senior Data Scientist, and eventually leads to roles like Tech Lead or Data Science Manager.

Related Skills

In addition to pandas, professionals in this field are often expected to have knowledge or experience in the following areas (a short sketch combining two of them follows the list):
- Python programming
- Data visualization tools like Matplotlib or Seaborn
- Statistical analysis
- Machine learning algorithms
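A short sketch combining two of these related skills, pandas and Matplotlib, might look like the following; the city counts are invented for illustration and are not real job-market figures.

```python
# Quick sketch combining pandas with Matplotlib (data values are made up for illustration).
import pandas as pd
import matplotlib.pyplot as plt

df = pd.DataFrame({
    "city": ["Bangalore", "Mumbai", "Delhi", "Hyderabad", "Pune"],
    "openings": [120, 85, 70, 95, 60],   # hypothetical counts, not real job data
})

summary = df.sort_values("openings", ascending=False)
print(summary.describe())                # basic statistics on the numeric column

summary.plot(kind="bar", x="city", y="openings", legend=False, title="Openings by city")
plt.tight_layout()
plt.show()
```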

Interview Questions

Here are 25 interview questions for pandas roles (a brief code sketch illustrating a few of them follows the list):
- What is pandas in Python? (basic)
- Explain the difference between Series and DataFrame in pandas. (basic)
- How do you handle missing data in pandas? (basic)
- What are the different ways to create a DataFrame in pandas? (medium)
- Explain groupby() in pandas with an example. (medium)
- What is the purpose of pivot_table() in pandas? (medium)
- How do you merge two DataFrames in pandas? (medium)
- What is the significance of the inplace parameter in pandas functions? (medium)
- What are the advantages of using pandas over Excel for data analysis? (advanced)
- Explain the apply() function in pandas with an example. (advanced)
- How do you optimize performance in pandas operations for large datasets? (advanced)
- What is method chaining in pandas? (advanced)
- Explain the working of the cut() function in pandas. (medium)
- How do you handle duplicate values in a DataFrame using pandas? (medium)
- What is the purpose of the nunique() function in pandas? (medium)
- How can you handle time series data in pandas? (advanced)
- Explain the concept of multi-indexing in pandas. (advanced)
- How do you filter rows in a DataFrame based on a condition in pandas? (medium)
- What is the role of the read_csv() function in pandas? (basic)
- How can you export a DataFrame to a CSV file using pandas? (basic)
- What is the purpose of the describe() function in pandas? (basic)
- How do you handle categorical data in pandas? (medium)
- Explain the role of the loc and iloc functions in pandas. (medium)
- How do you perform text data analysis using pandas? (advanced)
- What is the significance of the to_datetime() function in pandas? (medium)
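To make a few of these questions concrete, here is a small self-contained sketch covering missing data, groupby(), merging two DataFrames, and loc versus iloc; all column names and values are invented.

```python
# Small pandas walkthrough touching on several of the questions above
# (missing data, groupby, merge, loc/iloc); all data is invented.
import numpy as np
import pandas as pd

sales = pd.DataFrame({
    "region": ["North", "South", "North", "East", "South"],
    "amount": [250.0, np.nan, 310.0, 120.0, 95.0],
})

# Handling missing data: fill NaNs with the column mean (one of several options).
sales["amount"] = sales["amount"].fillna(sales["amount"].mean())

# groupby(): total amount per region.
totals = sales.groupby("region", as_index=False)["amount"].sum()

# Merging two DataFrames on a common key.
managers = pd.DataFrame({"region": ["North", "South", "East"],
                         "manager": ["Asha", "Ravi", "Meera"]})
report = totals.merge(managers, on="region", how="left")

# loc (label/condition-based) vs iloc (position-based) selection.
print(report.loc[report["amount"] > 200, ["region", "manager"]])
print(report.iloc[0])
```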

Prepare and Apply Confidently

As you explore pandas jobs in India, remember to enhance your skills, stay updated with industry trends, and practice answering interview questions to increase your chances of securing a rewarding career in data analysis. Best of luck on your job search journey!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

