
81963 Python Jobs - Page 50

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 years

0 Lacs

India

Remote

Source: LinkedIn

About Company: Turing is a US-based AI infrastructure company focused on accelerating the development and deployment of AI systems. It connects companies with AI experts and provides end-to-end AI solutions; initially known for its Intelligent Talent Cloud, it now offers services for designing, training, and deploying advanced AI systems, with a focus on advancing frontier AI model capabilities and building real-world AI applications.

Job Title: Python Developer with Test-Driven Development (TDD)
Location: India (Remote)
Experience: 6+ yrs.
Employment Type: Contract to hire
Work Mode: WFH
Notice Period: Immediate joiners
Note: Candidates should be comfortable working US/night shifts
Interview Mode: Virtual; two rounds (60-min technical + 30-min technical & cultural discussion)
Client: Turing

Roles and Responsibilities:
- Implement real-world services as modular, production-ready APIs.
- Create and maintain excellent, developer-friendly documentation.
- Follow a strict test-driven development approach: tests first, code second.
- Build at maximum velocity without compromising reliability or maintainability.
- Design and iterate on scalable database schemas for each service integration.

Requirements:
- 5+ years of experience with practical, production-grade Python.
- Hands-on experience with test-driven development (TDD).
- Proven ability to build and scale large systems at high velocity.
- Strong fundamentals in database schema design and data modelling.
- Able to work at a rapid pace without sacrificing clarity or correctness.
- [Bonus] Familiarity with LLM function-calling protocols and paradigms.
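As a rough illustration of the tests-first, code-second workflow this listing asks for, here is a minimal sketch in plain Python (the `slugify` function and its test are hypothetical examples, not part of the role):

```python
# Test-driven development sketch: the test is written first (and would fail),
# then just enough implementation is added to make it pass.

def test_slugify():
    # Step 1: the test states the desired behaviour before any code exists.
    assert slugify("Hello World") == "hello-world"
    assert slugify("  Python  TDD  ") == "python-tdd"

def slugify(text: str) -> str:
    # Step 2: the minimal implementation that satisfies the test.
    return "-".join(text.lower().split())

test_slugify()  # passes once the implementation exists
```

In real projects the test would live in a separate file and run under a framework such as pytest; the point is only the ordering of the two steps.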

Posted 1 day ago

Apply

6.0 years

0 Lacs

India

On-site

Source: LinkedIn

Job Title: SAP Performance Engineer
Experience: 6-14 years
Locations: Chennai, Hyderabad, Bangalore, Kolkata, Pune
Employment Type: Full-time
Notice Period: Immediate joiner

Key Responsibilities:
- Conduct performance testing using tools like JMeter, LoadRunner, and Gatling.
- Monitor applications using AppDynamics, New Relic, Splunk, and Dynatrace.
- Analyze AWR reports, heap/thread dumps, and JVM tuning for optimization.
- Collaborate with dev teams to identify and resolve performance bottlenecks.
- Implement performance tuning strategies for SAP and other enterprise applications.
- Optimize infrastructure and application performance on-prem and in the cloud (AWS/Azure/GCP).

Required Skills:
- 6+ years in performance engineering plus SAP.
- Expertise in performance testing and monitoring tools.
- Strong analytical skills for performance diagnostics and tuning.
- Knowledge of scripting (Python/Java/JS) is a plus.
- Cloud experience and certifications are advantageous.
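Performance diagnostics of the kind this listing describes usually start from latency percentiles rather than averages; a minimal standard-library sketch of computing a p95 from response-time samples (the sample data is invented for illustration):

```python
import statistics

def p95(samples_ms):
    """95th-percentile latency from a list of response times in milliseconds."""
    # quantiles() with n=20 returns 19 cut points; the last one is the 95th percentile.
    return statistics.quantiles(samples_ms, n=20)[-1]

# One slow outlier barely moves the mean but dominates the p95.
samples = [12, 15, 14, 200, 16, 13, 15, 18, 14, 17]
print(f"mean: {statistics.mean(samples):.1f} ms, p95: {p95(samples):.1f} ms")
```

Tools like JMeter or LoadRunner report these percentiles directly; the sketch just shows what the number means.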

Posted 1 day ago

Apply

0 years

0 Lacs

India

Remote

Source: LinkedIn

📊 Data Analytics Intern – Remote | Transform Data Into Powerful Insights

Are you curious about how businesses use data to make smart decisions? Do you want to explore data tools, build dashboards, and develop real analytical skills? Join Skillfied Mentor as a Data Analytics Intern and start your journey into the world of data.

📍 Location: Remote / Virtual
💼 Job Type: Internship (Unpaid)
🕒 Schedule: Flexible working hours

🌟 About the Internship:
As a Data Analytics Intern, you'll work with real datasets, solve actual business problems, and gain experience using tools and techniques that are in high demand across industries.
- Perform data cleaning, processing, and basic statistical analysis
- Build dashboards and visualizations using Excel, SQL, and Power BI/Tableau
- Interpret patterns and trends to generate insights
- Collaborate with mentors and other interns in a remote team setting
- Gain exposure to real-world project work and reporting

🔍 You're a Great Fit If You:
✅ Are interested in data, analytics, and business intelligence
✅ Want to learn tools like Excel, SQL, Tableau, or Python (optional)
✅ Have no prior experience but are motivated to grow
✅ Can dedicate 5–7 hours per week (flexibly)
✅ Are proactive and comfortable working remotely

🎁 What You'll Gain:
📜 Certificate of Completion
📂 Portfolio Projects for Your Resume
🧠 Practical Data Analytics Experience
📈 Improved Skills in Visualization, Reporting, and Decision Support

⏳ Last Date to Apply: 20th June 2025

Whether you're a student, recent graduate, or switching careers, this internship gives you the foundation to build your future in Data Analytics.

👉 Apply now and unlock the power of data with Skillfied Mentor.

Posted 1 day ago

Apply

0 years

0 Lacs

India

Remote

Source: LinkedIn

Job Title: AWS Lead Engineer
Location: Remote
Employment Type: Full-time

About the Role:
We are seeking an AWS DevOps Engineer to design, deploy, and optimize a real-time data streaming platform on AWS. You will work with cutting-edge cloud technologies, ensuring scalability, security, and high performance using Kubernetes, Terraform, CI/CD, and monitoring tools.

Key Responsibilities:
✔ Design and maintain AWS-based streaming solutions (Lambda, S3, RDS, VPC)
✔ Manage Kubernetes (EKS): Helm, ArgoCD, IRSA
✔ Implement Infrastructure as Code (Terraform)
✔ Automate CI/CD pipelines (GitHub Actions)
✔ Monitor and troubleshoot using Datadog/Splunk
✔ Ensure security best practices (Snyk, SonarCloud)
✔ Collaborate with teams to integrate data products

Must-Have Skills:
🔹 AWS (IAM, Lambda, S3, VPC, CloudWatch)
🔹 Kubernetes (EKS) and Helm/ArgoCD
🔹 Terraform (IaC)
🔹 CI/CD (GitHub Actions)
🔹 Datadog/Splunk monitoring
🔹 Docker and Python/Go scripting

Nice-to-Have:
🔸 AWS certifications (DevOps/Solutions Architect)
🔸 Splunk/SDLC experience

Why Join Us?
- Work with modern cloud and DevOps tools
- Collaborative and innovative team
- Growth opportunities in AWS and DevOps
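Automation scripts in roles like this routinely wrap flaky cloud or network calls in retries; a minimal, dependency-free sketch of exponential backoff (the function names and the simulated endpoint are illustrative, not from the posting):

```python
import time

def retry_with_backoff(operation, max_attempts=4, base_delay=0.1):
    """Retry a flaky operation, sleeping 0.1s, 0.2s, 0.4s, ... between attempts."""
    for attempt in range(max_attempts):
        try:
            return operation()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts; propagate the last error
            time.sleep(base_delay * (2 ** attempt))

# Simulate an endpoint that fails twice before succeeding.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

print(retry_with_backoff(flaky))  # succeeds on the third attempt
```

Real cloud SDKs (e.g., boto3) ship their own retry configuration; a hand-rolled version like this is mainly useful in glue scripts around CLIs and HTTP calls.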

Posted 1 day ago

Apply

0 years

0 Lacs

India

Remote

Source: LinkedIn

📊 Data Analyst – Remote | Step Into the Future of Analytics

Are you ready to dive into the world of data and make sense of the numbers that drive business decisions? Whether you're a curious learner or looking to kickstart a career in analytics, this internship is designed just for you!

📍 Location: Remote / Virtual
💼 Job Type: Internship (Unpaid)
🕒 Schedule: Flexible working hours

🌟 What's In It for You?
Join Skillfied Mentor, where learning meets action. As a Data Analyst, you'll work on real projects, learn essential tools, and develop job-ready skills that will help shape your career in analytics.
🔹 Work on real-world datasets and business problems
🔹 Learn tools like Excel, SQL, Power BI/Tableau, and Python (optional)
🔹 Build core skills in data cleaning, visualization, and basic statistics
🔹 Collaborate with a remote team and gain valuable teamwork experience

🔍 You're a Great Fit If You:
✅ Enjoy working with numbers, data, and patterns
✅ Are eager to explore data tools like Excel, SQL, or Tableau
✅ Have no prior experience but are willing to learn
✅ Can contribute 5–7 hours per week (flexible)
✅ Work well independently in a virtual environment

🎁 What You'll Gain:
📜 Certificate of Completion
📂 Real Portfolio Projects
🧠 Practical Skills & Hands-on Experience

⏳ Last Date to Apply: 20th June 2025

Whether you're a student, fresher, or looking to switch careers, this internship offers a strong start as a Data Analyst.

👉 Apply now and begin your journey with Skillfied Mentor.

Posted 1 day ago

Apply

10.0 years

0 Lacs

India

Remote

Source: LinkedIn

Job Title: Senior Backend Engineer – Python & Microservices
Location: Remote
Experience Required: 8–10+ years

🚀 About the Role:
We're looking for a Senior Backend Engineer (Python & Microservices) to join a high-impact engineering team focused on building scalable internal tools and enterprise SaaS platforms. You'll play a key role in designing cloud-native services, leading microservices architecture, and collaborating closely with cross-functional teams in a fully remote environment.

🔧 Responsibilities:
- Design and build scalable microservices using Python (Flask, FastAPI, Django)
- Develop production-grade RESTful APIs and background job systems
- Architect modular systems and drive microservice decomposition
- Manage SQL and NoSQL data models (PostgreSQL, MongoDB, DynamoDB, ClickHouse)
- Implement distributed data pipelines using Kafka, RabbitMQ, and SQS
- Apply best practices in rate limiting, security, performance optimisation, logging, and observability (Grafana, Datadog, CloudWatch)
- Deploy services in cloud environments (AWS preferred; Azure/GCP acceptable) using Docker, Kubernetes, and EKS
- Contribute to CI/CD and Infrastructure as Code (Jenkins, Terraform, GitHub Actions)

✅ Requirements:
- 8–10+ years of hands-on backend development experience
- Strong proficiency in Python (Flask, FastAPI, Django, etc.)
- Solid experience with microservices and containerised environments (Docker, Kubernetes, EKS)
- Expertise in REST API design, rate limiting, and performance tuning
- Familiarity with SQL and NoSQL (PostgreSQL, MongoDB, DynamoDB, ClickHouse)
- Experience with cloud platforms (AWS preferred; Azure/GCP also considered)
- CI/CD and IaC knowledge (GitHub Actions, Jenkins, Terraform)
- Exposure to distributed systems and event-based architectures (Kafka, SQS)
- Excellent written and verbal communication skills

🎯 Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science or a related field
- Certifications in cloud architecture or system design
- Experience integrating with tools like Zendesk, Openfire, or similar chat/ticketing platforms
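Rate limiting, which this listing names among the requirements, is commonly implemented as a token bucket; a self-contained sketch (the class name and parameters are illustrative, not a specific product's API):

```python
import time

class TokenBucket:
    """Allow up to `rate` requests per second, with bursts up to `capacity`."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=5, capacity=2)
results = [bucket.allow() for _ in range(4)]
print(results)  # a burst of 2 is allowed, then requests are throttled
```

In a real service this state would live per client key (often in Redis) and be checked in API middleware; the arithmetic is the same.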

Posted 1 day ago

Apply

0 years

0 Lacs

India

Remote

Source: LinkedIn

πŸ” Job Title: Data Analyst Intern 🏒 Company: TechKnowledgeHub.org 🌐 Location: Remote (India) πŸ•’ Internship Type: Part-time / Full-time πŸ“… Duration: 3 to 6 Months πŸ’Ό Stipend: Performance-based (Details discussed during interview) About Us: TechKnowledgeHub.org is a leading online learning platform empowering students and professionals with industry-relevant technical skills. We offer instructor-led training, real-world projects, and career-focused mentorship in the fields of Data Science, AI/ML, Cloud Computing, and more. About the Internship: We are seeking a Data Analyst Intern who is passionate about data and wants to gain real-world experience in data analytics and visualization. You will work with our data team to support live projects, analyze training trends, and contribute to strategic decisions. Key Responsibilities: Collect, clean, and analyze data from multiple sources Prepare dashboards and visual reports using Excel, Power BI, or Google Data Studio Perform exploratory data analysis (EDA) using Python, SQL, or R Identify patterns and insights to support business and academic decisions Assist in monitoring learner progress and engagement metrics Document findings and present actionable recommendations Qualifications: Students pursuing a Bachelor's/Master’s degree in Data Science, Computer Science, Statistics, or related fields Familiarity with Excel, SQL, and Python or R Understanding of basic statistics and data visualization concepts Strong communication, analytical, and problem-solving skills Eagerness to learn and grow in the field of analytics What We Offer: Remote internship with flexible working hours Hands-on experience with real-world projects 1:1 mentorship from industry experts Internship Certificate and LinkedIn recommendation (for top performers) Opportunity for Pre-Placement Offer (PPO) for exceptional interns How to Apply: Submit your resume and a brief statement of interest via: πŸ“§ resume@techknowledgehub.org Subject Line: 
Application – Data Analyst Intern – [Your Name] Note: This role is open to applicants located in India. TechKnowledgeHub.org is an equal opportunity employer and encourages diversity and inclusion in the workplace. Show more Show less

Posted 1 day ago

Apply

0 years

0 Lacs

India

On-site

Source: LinkedIn

- Proficiency in ML frameworks (e.g., TensorFlow, PyTorch)
- Experience with natural language processing (NLP) and large language models (LLMs)
- Understanding of generative models and their applications
- Proficiency in programming languages such as Python, Go, or Java
- Experience in developing APIs and integrating AI models into existing systems
- Familiarity with containerization tools (e.g., Docker, Kubernetes)
- Experience with CI/CD pipelines, databases or data lakes, and/or real-time data processing frameworks

Posted 1 day ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Source: LinkedIn

Role Description

Role Proficiency: Acts under the guidance of a Lead II/Architect, understands customer requirements, and translates them into designs for new DevOps (CI/CD) components. Capable of managing at least one Agile team.

Outcomes:
- Interpret the DevOps tool/feature/component design and develop/support it in accordance with specifications
- Adapt existing DevOps solutions and create new DevOps solutions for new contexts
- Code, debug, test, document, and communicate DevOps development stages and the status of development/support issues
- Select appropriate technical options for development, such as reusing, improving, or reconfiguring existing components
- Optimise the efficiency, cost, and quality of DevOps processes, tools, and technology development
- Validate results with user representatives; integrate and commission the overall solution
- Help engineers troubleshoot issues that are novel/complex and not covered by SOPs
- Design, install, configure, and troubleshoot CI/CD pipelines and software
- Automate infrastructure provisioning on cloud/on-premises with the guidance of architects
- Provide guidance to DevOps engineers so that they can support existing components
- Work with diverse teams using Agile methodologies
- Facilitate cost-saving measures through automation
- Mentor A1 and A2 resources
- Participate in the team's code reviews

Measures of Outcomes:
- Quality of deliverables
- Error rate/completion rate at various stages of the SDLC/PDLC
- Number of components reused
- Number of domain/technology/product certifications obtained
- SLA for onboarding and supporting users and tickets

Outputs Expected:
- Automated components: deliver components that automate installation/configuration of software/tools on-premises and in the cloud, and components that automate parts of the build/deploy for applications
- Configured components: configure CI/CD pipelines that can be used by application development/support teams
- Scripts: develop/support scripts (e.g., PowerShell/shell/Python) that automate installation, configuration, build, and deployment tasks
- Onboard users: onboard and extend existing tools to new app dev/support teams
- Mentoring: mentor and provide guidance to peers
- Stakeholder management: guide the team in preparing status updates, keep management updated on status, and share status reports with senior stakeholders
- Training/SOPs: create training plans/SOPs to help DevOps engineers with DevOps activities and with onboarding users
- Process efficiency/effectiveness: measure the efficiency and effectiveness of current processes and change them to make them more efficient and effective

Skill Examples:
- Design, installation, configuration, and troubleshooting of CI/CD pipelines and software using Jenkins/Bamboo/Ansible/Puppet/Chef/PowerShell/Docker/Kubernetes
- Integrating with code quality/test analysis tools such as SonarQube/Cobertura/Clover
- Integrating build/deploy pipelines with test automation tools such as Selenium/JUnit/NUnit
- Scripting skills (Python/Linux shell/Perl/Groovy/PowerShell)
- Infrastructure automation (Ansible/Puppet/Chef/PowerShell)
- Repository management/migration automation: Git/Bitbucket/GitHub/ClearCase
- Build automation scripts: Maven/Ant
- Artefact repository management: Nexus/Artifactory
- Dashboard management and automation: ELK/Splunk
- Configuration of cloud infrastructure (AWS/Azure/Google)
- Migration of applications from on-premises to cloud infrastructure
- Azure DevOps/ARM (Azure Resource Manager)/DSC (Desired State Configuration); strong debugging skills in C# and .NET
- Setting up and managing Jira projects and Git/Bitbucket repositories
- Containerization tools such as Docker/Kubernetes

Knowledge Examples:
- Installation/config/build/deploy processes and tools
- IaaS cloud providers (AWS/Azure/Google, etc.) and their tool sets
- The application development lifecycle
- Quality assurance processes
- Quality automation processes and tools
- Multiple tool stacks, not just one
- Build branching/merging
- Containerization
- Security policies and tools
- Agile methodologies

Additional Comments:
- Experience preferred: 5+ years
- Languages: must have expert knowledge of either Go or Java and some knowledge of two others among Go, Java, Python, and C programming/basic Golang
- Infra, brokers: must have some experience, and preferably mastery, in at least one product. We use RabbitMQ and MQTT (Mosquitto). Experience with edge deployments of brokers is preferred, because the design perspective differs for persistence, hardware, and telemetry
- Linux shell/scripting
- Docker
- Kubernetes (k8s): edge-deployment experience preferred; must have some mastery in this area or in Docker
- K3s (nice to have)
- Tooling: GitLab CI/CD automation; dashboard building in any system, i.e., someone who can take raw data and make something presentable and usable for production support
- Nice to have: Ansible, Terraform

Responsibilities:
- KTLO activities for existing RabbitMQ and MQTT instances, including annual PCI, patching and upgrades, monitoring library upgrades of applications, production support, etc.
- Project work for RabbitMQ and MQTT instances, including library enhancements in multiple languages and security enhancements (currently setting up the hardened cluster with all requested security changes), plus telemetry, monitoring, dashboarding, and reporting

Skills: Java, DevOps, RabbitMQ
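The "scripts that automate installation/configuration" outputs described above often reduce to rendering environment-specific config files from a template; a minimal standard-library sketch (the hostnames, ports, and environment names are invented for illustration):

```python
from string import Template

# A broker config template with environment-specific placeholders.
CONFIG_TEMPLATE = Template(
    "broker_host=$host\n"
    "broker_port=$port\n"
    "tls_enabled=$tls\n"
)

def render_config(env: str) -> str:
    """Render a broker config for a named environment (hypothetical values)."""
    envs = {
        "dev":  {"host": "localhost",      "port": 5672, "tls": "false"},
        "prod": {"host": "mq.example.com", "port": 5671, "tls": "true"},
    }
    return CONFIG_TEMPLATE.substitute(envs[env])

print(render_config("prod"))
```

Production setups would typically use Ansible or Jinja2 templates for the same job; the pattern of one template plus a per-environment mapping is identical.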

Posted 1 day ago

Apply

10.0 years

0 Lacs

Kochi, Kerala, India

On-site

Source: LinkedIn

Introduction
We are looking for candidates with 10+ years of experience in a data architect role.

Responsibilities include:
- Design and implement scalable, secure, and cost-effective data architectures using GCP
- Lead the design and development of data pipelines with BigQuery, Dataflow, and Cloud Storage
- Architect and implement data lakes, data warehouses, and real-time data processing solutions on GCP
- Ensure data architecture aligns with business goals, governance, and compliance requirements
- Collaborate with stakeholders to define data strategy and roadmap
- Design and deploy BigQuery solutions for optimized performance and cost efficiency
- Build and maintain ETL/ELT pipelines for large-scale data processing
- Leverage Cloud Pub/Sub, Dataflow, and Cloud Functions for real-time data integration
- Implement best practices for data security, privacy, and compliance in cloud environments
- Integrate machine learning workflows with data pipelines and analytics tools
- Define data governance frameworks and manage data lineage
- Lead data modeling efforts to ensure consistency, accuracy, and performance across systems
- Optimize cloud infrastructure for scalability, performance, and reliability
- Mentor junior team members and ensure adherence to architectural standards
- Collaborate with DevOps teams to implement Infrastructure as Code (Terraform, Cloud Deployment Manager)
- Ensure high availability and disaster recovery solutions are built into data systems
- Conduct technical reviews, audits, and performance tuning for data solutions
- Design solutions for multi-region and multi-cloud data architecture
- Stay updated on emerging technologies and trends in data engineering and GCP
- Drive innovation in data architecture, recommending new tools and services on GCP

Certifications: Google Cloud certification is preferred

Primary Skills:
- 7+ years of experience in data architecture, with at least 3 years in GCP environments
- Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services
- Strong experience in data warehousing, data lakes, and real-time data pipelines
- Proficiency in SQL, Python, or other data processing languages
- Experience with cloud security, data governance, and compliance frameworks
- Strong problem-solving skills and ability to architect solutions for complex data environments
- Google Cloud certification (Professional Data Engineer, Professional Cloud Architect) preferred
- Leadership experience and ability to mentor technical teams
- Excellent communication and collaboration skills

Posted 1 day ago

Apply

10.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Source: LinkedIn

Introduction
We are looking for candidates with 10+ years of experience in a data architect role.

Responsibilities include:
- Design and implement scalable, secure, and cost-effective data architectures using GCP
- Lead the design and development of data pipelines with BigQuery, Dataflow, and Cloud Storage
- Architect and implement data lakes, data warehouses, and real-time data processing solutions on GCP
- Ensure data architecture aligns with business goals, governance, and compliance requirements
- Collaborate with stakeholders to define data strategy and roadmap
- Design and deploy BigQuery solutions for optimized performance and cost efficiency
- Build and maintain ETL/ELT pipelines for large-scale data processing
- Leverage Cloud Pub/Sub, Dataflow, and Cloud Functions for real-time data integration
- Implement best practices for data security, privacy, and compliance in cloud environments
- Integrate machine learning workflows with data pipelines and analytics tools
- Define data governance frameworks and manage data lineage
- Lead data modeling efforts to ensure consistency, accuracy, and performance across systems
- Optimize cloud infrastructure for scalability, performance, and reliability
- Mentor junior team members and ensure adherence to architectural standards
- Collaborate with DevOps teams to implement Infrastructure as Code (Terraform, Cloud Deployment Manager)
- Ensure high availability and disaster recovery solutions are built into data systems
- Conduct technical reviews, audits, and performance tuning for data solutions
- Design solutions for multi-region and multi-cloud data architecture
- Stay updated on emerging technologies and trends in data engineering and GCP
- Drive innovation in data architecture, recommending new tools and services on GCP

Certifications: Google Cloud certification is preferred

Primary Skills:
- 7+ years of experience in data architecture, with at least 3 years in GCP environments
- Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services
- Strong experience in data warehousing, data lakes, and real-time data pipelines
- Proficiency in SQL, Python, or other data processing languages
- Experience with cloud security, data governance, and compliance frameworks
- Strong problem-solving skills and ability to architect solutions for complex data environments
- Google Cloud certification (Professional Data Engineer, Professional Cloud Architect) preferred
- Leadership experience and ability to mentor technical teams
- Excellent communication and collaboration skills

Posted 1 day ago

Apply

0 years

0 Lacs

Dehradun, Uttarakhand, India

On-site

Source: LinkedIn

University: Swiss Federal Institute for Forest, Snow and Landscape Research (WSL) / ETH Zurich
Country: Switzerland
Deadline: Not specified
Fields: Hydrology, Environmental Sciences, Climate Sciences, Atmospheric Sciences, Data Science

The WSL Institute for Snow and Avalanche Research SLF, affiliated with the Swiss Federal Institute for Forest, Snow and Landscape Research WSL and the ETH Domain, invites applications for a postdoctoral position in hydrological modelling and climate impact assessments of extreme events. The successful candidate will join the Hydrology and Climate Impacts in Mountain Regions group, working primarily on the "Rhône floods in a changing climate (RhoClim)" project, funded by the Swiss Canton of Valais, as well as a project of their own choosing related to hydrological extremes in mountain regions.

Key Responsibilities Include:
– Setting up and running a hydrological model for the Rhône river
– Quantifying future changes in flood hazard and associated uncertainties
– Investigating the hydro-meteorological drivers of changes in hydrological extremes
– Conducting and publishing research in scientific journals and presenting findings at international conferences
– Collaborating with researchers at the Institute for Atmospheric and Climate Science, ETH Zurich

Requirements:
– PhD degree in hydrology, environmental sciences, climate sciences, or a closely related field
– Strong programming skills (e.g., R or Python)
– Experience in statistics, hydrological modelling, and data science
– A strong publication record relative to career stage
– Research interests in climate change and hydrology
– Excellent oral and written communication skills in English; knowledge of a national language is an asset
– Motivation to work in an interdisciplinary and international environment, particularly in mountain regions

Application Procedure:
Interested candidates should submit a complete application, including a cover letter, CV, certificate and transcript of highest degree earned, and an example of scientific writing in English, via the WSL/SLF application portal: https://apply.refline.ch/273855/1743/pub/1/index.html. Applications submitted by email will not be considered. For further information, please contact Manuela Brunner at +41 81 417 03 42 or manuela.brunner@slf.ch. WSL is committed to diversity, inclusion, and gender equality, and fosters an open, inclusive work environment.

Posted 1 day ago

Apply

5.0 years

0 Lacs

Delhi, India

Remote

Source: LinkedIn

Work Level: Middle Management
Core: Time Management, Result Driven, Self Motivated
Leadership: Active Listening, Alignment to Organisation Goals
Industry Type: IT Services & Consulting
Function: Full Stack Developer
Key Skills: Python, HTML, CSS, JavaScript, React JS, Angular
Education: Graduate

Note: This is a requirement for one of the Workassist hiring partners.

About the Role:
We are seeking a highly motivated Full Stack Python Developer with 3–5 years of experience to join our remote engineering team. The ideal candidate is comfortable working across the backend and frontend, has a solid understanding of software development best practices, and thrives in a fast-paced environment.

Key Responsibilities:
- Design, develop, and maintain scalable web applications using Python frameworks (Django/Flask/FastAPI).
- Build responsive user interfaces using modern JavaScript frameworks (React, Vue.js, or Angular).
- Collaborate with product managers, designers, and other developers to deliver end-to-end solutions.
- Write clean, testable, and maintainable code.
- Integrate third-party APIs and work with RESTful services.
- Ensure optimal performance, quality, and responsiveness of applications.
- Participate in code reviews, sprint planning, and regular team meetings.
- Troubleshoot and debug issues across the stack.

Company Description:
Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000 recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job-seeking experience by leveraging technology and matching job seekers with the right employers.

For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: there are many more opportunities on the portal; depending on your skills, you can apply for those as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 1 day ago

Apply

2.0 years

0 Lacs

Mohali, Punjab

On-site

Source: Indeed

Company: Chicmic Studios
Job Role: Python Developer
Experience: 2+ Years

Job Description:
We are looking for a highly skilled and experienced Python developer to join our dynamic team. The ideal candidate will have a robust background in developing web applications using Django and Flask, with experience deploying and managing applications on AWS. Proficiency in Django REST Framework (DRF) and a solid understanding of machine learning concepts and their practical applications are essential.

Key Responsibilities:
- Develop and maintain web applications using the Django and Flask frameworks.
- Design and implement RESTful APIs using Django REST Framework (DRF).
- Deploy, manage, and optimize applications on AWS.
- Develop and maintain APIs for AI/ML models and integrate them into existing systems.
- Create and deploy scalable AI and ML models using Python.
- Ensure the scalability, performance, and reliability of applications.
- Write clean, maintainable, and efficient code following best practices.
- Perform code reviews and provide constructive feedback to peers.
- Troubleshoot and debug applications, identifying and fixing issues in a timely manner.
- Stay up to date with the latest industry trends and technologies to keep our applications current and competitive.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 2+ years of professional experience as a Python developer.
- Proficient in Python with a strong understanding of its ecosystem.
- Extensive experience with the Django and Flask frameworks.
- Hands-on experience with AWS services, including but not limited to EC2, S3, RDS, Lambda, and CloudFormation.
- Strong knowledge of Django REST Framework (DRF) for building APIs.
- Experience with machine learning libraries and frameworks such as scikit-learn, TensorFlow, or PyTorch.
- Solid understanding of SQL and NoSQL databases (e.g., PostgreSQL, MongoDB).
- Familiarity with front-end technologies (e.g., JavaScript, HTML, CSS) is a plus.
- Excellent problem-solving skills and the ability to work independently and as part of a team.
- Strong communication skills and the ability to articulate complex technical concepts to non-technical stakeholders.

Contact: 9875952836
Office Location: F273, Phase 8B Industrial Area, Mohali, Punjab
Job Type: Full-time
Location Type: In-person
Schedule: Day shift, Monday to Friday
Work Location: In person

Posted 1 day ago

Apply

3.0 years

0 Lacs

Mohali, Punjab

On-site

Source: Indeed

Company: Chicmic Studios
Job Role: Data Scientist
Experience Required: 3+ Years
Skills Required: Data Science, Python, Pandas, Matplotlib

Job Description:
We are seeking a Data Scientist with strong expertise in data analysis, machine learning, and visualization. The ideal candidate should be proficient in Python, Pandas, and Matplotlib, with experience in building and optimizing data-driven models. Some experience with Natural Language Processing (NLP) and Named Entity Recognition (NER) models would be a plus.

Roles & Duties:
- Analyze and process large datasets using Python and Pandas.
- Develop and optimize machine learning models for predictive analytics.
- Create data visualizations using Matplotlib and Seaborn to support decision-making.
- Perform data cleaning, feature engineering, and statistical analysis.
- Work with structured and unstructured data to extract meaningful insights.
- Implement and fine-tune NER models for specific use cases (if required).
- Collaborate with cross-functional teams to drive data-driven solutions.

Required Skills & Qualifications:
- Strong proficiency in Python and data science libraries (Pandas, NumPy, scikit-learn, etc.).
- Experience in data analysis, statistical modeling, and machine learning.
- Hands-on expertise in data visualization using Matplotlib and Seaborn.
- Understanding of SQL and database querying.
- Familiarity with NLP techniques and NER models is a plus.
- Strong problem-solving and analytical skills.

Contact: 9875952836
Office Address: F273, Phase 8B Industrial Area, Mohali, Punjab
Job Type: Full-time
Schedule: Day shift, Monday to Friday
Work Location: In person
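The data cleaning and statistical analysis duties listed above can be illustrated with a tiny standard-library example; in practice the role calls for Pandas, and the sample records here are invented:

```python
import csv
import io
import statistics

# Illustrative raw data: one missing salary and inconsistent city casing.
RAW = """name,city,salary
Asha,delhi,50000
Ravi,Mumbai,
Meera,DELHI,65000
"""

def clean_rows(text):
    """Parse CSV text, normalise city names, and drop rows with missing salary."""
    rows = []
    for row in csv.DictReader(io.StringIO(text)):
        if not row["salary"]:
            continue  # drop incomplete records
        row["city"] = row["city"].title()
        row["salary"] = int(row["salary"])
        rows.append(row)
    return rows

rows = clean_rows(RAW)
salaries = [r["salary"] for r in rows]
print(len(rows), "rows kept; mean salary:", statistics.mean(salaries))
```

With Pandas the same steps would be `pd.read_csv`, `dropna`, and `str.title()`, but the logic (drop incomplete records, normalise categories, then summarise) is identical.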

Posted 1 day ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Title - S&C Global Network - AI - CDP - Marketing Analytics - Analyst Management Level: 11-Analyst Location: Bengaluru, BDC7C Must-have skills: Data Analytics Good-to-have skills: Ability to leverage design thinking, business process optimization, and stakeholder management skills. Job Summary: This role involves driving strategic initiatives, managing business transformations, and leveraging industry expertise to create value-driven solutions. Roles & Responsibilities: Provide strategic advisory services, conduct market research, and develop data-driven recommendations to enhance business performance. WHAT'S IN IT FOR YOU? As part of our Analytics practice, you will join a worldwide network of more than 20,000 smart and driven colleagues experienced in leading AI/ML/statistical tools, methods and applications. From data to analytics and insights to actions, our forward-thinking consultants provide analytically informed, issue-based insights at scale to help our clients improve outcomes and achieve high performance. What You Would Do In This Role A Consultant/Manager for Customer Data Platforms serves as the day-to-day marketing technology point of contact and helps our clients get value out of their investment in a Customer Data Platform (CDP) by developing a strategic roadmap focused on personalized activation. You will be working with a multidisciplinary team of Solution Architects, Data Engineers, Data Scientists, and Digital Marketers. Key Duties and Responsibilities: Be a platform expert in one or more leading CDP solutions, with developer-level expertise in Lytics, Segment, Adobe Experience Platform, Amperity, Tealium, Treasure Data, etc., including custom-built CDPs. Deep developer-level expertise in real-time event tracking for web analytics, e.g., Google Tag Manager, Adobe Launch, etc.
Provide deep domain expertise in our client's business and broad knowledge of digital marketing, together with a Marketing Strategist. Deep expert-level knowledge of GA360/GA4, Adobe Analytics, Google Ads, DV360, Campaign Manager, Facebook Ads Manager, The Trade Desk, etc. Assess and audit the current state of a client's marketing technology stack (MarTech), including data infrastructure, ad platforms and data security policies, together with a Solutions Architect. Conduct stakeholder interviews and gather business requirements. Translate business requirements into BRDs and CDP customer analytics use cases, and structure the technical solution. Prioritize CDP use cases together with the client. Create a strategic CDP roadmap focused on data-driven marketing activation. Work with the Solution Architect to strategize, architect, and document a scalable CDP implementation, tailored to the client's needs. Provide hands-on support and platform training for our clients. Data processing, data engineering and data schema/model expertise for CDPs, to work on data models, unification logic, etc. Work with Business Analysts, Data Architects, Technical Architects and DBAs to achieve project objectives - delivery dates, quality objectives, etc. Business intelligence expertise for insights and actionable recommendations. Project management expertise for sprint planning. Professional & Technical Skills: Relevant experience in the required domain. Strong analytical, problem-solving, and communication skills. Ability to work in a fast-paced, dynamic environment. Strong understanding of data governance and compliance (e.g., PII, PHI, GDPR, CCPA). Experience with analytics tools like Google Analytics or Adobe Analytics is a plus. Experience with A/B testing tools is a plus. Must have programming experience in PySpark, Python and shell scripts. RDBMS, T-SQL and NoSQL experience is a must. Manage large volumes of structured and unstructured data; extract and clean data to make it amenable for analysis.
Experience in deployment and operationalizing the code is an added advantage. Experience with source control systems such as Git and Bitbucket, and with Jenkins build and continuous integration tools. Proficient in Excel, MS Word, PowerPoint, etc. Technical Skills: Experience with any CDP platform, e.g., Lytics CDP platform developer, and/or Segment CDP platform developer, and/or Adobe Experience Platform (Real-Time CDP) developer, and/or custom CDP developer on any cloud. GA4/GA360, and/or Adobe Analytics. Google Tag Manager, and/or Adobe Launch, and/or any tag manager tool. Google Ads, DV360, Campaign Manager, Facebook Ads Manager, The Trade Desk, etc. Deep cloud experience (GCP, AWS, Azure). Advanced-level Python, SQL and shell scripting experience. Data migration, DevOps, MLOps and Terraform scripting. Soft Skills: Strong problem-solving skills. Good team player. Attention to detail. Good communication skills. Additional Information: Opportunity to work on innovative projects. Career growth and leadership exposure. About Our Company | Accenture Experience: 3-5 years Educational Qualification: Any degree
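The "unification logic" this CDP posting mentions is, at its core, identity resolution: merging customer records that share an identifier into one profile. A minimal sketch with invented records and a simple union-find, assuming exact-match rules only (production CDPs use configurable fuzzy match rules):

```python
# Merge customer records that share an email or phone into one profile.
# All data here is invented for illustration.
records = [
    {"id": 1, "email": "jo@example.com", "phone": None},
    {"id": 2, "email": "jo@example.com", "phone": "555-0101"},
    {"id": 3, "email": "jo.work@example.com", "phone": "555-0101"},
    {"id": 4, "email": "sam@example.com", "phone": "555-0202"},
]

parent = {r["id"]: r["id"] for r in records}

def find(x):
    # Union-find root lookup with path compression.
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def union(a, b):
    parent[find(a)] = find(b)

# Link records that share any identifier value.
seen = {}
for r in records:
    for key in ("email", "phone"):
        value = r[key]
        if value is None:
            continue
        if (key, value) in seen:
            union(r["id"], seen[(key, value)])
        else:
            seen[(key, value)] = r["id"]

# Group record ids by their unified profile root.
profiles = {}
for r in records:
    profiles.setdefault(find(r["id"]), []).append(r["id"])

print(sorted(sorted(v) for v in profiles.values()))
```

Records 1 and 2 share an email, and 2 and 3 share a phone, so all three collapse into one profile while record 4 stays separate; that transitive chaining is exactly why union-find (rather than pairwise matching) is the usual data structure here.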

Posted 1 day ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Description Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Consultant Specialist. In this role, you will: Develop and maintain scalable web applications using Java, Spring Boot and Hibernate. Design and implement RESTful APIs and microservices. Develop front-end components using Angular or React. Work with databases like MySQL for data storage and management. Ensure application security, performance, and scalability. Troubleshoot, debug, and optimize application performance. Create functional specifications, use cases, and process flows. Strong analytical and problem-solving abilities. Requirements To be successful in this role, you must meet the following requirements: Back end: Java, Spring Boot, Maven, Hibernate, microservices. Front end: Angular, React, JavaScript, HTML, CSS, JSP. Database: MySQL/PostgreSQL/Oracle. Others: RESTful APIs, Agile methodologies, testing frameworks. Basics of the Python programming language. Debugging Python code. Good knowledge of Docker and Kubernetes. Good knowledge of test automation. Good knowledge of Jenkins pipelines, JIRA and Confluence. Gather and document business requirements through stakeholder meetings, interviews, and workshops. Experience in requirement gathering, process modeling and documentation. Proficiency in tools such as JIRA, Confluence, Power BI or SQL.
ESSENTIAL SKILLS (non-technical) Excellent communication skills Ability to explain complex ideas Ability to work as part of a team Ability to work in a team that is located across multiple regions / time zones Willingness to adapt and learn new things Willingness to take ownership of tasks Strong collaboration skills and experience working in diverse, global teams. Excellent problem-solving skills and ability to work independently and as part of a team. You'll achieve more when you join HSBC. www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSBC Software Development India
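The requirements above include "Debugging Python code". One lightweight, everyday technique is wrapping a suspect function with the standard-library `logging` module so its inputs and outputs are visible without a debugger attached. The function and values below are invented for illustration:

```python
import logging

logging.basicConfig(level=logging.DEBUG, format="%(levelname)s %(message)s")
log = logging.getLogger(__name__)

def average(values):
    # Defensive guard: an empty list would raise ZeroDivisionError,
    # a classic bug this check makes explicit and logs.
    if not values:
        log.warning("average() called with no values; returning 0.0")
        return 0.0
    result = sum(values) / len(values)
    log.debug("average(%s) = %s", values, result)
    return result

print(average([10, 20, 30]))  # prints 20.0
print(average([]))            # prints 0.0, with a warning logged
```

For interactive sessions, the same spot is where `breakpoint()` (Python 3.7+) would drop into `pdb`.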

Posted 1 day ago

Apply

9.0 years

0 Lacs

Pune, Maharashtra, India

On-site


About Client: Our Client is a global IT services company headquartered in Southborough, Massachusetts, USA. Founded in 1996, with revenue of $1.8B and 35,000+ associates worldwide, it is a digital engineering and IT services company that helps clients modernize their technology infrastructure, adopt cloud and AI solutions, and accelerate innovation. It partners with major firms in banking, healthcare, telecom, and media. Our Client is known for combining deep industry expertise with agile development practices, enabling scalable and cost-effective digital transformation. The company operates in over 50 locations across more than 25 countries, has delivery centers in Asia, Europe, and North America, and is backed by Baring Private Equity Asia. Job Title: Python with React Developer Experience Level: 9-16 years Job Location: PAN India Budget: 1,80,000 per month Job Type: Contract Work Mode: Hybrid Notice Period: Immediate joiners Client: CMMI Level 5 Responsibilities Develop and maintain web applications using React for the front end and Python for the back end. Design and implement user interfaces and components. Write clean, efficient, and well-documented code. Integrate front-end components with back-end APIs. Troubleshoot and debug issues. Optimize applications for performance and scalability. Collaborate with other developers, designers, and stakeholders. Participate in code reviews. Stay up-to-date with emerging technologies and trends. Manage multiple projects and tasks effectively. Requirements Strong proficiency in Python and experience with Python web frameworks such as Flask or Django. Solid understanding of React and its core principles. Experience with front-end technologies such as HTML, CSS, and JavaScript. Knowledge of RESTful APIs and how to integrate them. Familiarity with testing and debugging tools. Experience with version control systems such as Git. Strong problem-solving and analytical skills.
Excellent communication and collaboration skills. Ability to work independently and as part of a team. Good time management skills. Experience with database systems (e.g., PostgreSQL, MySQL) is a plus. Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud) is a plus. Knowledge of CI/CD pipelines is a plus. Experience with Agile development methodologies is a plus. If interested, please share your updated resume with sridhar.g@people-prime.com

Posted 1 day ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Description Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Consultant Specialist. In this role, you will: Develop and maintain scalable web applications using Java, Spring Boot and Hibernate. Design and implement RESTful APIs and microservices. Develop front-end components using Angular or React. Work with databases like MySQL for data storage and management. Ensure application security, performance, and scalability. Troubleshoot, debug, and optimize application performance. Create functional specifications, use cases, and process flows. Strong analytical and problem-solving abilities. Requirements To be successful in this role, you must meet the following requirements: Back end: Java, Spring Boot, Maven, Hibernate, microservices. Front end: Angular, React, JavaScript, HTML, CSS, JSP. Database: MySQL/PostgreSQL/Oracle. Others: RESTful APIs, Agile methodologies, testing frameworks. Basics of the Python programming language. Debugging Python code. Good knowledge of Docker and Kubernetes. Good knowledge of test automation. Good knowledge of Jenkins pipelines, JIRA and Confluence. Gather and document business requirements through stakeholder meetings, interviews, and workshops. Experience in requirement gathering, process modeling and documentation. Proficiency in tools such as JIRA, Confluence, Power BI or SQL.
ESSENTIAL SKILLS (non-technical) Excellent communication skills Ability to explain complex ideas Ability to work as part of a team Ability to work in a team that is located across multiple regions / time zones Willingness to adapt and learn new things Willingness to take ownership of tasks Strong collaboration skills and experience working in diverse, global teams. Excellent problem-solving skills and ability to work independently and as part of a team. You'll achieve more when you join HSBC. www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSBC Software Development India

Posted 1 day ago

Apply

0.0 - 2.0 years

0 Lacs

Mumbai, Maharashtra

On-site


Summary Bitkraft Technologies LLP is looking for a React Native Developer to join our software engineering team. You will be working on the ReactJS framework for creating web projects for our custom services business. As a React Native Developer, you should be comfortable with other JavaScript-related frontend frameworks and third-party libraries. It is essential that you are passionate about good user experience - a keen eye for quality visual design would be a plus! If you love solving problems, are a team player and want to work in a fast-paced environment with core technical and business challenges, we would like to meet you. Essential Skills Experience working with the below technologies at an intermediate to expert level: JavaScript, HTML5 and CSS React Native Other Essential Skills / Requirements Great attention to detail Experience in working in Agile projects - with familiarity with tools such as Jira Ready to work on technologies, platforms and frameworks as required on future projects Strong work ethic and commitment to meet deadlines and support team members in meeting goals Be flexible with working across time zones with overseas customers if required Desirable Skills Frontend - Knowledge of at least one of the following: Angular Vue.js Backend - Knowledge of at least one of the following: Node.js Python (Django/Flask/Jupyter Notebook) PHP (Yii2/WordPress/Magento) Databases - Knowledge of at least one of the following: MySQL PostgreSQL MongoDB Graph Databases Oracle Cloud Infrastructure - Knowledge of at least one of the following: AWS Azure Google Cloud / Firebase Mobile Technologies - Knowledge of at least one of the following: Native Android Native iOS Hybrid Mobile App Development - Ionic/Flutter/React Native Key Responsibilities Gathering & understanding client requirements and providing technical solutions to business needs Ability to resolve business problems through technology Deliver work within timelines and quality specifications Strive to develop reusable code
Respond to technical queries from clients, management and team members Evaluating existing technical architecture and recommending improvements Work with development teams and product managers to ideate software solutions Design client-side and server-side architecture Perform unit testing and scenario-based testing to build robust systems Troubleshoot, debug and update existing applications Create security and data protection mechanisms to ensure application security Create technical documentation, flow diagrams, use-case diagrams and charts as required Communicate with team members and clients for efficient project functioning Staying abreast of the latest developments in web technologies and related programming languages/frameworks Experience - 2 to 5 years Job Location - Fort, Mumbai Why join Bitkraft? Your inputs and opinions are valued Exposure to the latest technologies Working directly with client teams International project experience You get to see the big picture on the project Fast-paced environment with quick project completions Manage your own time for efficient working A friendly and easy-going work environment About Bitkraft Technologies LLP Bitkraft Technologies LLP is an award-winning Software Engineering Consultancy focused on Enterprise Software Solutions, Mobile Apps Development, ML/AI Solution Engineering, Extended Reality, Managed Cloud Services and Technology Skill-sourcing, with an extraordinary track record. We are driven by technology and push the limits of what can be done to realise the business needs of our customers. Our team is committed to delivering products of the highest standards and we take pride in creating robust user-driven solutions that meet business needs. Bitkraft has clients across 10+ countries including the US, UK, UAE, Oman, Australia and India, to name a few.
To know more about Bitkraft visit our website bitkraft.co.in Job Types: Full-time, Permanent Pay: ₹500,000.00 - ₹1,200,000.00 per year Benefits: Paid time off Schedule: Day shift Experience: total work: 2 years (Required) React Native: 2 years (Required) Location: Mumbai, Maharashtra (Required) Work Location: In person

Posted 1 day ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Job Description: Position - Security Incident Responder Exp. - 4+ Years Location - Gurgaon (5 Days WFO) Apply Here: https://forms.gle/1PVR9KTHvRaeMBuj8 Snowbit is looking for an experienced Security Incident Responder to join our Managed Detection and Response (MDR) team. This role requires expertise in incident response, threat hunting, and forensic investigations, with a strong emphasis on cloud environments and Kubernetes. You will lead efforts to protect our customers from advanced cyber threats while contributing to the continuous improvement of Snowbit's methodologies, processes, and technology stack. What You'll Do: Leverage Snowbit's advanced MDR platform to lead large-scale incident response investigations and proactive threat-hunting initiatives. Conduct log analysis and cloud artifact reviews using EDR and similar tools, depending on availability, to support incident resolution and root-cause investigations. Investigate and respond to security incidents in containerized environments, with a specific focus on Kubernetes security and architecture. Research evolving cyberattack tactics, techniques, and procedures (TTPs) to strengthen customer defenses and codify insights for our services. Provide technical and executive briefings to customers, including recommendations to mitigate risk and enhance cybersecurity posture. Collaborate with internal teams, including engineering and research, to enhance Snowbit's MDR and incident response capabilities. Partner with customer teams (IT, DevOps, and Security) to ensure seamless integration and adoption of Snowbit's MDR services. Share expertise through presentations, research publications, and participation in the global cybersecurity community. Experience: 3-5 years in incident response and threat hunting, with strong experience in cloud security (AWS, Azure, GCP) and Kubernetes environments. Proven incident response experience in complex environments.
Technical Skills: Demonstrates strong expertise in understanding adversary tactics and techniques, translating them into actionable investigation tasks, conducting in-depth analysis, and accurately assessing the impact. Familiarity with attack vectors, malware families, and campaigns. Deep understanding of network architecture, protocols, and operating system internals (Windows, Linux, Unix). Expertise in Kubernetes security, including container orchestration, workload isolation, and cluster hardening. Experience securing Kubernetes infrastructure, runtime security, and security monitoring. Problem-Solving: Ability to work independently and collaboratively in dynamic, fast-paced environments. Communication: Excellent written and verbal communication skills to interact with technical and non-technical stakeholders. Preferred Skills: Scripting skills (e.g., Python, PowerShell) Experience with Red Team operations, penetration testing, or cyber operations. Hands-on knowledge of attack frameworks (e.g., MITRE ATT&CK, Metasploit, Cobalt Strike). Proficiency in host forensics, memory forensics, and malware analysis.
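The scripting skills this incident-responder posting asks for often come down to small log-triage utilities. A hedged sketch of brute-force detection over auth-style logs, with synthetic log lines and an invented failure threshold; a real hunt would read from an EDR export or SIEM query instead:

```python
import re
from collections import Counter

# Synthetic SSH-style log lines (invented for the example).
log_lines = [
    "Failed password for root from 203.0.113.9 port 52001",
    "Failed password for admin from 203.0.113.9 port 52002",
    "Accepted password for alice from 198.51.100.4 port 40022",
    "Failed password for root from 203.0.113.9 port 52003",
]

# Extract the source IP from each failed-login line.
pattern = re.compile(r"Failed password for \S+ from (\d+\.\d+\.\d+\.\d+)")

failures = Counter()
for line in log_lines:
    m = pattern.search(line)
    if m:
        failures[m.group(1)] += 1

# Flag sources at or above a hypothetical threshold of 3 failures.
suspicious = [ip for ip, n in failures.items() if n >= 3]
print(suspicious)
```

The same count-and-threshold pattern generalizes to failed cloud-console logins or Kubernetes audit events once the regex is swapped for the relevant field extraction.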

Posted 1 day ago

Apply

5.0 years

0 Lacs

Gurgaon, Haryana, India

On-site


What's the role? We are seeking a highly skilled and motivated Senior GIS Data Analyst/Engineer to join our innovative team in India. This role will leverage advanced expertise in GIS, data science, and programming to extract actionable insights from geospatial data, driving impactful business outcomes through cutting-edge visualization and analytical tools. Responsibilities Data Analysis and Management Conduct advanced spatial data analysis using GIS software (ArcGIS, QGIS) to derive meaningful insights. Manage, manipulate, and analyze large geospatial datasets to produce high-quality maps and actionable reports. Ensure data accuracy and integrity through rigorous quality control measures and regular audits. Programming and Automation Develop and implement Python scripts for data processing, analysis, and automation, with proficiency in SQL for querying and managing databases. Apply machine learning and AI techniques, including Generative AI, to enhance data accuracy and predictive capabilities in GIS applications. Visualization and Reporting Create compelling visualizations and interactive dashboards using Tableau, Power BI, Matplotlib, and Seaborn to communicate complex spatial insights effectively. Leverage advanced Excel for data manipulation and reporting. Develop and maintain high-quality maps to support stakeholder presentations and decision-making. Process Efficiency and Development Design and implement efficient data analysis workflows to optimize processing and analysis tasks. Translate process efficiency concepts into development strategies, achieving significant time and effort savings. Continuously evaluate and enhance workflows to improve performance and scalability. Tool and Application Development Develop custom GIS tools and applications to address diverse business requirements. Utilize FME (Feature Manipulation Engine) for advanced data transformation and integration tasks (knowledge preferred).
Look into scalable data storage, processing, and analytics. Collaboration and Support Collaborate with cross-functional teams to translate data insights into strategic business solutions. Provide technical support and training to team members on GIS tools, visualization platforms, and data analysis methodologies. Contribute to team and company objectives aligned with business goals. Continuous Learning Stay updated on the latest advancements in GIS, data science, machine learning, Generative AI, and visualization technologies. Who are you? Bachelor's or master's degree in Geography, GIS, Data Science, Computer Science, or a related field. 5+ years of experience in a GIS or data analysis role. Advanced proficiency in ArcGIS and QGIS software. Strong programming skills in Python and proficiency in SQL. Proven expertise in data visualization and reporting using Tableau, Power BI, Matplotlib, Seaborn, and advanced Excel. Hands-on experience with AWS services for data management and analytics. Familiarity with machine learning, AI, and Generative AI applications in GIS environments. Knowledge of FME (Feature Manipulation Engine) is an advantage. Exceptional analytical, problem-solving, and decision-making skills. Excellent communication, collaboration, and teamwork abilities. Ability to work independently, prioritize tasks, and manage multiple projects in a fast-paced environment. Job location: Gurgaon HERE is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, age, gender identity, sexual orientation, marital status, parental status, religion, sex, national origin, disability, veteran status, and other legally protected characteristics. Who are we? HERE Technologies is a location data and technology platform company. We empower our customers to achieve better outcomes – from helping a city manage its infrastructure or a business optimize its assets to guiding drivers to their destination safely.
At HERE we take it upon ourselves to be the change we wish to see. We create solutions that fuel innovation, provide opportunity and foster inclusion to improve people's lives. If you are inspired by an open world and driven to create positive change, join us. Learn more about us on our YouTube Channel.
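Much of the spatial analysis in this GIS posting happens inside ArcGIS or QGIS, but the "Python scripts for data processing" requirement often reduces to small geodesic utilities. A dependency-free sketch of the haversine great-circle distance, assuming a mean Earth radius of 6371 km; the sample coordinates are approximate, illustrative values:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    # Haversine formula: a is the squared half-chord length.
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Roughly Gurgaon to central Delhi (coordinates approximate).
print(round(haversine_km(28.4595, 77.0266, 28.6139, 77.2090), 1))
```

For projected data or sub-metre accuracy, GIS libraries (pyproj, shapely) do this properly on an ellipsoid; the sketch is only the spherical approximation.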

Posted 1 day ago

Apply

5.0 years

0 Lacs

Panaji, Goa

On-site


Education: Bachelor's or master's degree in Computer Science, Software Engineering, or a related field (or equivalent practical experience). About the Role We're creating an internal platform that turns data-heavy engineering workflows - currently spread across spreadsheets, PDFs, email, and third-party portals - into streamlined, AI-assisted services. You'll own large pieces of that build: bringing data in, automating analysis with domain-specific engines, integrating everyday business tools, and partnering with a data analyst to fine-tune custom language models. The work is hands-on and highly autonomous; you'll design, code, deploy, and iterate features that remove manual effort for our engineering and project-management teams. What You'll Do AI & LLM Workflows - prototype and deploy large-language-model services for document parsing, validation, and natural-language Q&A. Automation Services - build Python micro-services that convert unstructured project files into structured stores and trigger downstream calculation tools through their APIs. Enterprise Integrations - connect calendars, project-tracking portals, and document libraries via REST / Graph APIs and event streams. DevOps & Cloud - containerize workloads, write CI/CD pipelines, codify infrastructure (Terraform/CloudFormation) and keep runtime costs in check. Quality & Security - maintain tests, logging, RBAC, encryption, and safe-prompt patterns. Collaboration - document designs clearly, demo working proofs to stakeholders, and coach colleagues on AI-assisted development practices. You'll Need 5+ years of professional software-engineering experience, including 3+ years of Python. Proven track record shipping AI / NLP / LLM solutions (OpenAI, Azure OpenAI, Hugging Face, or similar). Practical DevOps skills: Docker, Git, CI/CD pipelines, and at least one major cloud platform. Experience integrating external SDKs or vendor APIs (engineering, GIS, or document-management domains preferred).
Strong written / verbal communication and the discipline to work independently from loosely defined requirements. Nice-to-Have Exposure to engineering or construction data (drawings, 3-D models, load calculations, etc.). Modern front-end skills (React / TypeScript) for dashboard or viewer components. Familiarity with Power Automate, Graph API, or comparable workflow tools. How We Work Autonomy + Ownership - plan your own sprints, defend technical trade-offs, own deliverables end-to-end. AI-Augmented Development - we encourage daily use of coding copilots and chat-based problem solving for speed and clarity. If you enjoy blending practical software engineering with cutting-edge AI tooling to eliminate repetitive work, we'd like to meet you. Job Types: Full-time, Permanent Pay: ₹80,000.00 - ₹90,000.00 per month Benefits: Health insurance Provident Fund Schedule: Day shift Monday to Friday Supplemental Pay: Yearly bonus Work Location: In person Application Deadline: 30/06/2025 Expected Start Date: 30/06/2025
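The "convert unstructured project files into structured stores" duty above has two halves: a parsing step (which this platform would delegate to an LLM service) and a normalization step into typed records. The sketch below fakes the parsing half with a regex over an invented document scrap, so the structured-store side is runnable without any model or API:

```python
import json
import re

# A scrap of unstructured project text (invented for the example);
# in the platform described above, an LLM would parse real documents,
# but a deterministic regex pass stands in for it here.
raw = """
Project: Bridge-42
Engineer: A. Rao
Load: 1200 kN
"""

# Pull "Key: value" pairs line by line.
fields = dict(re.findall(r"^(\w+):\s*(.+)$", raw, flags=re.MULTILINE))

# Normalise into a typed record ready for a database insert.
record = {
    "project": fields["Project"],
    "engineer": fields["Engineer"],
    "load_kn": float(fields["Load"].split()[0]),
}
print(json.dumps(record))
```

Keeping the normalization deterministic and separate from the model call is also what makes LLM outputs validatable: the parser can reject a malformed extraction before anything reaches the store.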

Posted 1 day ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site


🔍 We're Hiring: Data Analyst – Sales & Marketing Operations 🕘 Working Days: Monday to Friday (Morning Shift) 🚫 Note: No cabs/travel provided 💰 Salary: Up to ₹7.5 LPA (max. 35% hike on the last drawn CTC) About the Role We are seeking a highly analytical and detail-oriented Data Analyst to own the complete data management lifecycle and drive data-driven decision-making across Sales, Marketing, and Business teams. The ideal candidate will have a deep understanding of CRM systems, data hygiene practices, and marketing automation tools. Key Responsibilities: Own the end-to-end data lifecycle – hygiene, cleansing, validation, enrichment, and integration. Collaborate with Sales, Marketing, IT, and Conference teams to translate data into actionable insights. Ensure CRM data integrity in tools like HubSpot, Salesforce and Zoho. Define and manage SOPs for data acquisition and enrichment. Build reporting and dashboards using Tableau, Power BI, or Google Data Studio. Conduct market research and maintain high-quality lead databases. Use tools like LinkedIn Sales Navigator, ZoomInfo, Apollo.io and Hoovers for intelligence gathering. Automate data tasks using SQL, Python, or other scripting tools. Maintain compliance with data privacy regulations (e.g., GDPR, CCPA). Drive continuous improvement through performance monitoring and insight generation. Manage drip campaigns and segmentation logic in Mailchimp, Zoho Campaigns, or HubSpot. Who You Are: 5+ years of experience in data analytics, ideally in sales/marketing operations. Proficient in advanced Excel, SQL, and Tableau/Power BI; familiarity with HTML/CSS. Hands-on experience with HubSpot, Salesforce and Zoho CRM. Exposure to MySQL, with Python or R being a strong plus. Experienced in managing timelines and collaborating across teams. Strong problem-solving skills and a data-driven mindset. Knowledge of data governance and compliance best practices. Bonus: Exposure to machine learning or predictive analytics.
Why Join Us? 5-day work week – Monday to Friday (Fixed Sat-Sun off) Work with a collaborative and data-forward team Opportunity to lead cross-functional data initiatives Competitive compensation up to ₹7.5 LPA
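The hygiene-and-cleansing duties in this posting are typically automated with SQL. A small sketch using Python's built-in sqlite3 and an invented leads table, normalizing email case and keeping one row per address; a real setup would run equivalent statements against the CRM export or MySQL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE leads (id INTEGER PRIMARY KEY, email TEXT, name TEXT)")
conn.executemany(
    "INSERT INTO leads (email, name) VALUES (?, ?)",
    [
        ("amy@example.com", "Amy"),
        ("AMY@example.com", "Amy K."),  # duplicate differing only in case
        ("bob@example.com", "Bob"),
    ],
)

# Cleansing: normalise the email column in place.
conn.execute("UPDATE leads SET email = LOWER(TRIM(email))")

# Deduplication: keep only the earliest row per normalised address.
conn.execute(
    "DELETE FROM leads WHERE id NOT IN "
    "(SELECT MIN(id) FROM leads GROUP BY email)"
)

remaining = conn.execute("SELECT email FROM leads ORDER BY email").fetchall()
print(remaining)
```

The "keep MIN(id) per group" rule is a survivorship policy choice; marketing-ops teams often prefer most-recently-updated instead, which would swap the aggregate accordingly.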

Posted 1 day ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Req ID: 299670 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Systems Integration Analyst to join our team in Noida, Uttar Pradesh (IN-UP), India (IN). Position General Duties and Tasks: Participate in research, design, implementation, and optimization of machine learning models. Help AI product managers and business stakeholders understand the potential and limitations of AI when planning new products. Understanding of Revenue Cycle Management processes like claims filing and adjudication. Hands-on experience in Python. Build data ingest and data transformation platforms. Identify transfer learning opportunities and new training datasets. Build AI models from scratch and help product managers and stakeholders understand results. Analysing the ML algorithms that could be used to solve a given problem and ranking them by their success probability. Exploring and visualizing data to gain an understanding of it, then identifying differences in data distribution that could affect performance when deploying the model in the real world. Verifying data quality, and/or ensuring it via data cleaning. Supervising the data acquisition process if more data is needed. Defining validation strategies. Defining the pre-processing or feature engineering to be done on a given dataset. Training models and tuning their hyperparameters. Analysing the errors of the model and designing strategies to overcome them. Deploying models to production. Create APIs and help business customers put the results of your AI models into operation. JD Education: Bachelor's in computer science or similar; Master's preferred. Skills: Hands-on programming experience working on enterprise products. Demonstrated proficiency in multiple programming languages with a strong foundation in a statistical platform such as Python, R, SAS, or MATLAB.
Knowledge of deep learning, machine learning, and artificial intelligence
Experience in building AI models using classification and clustering algorithms
Expertise in visualizing and manipulating big datasets
Strong in MS SQL
Acumen to take a complex problem, break it down into workable pieces, and code a solution
Excellent verbal and written communication skills
Ability to work in and define a fast-paced, team-focused environment
Proven record of delivering and completing assigned projects and initiatives
Ability to deploy large-scale solutions to an enterprise estate
Strong interpersonal skills
Understanding of Revenue Cycle Management processes such as claims filing and adjudication is a plus

About NTT DATA

NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA is an equal opportunity employer and considers all applicants without regard to race, color, religion, citizenship, national origin, ancestry, age, sex, sexual orientation, gender identity, genetic information, physical or mental disability, veteran or marital status, or any other characteristic protected by law.
We are committed to creating a diverse and inclusive environment for all employees. If you need assistance or an accommodation due to a disability, please inform your recruiter so that we may connect you with the appropriate team.
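As a rough illustration of the classification work the duties above describe (building a model from data, then using it for prediction), the sketch below implements a minimal nearest-centroid classifier in plain Python. It is a simplified example only; the toy data, labels, and function names are hypothetical and not taken from this posting.

```python
# Minimal nearest-centroid classifier: a simple instance of the
# classification techniques the role covers. Toy data is hypothetical.
from collections import defaultdict
import math

def fit_centroids(points, labels):
    """Compute the mean point (centroid) of each class from 2-D training data."""
    sums = defaultdict(lambda: [0.0, 0.0, 0])  # per-label: [sum_x, sum_y, count]
    for (x, y), lbl in zip(points, labels):
        s = sums[lbl]
        s[0] += x
        s[1] += y
        s[2] += 1
    return {lbl: (s[0] / s[2], s[1] / s[2]) for lbl, s in sums.items()}

def predict(centroids, point):
    """Assign a point to the class whose centroid is nearest (Euclidean distance)."""
    return min(centroids, key=lambda lbl: math.dist(point, centroids[lbl]))

# Toy 2-D data: two well-separated clusters.
pts = [(0.0, 0.0), (1.0, 1.0), (9.0, 9.0), (10.0, 10.0)]
lbls = ["a", "a", "b", "b"]
cents = fit_centroids(pts, lbls)
print(predict(cents, (0.5, 0.2)))   # -> a
print(predict(cents, (9.5, 9.8)))   # -> b
```

In practice this kind of pipeline would be built on a statistical platform such as the ones the posting lists, but the fit/predict split shown here is the same shape most classification workflows follow.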

Posted 1 day ago
