6.0 years
60 - 65 Lacs
Nashik, Maharashtra, India
Remote
Experience: 6.00+ years
Salary: INR 6,000,000 - 6,500,000 / year (based on experience)
Expected Notice Period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (payroll and compliance to be managed by Crop.Photo)
(Note: This is a requirement for one of Uplers' clients, Crop.Photo.)

What do you need for this opportunity?
Must-have skills: MAM, app integration

Crop.Photo is looking for: Technical Lead for Evolphin AI-Driven MAM

At Evolphin, we build powerful media asset management solutions used by some of the world’s largest broadcasters, creative agencies, and global brands. Our flagship platform, Zoom, helps teams manage high-volume media workflows, from ingest to archive, with precision, performance, and AI-powered search. We’re now entering a major modernization phase, and we’re looking for an exceptional Technical Lead to own and drive the next-generation database layer powering Evolphin Zoom. This is a rare opportunity to take a critical backend system that serves high-throughput media operations and evolve it to meet the scale, speed, and intelligence today’s content teams demand.
What you’ll own
- Leading the re-architecture of Zoom’s database foundation with a focus on scalability, query performance, and vector-based search support
- Replacing or refactoring our current in-house object store and metadata database with a modern, high-performance elastic solution
- Collaborating closely with our core platform engineers and AI/search teams to ensure seamless integration and zero disruption to existing media workflows
- Designing an extensible system that supports object-style relationships across millions of assets, including LLM-generated digital asset summaries, time-coded video metadata, AI-generated tags, and semantic vectors
- Driving end-to-end implementation: schema design, migration tooling, performance benchmarking, and production rollout, all on aggressive timelines

Skills & Experience We Expect
We’re looking for candidates with 7–10 years of hands-on engineering experience, including 3+ years in a technical leadership role. Your experience should span the following core areas:

System Design & Architecture (3–4 yrs)
- Strong hands-on experience with the Java/JVM stack (including GC tuning) and with Python in production environments
- Led system-level design for scalable, modular AWS microservices architectures
- Designed high-throughput, low-latency media pipelines capable of scaling to billions of media records
- Familiar with multitenant SaaS patterns, service decomposition, and elastic scale-out/in models
- Deep understanding of infrastructure observability, failure handling, and graceful degradation

Database & Metadata Layer Design (3–5 yrs)
- Experience redesigning or implementing object-style metadata stores used in MAM/DAM systems
- Strong grasp of schema-less models for asset relationships, time-coded metadata, and versioned updates
- Practical experience with DynamoDB, Aurora, PostgreSQL, or similar high-scale databases
- Comfortable evaluating trade-offs between memory, query latency, and write throughput

Semantic Search & Vectors (1–3 yrs)
- Implemented vector search using systems like Weaviate, Pinecone, Qdrant, or Faiss
- Able to design hybrid (structured + semantic) search pipelines for similarity and natural-language use cases
- Experience tuning vector indexes for performance, memory footprint, and recall
- Familiar with the basics of embedding-generation pipelines and how they are used for semantic search and similarity-based retrieval
- Worked with MLOps teams to deploy ML inference services (e.g., FastAPI/Docker on GPU-based EC2 or SageMaker endpoints)
- Understands the limitations of recognition models (e.g., OCR, face/object detection, logo recognition), even if not directly building them

Media Asset Workflow (2–4 yrs)
- Deep familiarity with broadcast and OTT formats: MXF, IMF, DNxHD, ProRes, H.264, HEVC
- Understanding of proxy workflows in video post-production
- Experience with the digital asset lifecycle: ingest, AI metadata enrichment, media transformation, S3 cloud archiving
- Hands-on experience managing time-coded metadata (e.g., subtitles, AI tags, shot changes) in media archives

Cloud-Native Architecture (AWS) (3–5 yrs)
- Strong hands-on experience with ECS, Fargate, Lambda, S3, DynamoDB, Aurora, SQS, EventBridge
- Experience building serverless or service-based compute models for elastic scaling
- Familiarity with managing multi-region deployments, failover, and IAM configuration
- Built cloud-native CI/CD deployment pipelines with event-driven microservices and queue-based workflows

Frontend Collaboration & React App Integration (2–3 yrs)
- Worked closely with React-based frontend teams, especially on desktop-style web applications
- Familiar with component-based design systems, REST/GraphQL API integration, and optimizing media-heavy UI workflows
- Able to guide frontend teams on data modeling, caching, and efficient rendering of large asset libraries
- Experience with Electron for desktop apps

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
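The semantic vector search this role centres on can be illustrated with a minimal, self-contained sketch: ranking asset embeddings by cosine similarity against a query embedding. The asset IDs and vectors below are hypothetical toy data; production systems such as Weaviate or Faiss use approximate-nearest-neighbour indexes rather than this brute-force scan.

```python
import math

def cosine_similarity(a, b):
    # Dot product divided by the product of the vector norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def search(query_vec, index):
    # Rank every asset in the index by similarity to the query embedding.
    scored = [(asset_id, cosine_similarity(query_vec, vec))
              for asset_id, vec in index.items()]
    return sorted(scored, key=lambda s: s[1], reverse=True)

# Toy index: asset IDs mapped to (hypothetical) embedding vectors.
index = {
    "clip-001": [0.9, 0.1, 0.0],
    "clip-002": [0.1, 0.9, 0.2],
    "clip-003": [0.8, 0.2, 0.1],
}
results = search([1.0, 0.0, 0.0], index)
print(results[0][0])  # clip-001 is the closest match
```

A hybrid (structured + semantic) pipeline would typically first filter the index on structured metadata (format, date range, tags) and then apply this similarity ranking to the survivors.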
Posted 14 hours ago
6.0 years
60 - 65 Lacs
Kanpur, Uttar Pradesh, India
Remote
(Role description identical to the Crop.Photo Technical Lead listing above; only the listed location differs.)
Posted 14 hours ago
0 years
0 Lacs
India
Remote
Data Science Intern (Paid)
Company: WebBoost Solutions by UM
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship

About WebBoost Solutions by UM
WebBoost Solutions by UM provides aspiring professionals with hands-on experience in data science, offering real-world projects to develop and refine their analytical and machine learning skills for a successful career.

Responsibilities
✅ Collect, preprocess, and analyze large datasets.
✅ Develop predictive models and machine learning algorithms.
✅ Perform exploratory data analysis (EDA) to extract meaningful insights.
✅ Create data visualizations and dashboards for effective communication of findings.
✅ Collaborate with cross-functional teams to deliver data-driven solutions.

Requirements
🎓 Enrolled in or graduate of a program in Data Science, Computer Science, Statistics, or a related field.
🐍 Proficiency in Python or R for data analysis and modeling.
🧠 Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred).
📊 Familiarity with data visualization tools (Tableau, Power BI, or Matplotlib).
🧐 Strong analytical and problem-solving skills.
🗣 Excellent communication and teamwork abilities.

Stipend & Benefits
💰 Stipend: ₹7,500 - ₹15,000 (performance-based).
✔ Hands-on experience in data science projects.
✔ Certificate of Internship & Letter of Recommendation.
✔ Opportunity to build a strong portfolio of data science models and applications.
✔ Potential for full-time employment based on performance.

How to Apply
📩 Submit your resume and a cover letter with the subject line "Data Science Intern Application."
📅 Deadline: 19th June 2025

Equal Opportunity
WebBoost Solutions by UM is committed to fostering an inclusive and diverse environment and encourages applications from all backgrounds.
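As a flavour of the exploratory data analysis (EDA) this role mentions, here is a minimal sketch using only Python's standard library: summary statistics plus a simple outlier check. The page-view numbers are invented purely for illustration.

```python
import statistics

# Toy dataset: daily page views (hypothetical values for illustration).
page_views = [120, 135, 110, 150, 300, 125, 140]

mean = statistics.mean(page_views)
median = statistics.median(page_views)
stdev = statistics.stdev(page_views)

# A common first pass during EDA: flag values more than two sample
# standard deviations away from the mean as potential outliers.
outliers = [v for v in page_views if abs(v - mean) > 2 * stdev]

print(f"mean={mean:.1f}, median={median}, stdev={stdev:.1f}")
print("outliers:", outliers)
```

In practice this step would be done with pandas and visualized with Matplotlib or a dashboard tool, but the reasoning (central tendency, spread, anomalies) is the same.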
Posted 14 hours ago
0 years
0 Lacs
India
Remote
🧠 Data Science Intern – Remote | Explore the World of AI & Data

Are you fascinated by machine learning, data modeling, and real-world applications of AI? If you're ready to dive into the exciting world of data science, join Skillfied Mentor as a Data Science Intern and start building your future in tech.

📍 Location: Remote / Virtual
💼 Job Type: Internship (Unpaid)
🕒 Schedule: Flexible working hours

🌟 About the Internship:
As a Data Science Intern, you'll get hands-on exposure to real data problems, machine learning concepts, and practical projects. This internship is designed to give you experience that matters, even without prior industry background.
🔹 Work with real datasets to build and test models
🔹 Learn tools like Python, Pandas, NumPy, Scikit-Learn, and Jupyter Notebooks
🔹 Understand the basics of machine learning and data preprocessing
🔹 Collaborate with a remote team to solve business-related challenges
🔹 Apply statistics and coding to derive data-driven solutions

🔍 You’re a Great Fit If You:
✅ Have basic Python knowledge or are eager to learn
✅ Are curious about AI, data modeling, and machine learning
✅ Can dedicate 5–7 hours per week (flexibly)
✅ Are a self-learner and motivated to grow in the data science field
✅ Want to build a strong project portfolio with real use cases

🎁 What You’ll Gain:
📜 Certificate of Completion
📂 Real Projects to Showcase Your Skills
🧠 Practical Knowledge of Data Science Workflows
📈 Experience with Tools Used by Professionals

⏳ Last Date to Apply: 20th June 2025

Whether you’re a student, fresher, or career switcher, this internship is your entry point into the dynamic world of Data Science.
👉 Apply now and bring your data science journey to life with Skillfied Mentor.
Posted 14 hours ago
0 years
0 Lacs
India
Remote
Machine Learning Intern (Paid)
Company: Unified Mentor
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with Certificate of Internship
Application Deadline: 19th June 2025

About Unified Mentor
Unified Mentor provides students and graduates with hands-on learning opportunities and career growth in Machine Learning and Data Science.

Role Overview
As a Machine Learning Intern, you will work on real-world projects, enhancing your practical skills in data analysis and model development.

Responsibilities
✅ Design, test, and optimize machine learning models
✅ Analyze and preprocess datasets
✅ Develop algorithms and predictive models
✅ Use tools like TensorFlow, PyTorch, and Scikit-learn
✅ Document findings and create reports

Requirements
🎓 Enrolled in or a graduate of a relevant program (Computer Science, AI, Data Science, or related field)
🧠 Knowledge of machine learning concepts and algorithms
💻 Proficiency in Python or R (preferred)
🤝 Strong analytical and teamwork skills

Benefits
💰 Stipend: ₹7,500 - ₹15,000 (performance-based)
✔ Hands-on machine learning experience
✔ Internship Certificate & Letter of Recommendation
✔ Real-world project contributions for your portfolio

Equal Opportunity
Unified Mentor is an equal-opportunity employer, welcoming candidates from all backgrounds.
Posted 14 hours ago
0 years
0 Lacs
India
Remote
Job Title: Python Developer Intern
Company: Unified Mentor
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship
Application Deadline: 19th June 2025

About Unified Mentor
Unified Mentor provides students and graduates with hands-on experience, professional training, and career-building opportunities in software development, helping you enhance your coding skills and prepare for a successful career.

Role Overview
As a Python Developer Intern, you will work on real-world projects, enhance your coding skills, and gain practical experience in software development.

Responsibilities
✅ Develop, test, and debug Python applications.
✅ Collaborate on software projects and API integrations.
✅ Learn and apply Python frameworks like Django/Flask.

Qualifications
🎓 Enrolled in or completed a Computer Science or related program.
🐍 Proficient in Python programming.
🌐 Familiarity with web frameworks (Django/Flask).
🧩 Strong problem-solving and time-management skills.

Perks
💰 Stipend: ₹7,500 - ₹15,000 (performance-based)
✔ Certificate of Internship & Letter of Recommendation.
✔ Hands-on experience with real-world projects for your portfolio.

How to Apply
📩 Submit your application with the subject: "Python Developer Intern Application".

Equal Opportunity
Unified Mentor is an equal-opportunity employer, welcoming applicants from all backgrounds.
Posted 14 hours ago
8.0 years
0 Lacs
India
Remote
Software Engineer – Kafka

6+ years of experience. Must be able to create dashboards; .NET or Python skills are fine, but Kafka and Grafana are must-haves.
Duration: 4-6 months
Work mode: Hybrid or remote, any location

Responsibilities:
- Build Kafka-based data pipelines for real-time processing.
- Implement Kafka producer and consumer applications for efficient data flow.
- Optimize Kafka clusters for performance, scalability, and reliability.
- Design and manage Grafana dashboards for monitoring Kafka metrics.
- Integrate Grafana with Elasticsearch or other data sources.
- Set up alerting mechanisms in Grafana for Kafka system health monitoring.
- Collaborate with DevOps, data engineers, and software teams.
- Ensure security and compliance in Kafka and Grafana implementations.

Requirements:
- 8+ years of experience configuring Kafka, Elasticsearch, and Grafana
- Strong understanding of Apache Kafka architecture and Grafana visualization
- Proficiency in .NET or Python for Kafka development
- Experience with distributed systems and message-oriented middleware
- Knowledge of time-series databases and monitoring tools
- Familiarity with data serialization formats like JSON
- Expertise in Azure platforms and Kafka monitoring tools
- Good problem-solving and communication skills
Posted 14 hours ago
0 years
0 Lacs
India
Remote
Job Title: Data Analyst Intern
Company: Unified Mentor
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship

About Unified Mentor
Unified Mentor offers practical experience for students and graduates in data analysis to enhance career prospects and provide hands-on learning opportunities in a real-world environment.

Responsibilities
✅ Collect, clean, and analyze datasets.
✅ Develop reports and data visualizations.
✅ Identify trends and patterns in data.
✅ Collaborate on presentations and insights.

Requirements
🎓 Enrolled in or graduate of a relevant program.
💡 Strong analytical skills and attention to detail.
📊 Familiarity with tools like Excel, SQL, or Python (preferred).
🗣 Excellent communication and teamwork abilities.

Benefits
💰 Stipend: ₹7,500 - ₹15,000 (performance-based)
✔ Real-world data analysis experience.
✔ Certificate of Internship and Letter of Recommendation.
✔ Build your portfolio with impactful projects.

How to Apply
📩 Submit your application with "Data Analyst Intern Application" as the subject.
📅 Deadline: 19th June 2025

Note
Unified Mentor is an equal-opportunity employer, welcoming diverse applicants.
Posted 14 hours ago
0 years
0 Lacs
India
Remote
📊 Data Analyst – Remote | Step Into the Future of Analytics

Are you ready to dive into the world of data and make sense of the numbers that drive business decisions? Whether you're a curious learner or looking to kickstart a career in analytics, this internship is designed just for you!

📍 Location: Remote / Virtual
💼 Job Type: Internship (Unpaid)
🕒 Schedule: Flexible working hours

🌟 What’s In It for You?
Join Skillfied Mentor, where learning meets action. As a Data Analyst, you’ll work on real projects, learn essential tools, and develop job-ready skills that will help shape your career in analytics.
🔹 Work on real-world datasets and business problems
🔹 Learn tools like Excel, SQL, Power BI/Tableau, and Python (optional)
🔹 Build core skills in data cleaning, visualization, and basic statistics
🔹 Collaborate with a remote team and gain valuable teamwork experience

🔍 You’re a Great Fit If You:
✅ Enjoy working with numbers, data, and patterns
✅ Are eager to explore data tools like Excel, SQL, or Tableau
✅ Have no prior experience but are willing to learn
✅ Can contribute 5–7 hours per week (flexible)
✅ Work well independently in a virtual environment

🎁 What You’ll Gain:
📜 Certificate of Completion
📂 Real Portfolio Projects
🧠 Practical Skills & Hands-on Experience

⏳ Last Date to Apply: 20th June 2025

Whether you're a student, fresher, or looking to switch careers, this internship offers a strong start as a Data Analyst.
👉 Apply now and begin your journey with Skillfied Mentor.
Posted 14 hours ago
6.0 years
0 Lacs
India
Remote
Explainable Artificial Intelligence (XAI) is an approach to building AI systems that can provide clear explanations for their decisions and actions. It aims to increase transparency and trust in AI by enabling humans to understand how AI algorithms arrive at their choices. This is especially important in healthcare, where AI systems are increasingly used to support business decisions.

We are HealthWorksAI™; our mission is to develop and implement cutting-edge XAI solutions for designing health insurance products, specifically for Medicare Advantage. We aim to create cost-effective healthcare plans tailored to individual needs, ensuring optimal patient care at affordable prices. By identifying and mitigating bias in plan design, we are committed to promoting fairness and inclusivity, ultimately empowering insurers to make informed decisions that elevate the quality of care and benefit all stakeholders.

Job Summary:
As a Product Lead, you'll collaborate closely with the Product Manager, shaping the next generation of healthcare solutions for Payers in the Medicare Advantage ecosystem. Fuelled by Artificial Intelligence and Machine Learning, our commitment lies in improving the health of beneficiaries. Our best ideas come from our clients: you will stay tuned to customer and prospect feedback to ensure the customer's voice is fully incorporated into the roadmap, so that we are constantly working on features that deliver the highest impact for their investment in HealthWorksAI.

What You’ll Do:
- Ideation and Validation: Researching the competitive landscape, partnering with industry analysts like Gartner, and validating product ideas through client interactions.
- Design Thinking Magic: Utilizing Design Thinking to create intuitive wireframes and workflows that resonate with our clients' needs.
- Turning Dreams into Reality: Collaborating with data engineering and product engineering teams to build robust end-to-end data pipelines, seamlessly merging technology with innovation.
- Inception to Adoption: Collaborating with the customer success team on initial implementations to drive adoption and utilization.
- Feedback-Driven Excellence: Continuously gathering client and prospect feedback to refine and optimize our product strategies.

Experience You’ll Need:
- Academic Excellence: A graduate of IIT/NIT or BITS Pilani
- Invaluable Experience: 6+ years of experience as a Product Analyst, preferably in Healthcare Insurance.
- Tech-Savvy Expertise: Proficiency in SQL, Tableau, Python, and Advanced Excel.
- Knowhow: Knowledge of network adequacy and key components of provider networks, such as PCPs, specialists, hospitals, and long-term care facilities, is a plus.

What You’ll Bring to the Table:
- Crafting Persuasive Product Briefs: Ability to translate business ideas, wish lists, and client requirements into compelling product briefs.
- Mastering Data-Driven Solutions: Proficiency in data engineering, data visualization, and modelling techniques.
- Creativity Unleashed: Create stunning prototypes using Tableau and Excel.

Why You Will Love Working Here:
- Opportunity to work on exciting products that make a real difference
- A young, millennial team to work with
- Open leave policy and flexible working hours
- Complete WFH with multiple work retreats at a neutral location
Posted 14 hours ago
0 years
0 Lacs
India
Remote
👉 Take the First Step Toward Your Dream Career

Software Developer Intern – Infython Technologies
🌍 Location: Remote
📅 Duration: 6 months (3 months training + 3 months real-world project work)
🚀 Opportunity: Full-time job offer based on performance
🎁 Perks: Internship Certificate, Letter of Recommendation, Paid Stipend, Mentorship

About Infython Technologies
Infython is a global technology company trusted by clients around the world. We’re a team of passionate engineers delivering end-to-end software solutions with a commitment to trust, innovation, and excellence. Our mission is to empower startups and growing businesses with transformative technology and long-term partnerships. At Infython, we take pride in shaping the future by helping the next generation of developers grow.

Role Overview
As a Software Developer Intern, you’ll begin with structured hands-on training and progress to working on real client projects. Guided by experienced engineers, you’ll contribute to building products used in real-world scenarios. This internship is ideal for candidates who are passionate about coding, eager to learn, and looking to turn their knowledge into practical skills that matter.

Key Responsibilities
✅ Develop, test, and debug Python-based applications
✅ Collaborate on software projects and API integrations
✅ Apply coding best practices and contribute to clean, scalable code
✅ Participate in code reviews and agile team discussions
✅ Work with tools and workflows used by professional development teams

Qualifications
🎓 Pursuing or recently completed a degree in Computer Science or a related field
💻 Basic understanding of Python and software development principles
🧠 Strong problem-solving and time-management skills
🤝 Eagerness to learn, contribute, and grow in a team environment

What You’ll Gain
💰 Stipend: ₹3,000 – ₹15,000 (based on performance after training)
📜 Certificate of Internship & Letter of Recommendation
📁 Real-world project experience to build your portfolio
👨‍🏫 Mentorship from senior developers
🎯 Eligibility for a Pre-Placement Offer (PPO) based on performance

Why Choose Infython?
At Infython, we invest in your professional development, helping you continuously improve your skills, gain real experience, and grow into a confident, capable software engineer.
✔ Learn directly from industry experts
✔ Build practical, job-ready skills
✔ Experience a professional, collaborative engineering culture
✔ Get a head start in your tech career with strong references and connections

🌍 Equal Opportunity Employer
Infython is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive, supportive environment for all team members.

🌐 Visit us to learn more: www.infython.com
Posted 14 hours ago
0 years
0 Lacs
India
Remote
Freelance Opportunity: Statistician – Predictive Modeling & Advanced Analytics (Remote | ₹70,000/month)
📅 Duration: 3-month contract (renewable)
💰 Compensation: ₹70,000/month (net, based on deliverables)

We’re hiring a freelance statistician who loves building powerful predictive models using advanced statistical and machine learning techniques. If you’re skilled in Bayesian modeling, clustering algorithms, and discrete choice models, and enjoy solving complex real-world problems, we’d love to work with you!

🔍 What You'll Work On:
- Bayesian probabilistic models (PyMC3, Stan, etc.)
- Advanced regression (GLM, regularized, hierarchical)
- Clustering algorithms: AHC, KNN, LCA, SVM, SOM
- Discrete choice modeling: MNL, nested logit, mixed logit
- ML techniques for prediction, segmentation, and classification
- Clear documentation and delivery of code + insights

✅ What We’re Looking For:
- Postgraduate degree in Statistics or Mathematics (a must-have), preferably from ISI or a comparable institute
- Strong in Python and/or R (esp. NumPy, pandas, scikit-learn)
- Practical experience with Bayesian tools and ML libraries
- Bonus if you’ve worked in consumer research or behavior modeling
- Self-driven, reliable, and outcome-focused

📌 How It Works:
- Remote, freelance contract
- Work independently with regular check-ins
- Monthly payment based on pre-agreed deliverables
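As an illustration of the multinomial logit (MNL) model this listing names, here is a minimal sketch of MNL choice probabilities, which are the softmax of the alternatives' utilities. The utility values are hypothetical; real work would estimate them from choice data with a library such as statsmodels or Biogeme.

```python
import math

def mnl_probabilities(utilities):
    # Multinomial logit: P(i) = exp(V_i) / sum_j exp(V_j).
    # Subtracting the max utility first avoids overflow in exp().
    m = max(utilities)
    exps = [math.exp(v - m) for v in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical utilities for three alternatives (e.g., three product plans).
probs = mnl_probabilities([1.0, 0.5, -0.2])
print([round(p, 3) for p in probs])
```

Nested and mixed logit, also mentioned above, relax MNL's independence-of-irrelevant-alternatives assumption, but reduce to this formula within a nest or for a fixed draw of the random coefficients.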
Posted 14 hours ago
6.0 years
0 Lacs
India
On-site
It takes powerful technology to connect our brands and partners with an audience of hundreds of millions of people. Whether you’re looking to write mobile app code, engineer the servers behind our massive ad tech stacks, or develop algorithms to help us process trillions of data points a day, what you do here will have a huge impact on our business—and the world. About Us Yahoo delivers delightful, inspiring and entertaining daily-habit experiences to over half a billion people worldwide. Our products include the Yahoo Homepage (www.yahoo.com), AOL, as well as Comscore #1 sites in News, Sports and Finance. Yahoo in Three Words: Inform, connect, and entertain. The Enterprise Application team is responsible for managing the financial systems along with other custom home grown applications which cater to the needs of the financial teams. We build and maintain applications to ensure Yahoo is able to serve the customers and finance teams, using Oracle R12 and a combination of open source software and internal tools. We encourage new ideas and continuously experiment and evaluate new technologies to assimilate them into our infrastructure. Our team structure encourages trust, learning from one another, having fun, and attracting people who are passionate about what they do. About You You are a self-starter and problem solver, who is passionate about velocity, developer productivity and product quality. You are an aggressive trouble-shooter who can multitask on problems of varying difficulty, priority and time-sensitivity and get things done. You are smart, self-driven, and spend time trying to figure out how something works, not stopping with knowing just what it does. You like to relentlessly automate everything and anything at scale. 
Job Responsibilities/The Role/The Job This position is for a Production Engineer II with extensive experience in the support & administration of complex applications/systems deployment, infrastructure upgrades, software upgrades, patching and ongoing end-to-end support of mission critical applications. Some of these applications are homegrown custom applications facing Yahoo’s internal customers and others are Corporate Sites facing Yahoo’s external customers. This position will be responsible for defining, implementing and maintaining the standard operating procedures for the operations team within the Corporate Applications group. This position will partner closely with relevant business process owners, application developers in Bangalore and Sunnyvale and other Corporate Applications team members to deliver global solutions with an objective of optimizing processes. The individual must have solid experience and understanding of system, database & integration technologies and be responsible for 24/7/365 availability, scalability and incident response. Responsibilities include: Understand existing project design, monitoring setup, and automation. Providing expert advice and direction in Applications & database administration and configuration technologies that include host configuration, monitoring, change & release management, performance tuning, hardware & capacity planning, and upgrades. Design tools for managing the infrastructure and write clean, simple, reusable code. Troubleshoot, resolve, and document production issues and escalate as required. Proactively maintain and develop all Linux infrastructure technology to deliver 24x7x365 uptime. Develop and implement automation tools for managing production systems. Be part of global on-call (12/7) rotation.
Being responsible for database design, performance, and monitoring of various versions of MySQL or SQL Server databases, database tools, and services Problem diagnosis and resolution of moderate to advanced production issues Develop and deploy platform infrastructure tools such as monitoring, alerting, and orchestration Build independent web-based tools, microservices, and solutions. Writing reusable, testable, and efficient code Ability to design and develop a business operations model for large applications to provide support for business needs. Experience in dealing with difficult situations and making decisions with a sense of urgency. Monitoring and reporting metrics related to performance, availability, and other SLA measures Developing, implementing, and maintaining change control and testing processes for modifications to all applications environments Design and implement redundant systems, policies, and procedures for disaster recovery and data archiving to ensure effective protection and integrity of data assets Work with application development staff to harden, enhance, document, and generally improve the operability of our systems Minimum Job Qualifications Bachelor's degree in Computer Science, Engineering, Information Systems or similar relevant degree 6 to 8 years of experience in Linux systems, web applications, distributed computing, and computer networking.
Hands-on in various DevOps tools like GIT, Jenkins, Ansible, Terraform, Docker, Jira, Slack, Confluence, Nagios, and Kubernetes Experience in container orchestration services, especially Kubernetes Fair understanding of major public cloud service providers, like Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform, and private cloud like OpenStack Expert in Python, with knowledge of at least one Python web framework such as Django / Flask Basic understanding of front-end technologies, such as JavaScript, HTML5, and CSS3 Understanding of databases - Relational and Non-Relational - their data models and Performance trade-offs. Hands-on experience in MySQL is preferred. In-depth knowledge of Linux: RedHat, CentOS, etc. Linux certifications (RHCT, RHCE, and LPIC) will be considered an advantage. Excellent communication, interpersonal, and team working skills. Good coding skills in BASH, Python, and Perl Experience in developing web applications and familiarity with at least one framework (Django, Flask) is desired. Basic web development skills using HTML5, CSS are mandatory. Strong desire to learn and understand new concepts, technologies, systems as part of day-to-day work. Solid knowledge of principles, concepts, and theories of virtual infrastructure and container platform orchestration. Ability to apply independent judgment to develop creative, practical, and repeatable solutions Knowledge of Hadoop, HBase, spark is preferred Working knowledge of HTTP, DNS, and DHCP is preferred. Important notes for your attention Applications: All applicants must apply for Yahoo openings direct with Yahoo. We do not authorize any external agencies in India to handle candidates’ applications. No agency nor individual may charge candidates for any efforts they make on an applicant’s behalf in the hiring process. Our internal recruiters will reach out to you directly to discuss the next steps if we determine that the role is a good fit for you. 
Selected candidates will go through formal interviews and assessments arranged by Yahoo direct. Offer Distributions: Our electronic offer letter and documents will be issued through our system for e-signatures, not via individual emails. Yahoo is proud to be an equal opportunity workplace. All qualified applicants will receive consideration for employment without regard to, and will not be discriminated against based on age, race, gender, color, religion, national origin, sexual orientation, gender identity, veteran status, disability or any other protected category. Yahoo will consider for employment qualified applicants with criminal histories in a manner consistent with applicable law. Yahoo is dedicated to providing an accessible environment for all candidates during the application process and for employees during their employment. If you need accessibility assistance and/or a reasonable accommodation due to a disability, please submit a request via the Accommodation Request Form (www.yahooinc.com/careers/contact-us.html) or call +1.866.772.3182. Requests and calls received for non-disability related issues, such as following up on an application, will not receive a response. Yahoo has a high degree of flexibility around employee location and hybrid working. In fact, our flexible-hybrid approach to work is one of the things our employees rave about. Most roles don’t require specific regular patterns of in-person office attendance. If you join Yahoo, you may be asked to attend (or travel to attend) on-site work sessions, team-building, or other in-person events. When these occur, you’ll be given notice to make arrangements. If you’re curious about how this factors into this role, please discuss with the recruiter. Currently work for Yahoo? Please apply on our internal career site.
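The responsibilities above include monitoring and reporting metrics related to availability and other SLA measures. A minimal, hypothetical sketch of that arithmetic (the 99.95% target and the downtime figure are invented for illustration, not Yahoo's actual numbers):

```python
def availability_pct(total_minutes, downtime_minutes):
    """Availability = (total - downtime) / total, as a percentage."""
    return 100.0 * (total_minutes - downtime_minutes) / total_minutes

# A 30-day month has 43,200 minutes; assume 22 minutes of downtime.
month = 30 * 24 * 60
sla = availability_pct(month, 22)
breached = sla < 99.95  # hypothetical SLA target
```

Even 22 minutes of monthly downtime lands just under a 99.95% target, which is why such targets are usually tracked to two or three decimal places.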
Posted 14 hours ago
6.0 years
0 Lacs
India
Remote
Job Role: Data Scientist Job Type: Full-Time Mode of work: Remote Experience: 6+ Years Key Responsibilities Analyze complex datasets to uncover insights using AI techniques. Build and deploy machine learning, deep learning, and NLP models. Collaborate with teams to understand client needs and deliver AI-powered solutions. Communicate insights clearly to various stakeholders. Continuously refine models for performance and accuracy. Stay current on AI advancements and apply them in real-world scenarios. Contribute to AI tools and infrastructure development. Qualifications Bachelor’s/Master’s in Statistics, CS, Engineering, or related field. 6+ years of experience in IT and 3+ years of hands-on experience in Data Science/AI. Strong expertise in Python (Pandas, NumPy, scikit-learn, Streamlit) and SQL. Experience with ML/DL frameworks (e.g., TensorFlow, PyTorch). Familiarity with Agile methodologies and cloud platforms (AWS, Azure, GCP). Solid communication and problem-solving skills. Knowledge of RAG implementation and AI agents is a plus. Interest or background in Corporate Finance is an added advantage.
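The role above lists RAG implementation as a plus. The retrieval step of RAG reduces to nearest-neighbour search over embeddings; here is a toy sketch with fabricated three-dimensional vectors (a real system would use a trained embedding model and a vector store):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy document embeddings (made up for illustration).
docs = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.8, 0.3],
}
query = [0.85, 0.15, 0.05]

# Retrieve the document most similar to the query embedding;
# its text would then be passed to the LLM as grounding context.
best = max(docs, key=lambda d: cosine(docs[d], query))
```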
Posted 14 hours ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
Job Title – DevOps Engineer with Manual Testing
Candidate Specification – 10+ years, Notice Period – Immediate to 30 days, Hybrid.
Job Summary
We are seeking a skilled DevOps Engineer with a strong foundation in manual testing of web and Windows applications. The ideal candidate will have hands-on experience in test execution, bug reporting, and DevOps pipelines, with exposure to automation tools and infrastructure-as-code practices being a strong advantage.
Mandatory Skills
Manual testing experience for both web applications and Windows desktop applications
Strong understanding of STLC, bug life cycle, test case design, and test execution
Ability to perform UI, functional, regression, and exploratory testing
Experience using tools like JIRA, TestLink, or similar for defect tracking and test management
Optional / Preferred Skills
DevOps Tools: Experience with Jenkins, CI/CD pipelines
Infrastructure Automation: Exposure to Ansible
Automation Testing: Knowledge of Appium, Robot Framework
API Testing: Familiarity with SoapUI or Postman
Scripting Languages: Basic knowledge of Python for automation or testing
Familiarity with source control tools like Git
Understanding of agile development practices
Roles And Responsibilities
Perform manual testing on web and desktop applications to identify bugs and ensure quality
Collaborate with developers and product managers to understand features and functionality
Create, review, and maintain detailed and well-structured test plans and test cases
Report and track bugs using bug tracking systems, ensuring proper documentation and resolution
Work closely with DevOps teams to understand deployment pipelines and participate in CI/CD processes
Support test automation efforts using tools like Appium or Robot Framework (if applicable)
Assist in setting up test environments and validating deployments
Skills Required
Role: DevOps Engineer with Manual Testing - Contract Hiring
Industry Type: IT/Computers - Software
Functional Area:
Required Education: Bachelor Degree
Employment Type: Full Time, Permanent
Key Skills: APPIUM, APM, DEVOPS, MANUAL TESTING, PYTHON
Other Information
Job Code: GO/JC/382/2025
Recruiter Name: Christopher
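The posting above asks for test case design alongside basic Python. As an illustration of how a manually designed test matrix translates into automated checks, here is a self-contained `unittest` sketch; the discount function and its cases are invented examples, not part of the role:

```python
import unittest

def apply_discount(price, pct):
    """Function under test: apply a percentage discount, clamping pct to [0, 100]."""
    pct = max(0, min(100, pct))
    return round(price * (100 - pct) / 100, 2)

class DiscountTests(unittest.TestCase):
    # One nominal case, one boundary case, one invalid-input case --
    # the same partitioning used when designing manual test cases.
    def test_regular_discount(self):
        self.assertEqual(apply_discount(200.0, 10), 180.0)

    def test_zero_discount(self):
        self.assertEqual(apply_discount(99.99, 0), 99.99)

    def test_discount_is_clamped(self):
        self.assertEqual(apply_discount(50.0, 150), 0.0)

# Run the suite programmatically instead of via `python -m unittest`.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(DiscountTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```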
Posted 14 hours ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Data Engineer Location: Hyderabad, Kochi, Trivandrum Experience Required: 10-19 Yrs Skills: Primary - Scala, Pyspark, Python / Secondary - ETL, SQL, Azure Role Proficiency The role demands expertise in building robust, scalable data pipelines that support ingestion, wrangling, transformation, and integration of data from multiple sources. The ideal candidate should have hands-on experience with ETL tools (e.g., Informatica, AWS Glue, Databricks, GCP DataProc), and strong programming skills in Python, PySpark, SQL, and optionally Scala. Proficiency across various data domains and familiarity with modern data warehouse and lakehouse architectures (Snowflake, BigQuery, Delta Lake, Lakehouse) is essential. A solid understanding of DevOps and infrastructure cost optimization is required. Key Responsibilities & Outcomes Technical Development Develop high-performance data pipelines and applications. Optimize development using design patterns and reusable solutions. Create and tune code using best practices for performance and scalability. Develop schemas, data models, and data storage solutions (SQL/NoSQL/Delta Lake). Perform debugging, testing, and validation to ensure solution quality. Documentation & Design Produce high-level and low-level design (HLD, LLD, SAD) and architecture documentation. Prepare infra costing, source-target mappings, and business requirement documentation. Contribute to and govern documentation standards/templates/checklists. Project & Team Management Support Project Manager in planning, delivery, and sprint execution. Estimate effort and provide input on resource planning. Lead and mentor junior team members, define goals, and monitor progress. Monitor and manage defect lifecycle including RCA and proactive quality improvements. Customer Interaction Gather and clarify requirements with customers and architects. Present design alternatives and conduct product demos. Ensure alignment with customer expectations and solution architecture. 
Testing & Release Design and review unit/integration test cases and execution strategies. Provide support during system/integration testing and UAT. Oversee and execute release cycles and configurations. Knowledge Management & Compliance Maintain compliance with configuration management plans. Contribute to internal knowledge repositories and reusable assets. Stay updated and certified on relevant technologies/domains. Measures of Success (KPIs) Adherence to engineering processes and delivery schedules. Number of post-delivery defects and non-compliance issues. Reduction in recurring defects and faster resolution of production bugs. Timeliness in detecting, responding to, and resolving pipeline/data issues. Improvements in pipeline efficiency (e.g., runtime, resource utilization). Team engagement and upskilling; completion of relevant certifications. Zero or minimal data security/compliance breaches. Expected Deliverables Code High-quality data transformation scripts and pipelines. Peer-reviewed, optimized, and reusable code. Documentation Design documents, technical specifications, test plans, and infra cost estimations. Configuration & Testing Configuration management plans and test execution results. Knowledge Sharing Contributions to SharePoint, internal wikis, client university platforms. Skill Requirements Mandatory Technical Skills Languages : Python, PySpark, Scala ETL Tools : Apache Airflow, Talend, Informatica, AWS Glue, Databricks, DataProc Cloud Platforms : AWS, GCP, Azure (esp. BigQuery, DataFlow, ADF, ADLS) Data Warehousing : Snowflake, BigQuery, Delta Lake, Lakehouse architecture Performance Tuning : For large-scale distributed systems and pipelines Additional Skills Experience in data model design and optimization. Good understanding of data schemas, window functions, and data partitioning strategies. Awareness of data governance, security standards, and compliance. Familiarity with DevOps, CI/CD, infrastructure cost estimation. 
Certifications (Preferred) Cloud certifications (e.g., AWS Data Analytics, GCP Data Engineer) Informatica or Databricks certification Domain-specific certifications based on project/client need Soft Skills Strong analytical and problem-solving capabilities Excellent communication and documentation skills Ability to work independently and collaboratively in cross-functional teams Stakeholder management and customer interaction
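The skill list above calls out window functions and data partitioning strategies. The mechanics of a window function such as `SUM(amount) OVER (PARTITION BY account ORDER BY day)` can be sketched in plain Python on toy data (no Spark or SQL engine involved):

```python
from itertools import groupby

rows = [  # (account, day, amount) -- toy ledger data
    ("A", 1, 100), ("A", 2, 40), ("B", 1, 70), ("A", 3, 10), ("B", 2, 30),
]

def running_totals(rows):
    """SUM(amount) OVER (PARTITION BY account ORDER BY day):
    sort by the partition key then the ordering key, and accumulate
    within each partition."""
    out = []
    keyed = sorted(rows, key=lambda r: (r[0], r[1]))
    for account, group in groupby(keyed, key=lambda r: r[0]):
        total = 0
        for _, day, amount in group:
            total += amount
            out.append((account, day, total))
    return out

totals = running_totals(rows)
```

In a real pipeline the partition key also drives physical data layout, which is why partitioning strategy and window-function cost are closely related.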
Posted 14 hours ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
LivePerson (NASDAQ: LPSN) is the global leader in enterprise conversations. Hundreds of the world’s leading brands — including HSBC, Chipotle, and Virgin Media — use our award-winning Conversational Cloud platform to connect with millions of consumers. We power nearly a billion conversational interactions every month, providing a uniquely rich data set and safety tools to unlock the power of Conversational AI for better customer experiences. At LivePerson, we foster an inclusive workplace culture that encourages meaningful connection, collaboration, and innovation. Everyone is invited to ask questions, actively seek new ways to achieve success, and reach their full potential. We are continually looking for ways to improve our products and make things better. This means spotting opportunities, solving ambiguities, and seeking effective solutions to the problems our customers care about. Overview LivePerson is experiencing rapid growth, and we’re evolving our database infrastructure to scale faster than ever. We are building a team dedicated to optimizing data storage, accessibility, and performance across our applications. As a Senior Database Engineer, you will be a key contributor, driving innovation in cloud database solutions and automation. You Will Partner with cross-functional teams to define database requirements and architectural strategies. Design, implement, and maintain highly scalable, on-prem and cloud-based database systems on Google Cloud Platform (GCP). Develop automation solutions using Terraform, Ansible, and Python to streamline database provisioning and management. Ensure robust version control of infrastructure configurations for seamless deployments. Monitor, troubleshoot, and optimize database performance, addressing bottlenecks proactively. Establish and enforce backup, recovery, and disaster recovery protocols to protect data integrity. Collaborate with security teams to implement compliance and data protection measures.
Lead incident resolution, analyzing root causes and driving long-term solutions. Stay ahead of industry trends in DevOps, cloud computing, and database technologies. Participate in on-call rotations, ensuring 24x7 support for mission-critical systems. You Have 8+ years of experience managing large-scale production database systems handling terabytes of data. Expertise in MySQL administration & replication. Experience with any one of Elasticsearch, Kafka, Hadoop, or Vertica is a plus. Strong background in Google Cloud Platform (GCP) or AWS database deployments. Proficiency in Infrastructure as Code (IaC) using Terraform & Ansible. Skilled in Python & Bash scripting for automation. Hands-on experience with Liquibase or Flyway for database automation. Knowledge of monitoring tools like Prometheus, Grafana, PMM (Percona Monitoring and Management) and the ELK stack (Elasticsearch, Kibana & Logstash). Strong problem-solving skills with a proactive approach to troubleshooting complex issues. Solid foundation in database architecture, optimization, and CI/CD concepts. Excellent collaboration & communication skills in a dynamic team environment. Highly accountable with a results-driven mindset. Able to create documentation and work on changes, incidents, and Jira tickets. Relevant certifications (AWS, GCP) are a plus. Benefits Health: Medical, Dental and Vision Time away: Vacation and holidays Equal opportunity employer Why You’ll Love Working Here As leaders in enterprise customer conversations, we celebrate diversity, empowering our team to forge impactful conversations globally. LivePerson is a place where uniqueness is embraced, growth is constant, and everyone is empowered to create their own success. And, we're very proud to have earned recognition from Fast Company, Newsweek, and BuiltIn for being a top innovative, beloved, and remote-friendly workplace. Belonging At LivePerson We are proud to be an equal opportunity employer.
All qualified applicants will receive consideration for employment without regard to age, ancestry, color, family or medical care leave, gender identity or expression, genetic information, marital status, medical condition, national origin, physical or mental disability, protected veteran status, race, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable laws, regulations and ordinances. We also consider qualified applicants with criminal histories, consistent with applicable federal, state, and local law. We are committed to the accessibility needs of applicants and employees. We provide reasonable accommodations to job applicants with physical or mental disabilities. Applicants with a disability who require reasonable accommodation for any part of the application or hiring process should inform their recruiting contact upon initial connection. The talent acquisition team at LivePerson has recently been notified of a phishing scam targeting candidates applying for our open roles. Scammers have been posing as hiring managers and recruiters in an effort to access candidates' personal and financial information. This phishing scam is not isolated to only LivePerson and has been documented in news articles and media outlets. Please note that any communication from our hiring teams at LivePerson regarding a job opportunity will only be made by a LivePerson employee with an @ liveperson.com email address. LivePerson does not ask for personal or financial information as part of our interview process, including but not limited to your social security number, online account passwords, credit card numbers, passport information and other related banking information. If you have any questions and or concerns, please feel free to contact recruiting-lp@liveperson.com
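The role above emphasises automation and a proactive approach to troubleshooting database infrastructure. One common building block for such tooling is retry-with-backoff around flaky checks; a minimal stdlib sketch (the health check is simulated, not a real MySQL probe):

```python
import time

def with_retries(op, attempts=4, base_delay=0.01):
    """Retry a flaky operation with exponential backoff (0.01s, 0.02s, ...),
    re-raising the last error if all attempts fail."""
    for i in range(attempts):
        try:
            return op()
        except Exception:
            if i == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** i))

calls = {"n": 0}
def flaky_health_check():
    """Simulated probe: fails twice, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("replica not reachable")
    return "ok"

status = with_retries(flaky_health_check)
```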
Posted 14 hours ago
3.0 - 5.0 years
15 - 30 Lacs
Bengaluru
Work from Office
Position summary: We are seeking a Senior Software Development Engineer – Data Engineering with 3-5 years of experience to design, develop, and optimize data pipelines and analytics workflows using Snowflake, Databricks, and Apache Spark. The ideal candidate will have a strong background in big data processing, cloud data platforms, and performance optimization to enable scalable data-driven solutions. Key Responsibilities: Work with cloud-based data solutions (Azure, AWS, GCP). Implement data modeling and warehousing solutions. Developing and maintaining data pipelines for efficient data extraction, transformation, and loading (ETL) processes. Designing and optimizing data storage solutions, including data warehouses and data lakes. Ensuring data quality and integrity through data validation, cleansing, and error handling. Collaborating with data analysts, data architects, and software engineers to understand data requirements and deliver relevant data sets (e.g., for business intelligence). Implementing data security measures and access controls to protect sensitive information. Monitor and troubleshoot issues in data pipelines, notebooks, and SQL queries to ensure seamless data processing. Develop and maintain Power BI dashboards and reports. Work with DAX and Power Query to manipulate and transform data. Basic Qualifications Bachelor’s or master’s degree in computer science or data science 3-5 years of experience in data engineering, big data processing, and cloud-based data platforms. Proficient in SQL, Python, or Scala for data manipulation and processing. Proficient in developing data pipelines using Azure Synapse, Azure Data Factory, Microsoft Fabric. Experience with Apache Spark, Databricks and Snowflake is highly beneficial for handling big data and cloud-based analytics solutions. Preferred Qualifications Knowledge of streaming data processing (Apache Kafka, Flink, Kinesis, Pub/Sub). Experience in BI and analytics tools (Tableau, Power BI, Looker). 
Familiarity with data observability tools (Monte Carlo, Great Expectations). Contributions to open-source data engineering projects.
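Among the responsibilities above is ensuring data quality through validation, cleansing, and error handling. A minimal illustration of that pattern, with fabricated rows; a real pipeline would apply the same split-and-report logic at scale in Spark or Databricks:

```python
records = [  # raw ingested rows; None and negative amounts are quality issues
    {"id": 1, "amount": "120.50"},
    {"id": 2, "amount": None},
    {"id": 3, "amount": "-5"},
    {"id": 4, "amount": "80"},
]

def clean(records):
    """Keep rows with a parseable, non-negative amount; report the rest
    so bad records can be quarantined rather than silently dropped."""
    good, bad = [], []
    for r in records:
        try:
            amount = float(r["amount"])
            if amount < 0:
                raise ValueError("negative amount")
            good.append({**r, "amount": amount})
        except (TypeError, ValueError):
            bad.append(r["id"])
    return good, bad

good, rejected = clean(records)
```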
Posted 14 hours ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
It's fun to work in a company where people truly BELIEVE in what they're doing! Job Description The Engineering Team at Traveloka is the backbone of our innovation, building scalable and high-performance systems that power millions of users worldwide. With a relentless focus on scalability and performance, we ensure a seamless experience for travelers. Our dedication to excellence and problem-solving makes us instrumental in shaping Traveloka's future as a leader in the digital travel space. You’ll be joining a team that has built industry-leading, high-performance backend systems. Our engineers work on cutting-edge technologies, solving complex challenges in distributed computing, API design, and high-traffic systems. Backend engineering is evolving rapidly, and we’re looking for developers who are eager to build the next big thing in scalable backend architecture. What You'll be Doing: Design, build, and maintain scalable backend applications, APIs, and data integrations. Write clean, efficient, and maintainable code while adhering to best practices. Improve and optimize existing systems for better performance and reliability. Monitor system performance, troubleshoot issues, and respond to alerts proactively. Collaborate with cross-functional teams to deliver high-impact solutions. Conduct code reviews, unit testing, and integration testing to ensure software quality. Participate in architectural discussions and propose innovative solutions for high-traffic environments. Contribute to post-mortem analyses and continuously improve system resilience. Requirements Bachelor’s degree in Computer Science from a reputable university. Minimum 5 years of experience in software engineering, particularly in backend development. Proficiency in Go, Java, and Python. Ability to dive deep and work across backend services, ensuring efficiency, scalability, and maintainability. Experience designing scalable and maintainable architectures. 
Eagerness to continuously learn—whether it’s technology-related, product-related, or beyond. A strong sense of ownership and accountability for both the product(s) and assigned tasks. Fluency in English, both spoken and written. Prior technical engineering experience or relevant work experience is a plus. If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!
Posted 14 hours ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job Description
Strong technical acumen with experience in BI, SQL, and Python. Proficiency in Generative AI and prompt engineering skills, with a good understanding of the LLM ecosystem. Knowledge of Agentic AI and its applications in business processes. Hands-on experience with Agile and Incremental delivery models. Excellent multitasking and stakeholder communication skills. Ability to estimate efforts and manage project timelines effectively. Experience in creating and managing project documentation and reports. Strong problem-solving skills and attention to detail.
Skills Required
Role: Manager - Technical BA
Industry Type: ITES/BPO/KPO
Functional Area: ITES/BPO/Customer Service
Required Education: BACHELOR IN TECHNOLOGY
Employment Type: Full Time, Permanent
Key Skills: BUSINESS ANALYST, API, GENAI
Other Information
Job Code: GO/JC/290/2025
Recruiter Name:
Posted 14 hours ago
7.0 - 12.0 years
8 - 16 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Role & responsibilities Minimum of 8 years of related experience Bachelor's degree or equivalent experience Expertise in Python; solid experience in building enterprise applications using Python. Experience with Python multithreading and multiprocessing applications Good understanding of pandas and related libraries. Strong experience with SQL (ideally Snowflake) Familiarity with AWS Cloud technologies Experience working in Agile Development teams Experience with enterprise CI/CD tools (e.g., Jenkins) and version control systems (e.g., Git) Experience with development of financial applications, ideally related to Risk Management Hands-on experience with writing automated test cases (e.g., pytest or unittest) Ability to work independently and in a distributed team environment Preferred candidate profile
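The role above calls for experience with both Python multithreading and multiprocessing. A small `concurrent.futures` sketch of the distinction; the quote fetcher and its prices are fabricated stand-ins for real I/O:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_quote(symbol):
    """Stand-in for an I/O-bound call (e.g. a market-data request)."""
    return symbol, len(symbol) * 10.0  # fabricated price

symbols = ["AAPL", "MSFT", "GOOG"]

# Threads suit I/O-bound work, where the GIL is released while waiting;
# for CPU-bound work, swap in ProcessPoolExecutor to run on separate
# processes and sidestep the GIL entirely.
with ThreadPoolExecutor(max_workers=3) as pool:
    prices = dict(pool.map(fetch_quote, symbols))
```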
Posted 14 hours ago
7.0 - 12.0 years
15 - 25 Lacs
Pune, Chennai, Bengaluru
Work from Office
Greetings from Sigma!!
Job Description
Role: Senior Full Stack Developer (Python + React)
Experience: 7 to 10 Years
Location: PAN India
Job Summary: Looking for an experienced Full Stack Developer with 7 to 10 years of experience, specializing in Python backend and React frontend development with AWS. The ideal candidate will have strong expertise in developing scalable web applications using modern technologies and frameworks.
Key Responsibilities: Design, develop, and maintain scalable and high-performance full-stack web applications. Build RESTful APIs and microservices using Python frameworks like Django / Flask / FastAPI. Develop rich and responsive user interfaces using React.js. Collaborate with Product Owners, Architects, and other stakeholders to translate business requirements into technical solutions. Write clean, maintainable, and efficient code following industry best practices. Conduct code reviews, mentor junior developers, and ensure code quality. Work on API integrations, third-party services, and cloud deployments. Optimize applications for maximum speed and scalability. Debug, troubleshoot, and resolve production issues.
Mandatory Skills: 7 to 10 years of overall software development experience. Strong hands-on experience in Python with frameworks like Django / Flask / FastAPI. Strong front-end development skills using React.js, JavaScript, HTML5, and CSS3. Experience in RESTful API design and development. Good understanding of databases (SQL & NoSQL like PostgreSQL, MySQL, MongoDB, etc.). Experience with Git for version control. Strong debugging and performance optimization skills. Good understanding of Microservices architecture.
If you are interested, please send me a copy of your resume along with the following details to this mail id - sumati@sigmacareers.in
1. Notice Period (LWD)-
2. Current CTC-
3. Expected CTC -
4. Current company-
5. Total year of experience-
6. Relevant experience-
7. Do you have any offer -
8. Current location-
9. Preferred location-
Posted 14 hours ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
you’ll be our: CFD & Thermals Simulation Engineer you’ll be based at: IBC Knowledge Park, Bengaluru you’ll be aligned with: Team Lead - CFD & Thermal Simulation you’ll be a member of: Simulation CoC What you’ll do at Ather: Ather Energy has proved itself to be the hallmark and the synonym for Smart, Connected and Intelligent Electric Vehicles in India. Today we are on a journey of exponential growth and marking our presence across the length and breadth of Indian roads. Continuing this journey further means massive ramping up of production and developing many more top-notch products to redefine and upgrade Customers’ Experience in India. At Ather, we believe not just in creating products but in delivering a complete product experience to our customers. This is only possible by exploring non-conventional platforms and pushing the boundaries of engineering design. One such limiting factor is heat, and hence thermal management plays a key role in ensuring the designs perform to their required specification and also remain safe during operation. CFD and Thermal Simulations play a very critical role in this process to ensure the design is optimized and potential failure modes are identified early in the product development cycle. We are looking for a CFD & Thermal Simulation Engineer who thinks from first principles and can evaluate the thermal performance of various subsystems and provide insights to improve the same. As our CFD & Thermal Simulation Engineer, you’ll: Plan, set up and execute CFD simulations to support various programs for systems like battery pack, motor and motor drive, chargers and other electronic components. Interpret CFD/Thermal results to provide relevant insights on the flow and thermal phenomena under consideration. Provide technical support and contribute to the solution of fluid-thermal issues on the product. Develop simulation capabilities & correlation of simulation results with test data for 2-Wheeler EV systems.
Develop scripts for automation and work towards improving simulation methodology to reduce turnaround time and increase accuracy. Prepare standard documents for CFD and Thermal simulation procedures and own reports for all simulation activity. Here’s what we are looking for: Good understanding in the theoretical and practical aspects of fluid flow and heat transfer. Strong understanding and experience in working with commercial CFD codes, preferably Star-CCM+ or Ansys Fluent. Demonstrated industrial or academic experience in addressing flow and thermal challenges on an automotive or electronics platform. Knowledge of CAE Process Automation through Python, Java or any other language will be an added advantage. A fair understanding of the various systems and subsystems in EVs will be preferred. You bring to Ather: B.E/M.Tech/M.S. in Mechanical Engineering or any other relevant stream. Computational Fluid Dynamics and Thermal Simulation expertise with minimum 3-5 years of experience in an automotive/aerospace company.
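The responsibilities above include correlating simulation results with test data. A minimal Pearson-correlation sketch in plain Python; the temperature series below are fabricated, not real rig data:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Fabricated cell temperatures (deg C): CFD prediction vs. thermocouple readings.
sim  = [41.0, 44.5, 48.2, 52.9, 57.1]
test = [40.2, 44.9, 47.5, 53.8, 56.6]
r = pearson(sim, test)
```

A correlation close to 1 indicates good trend agreement, though in practice absolute error and bias between simulation and test are checked as well.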
Posted 14 hours ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
Job Title: Business Intelligence – Senior Manager / AGM – Mumbai

Job Responsibilities:
- 13+ years of work experience leading the development and implementation of data analysis solutions focused on financial data.
- Stay updated on emerging BI technologies and tools to optimize data analysis capabilities.
- Evaluate and implement new data warehousing solutions to manage large volumes of financial data.
- Develop and execute a comprehensive financial data strategy to identify key performance indicators (KPIs) aligned with business objectives.
- Establish data quality standards and implement data cleansing processes to ensure the accuracy of financial data.
- Design and build data models, dashboards, and reports that visualize key financial metrics such as revenue, profitability, cash flow, and cost analysis.
- Utilize BI tools to create interactive, user-friendly dashboards for decision-making across departments.
- Develop advanced analytics capabilities, including predictive modelling and forecasting, to identify trends and potential risks.
- Manage a team of BI analysts: assign tasks, provide guidance, and mentor them to develop their data analysis skills.
- Generate regular financial performance reports to track key metrics and identify areas for improvement.
- Conduct ad-hoc analysis to address specific business questions and provide data-driven recommendations to senior management.

Required Skills:
- Proficiency in BI tools like Tableau, Power BI, or similar platforms.
- Programming skills (SQL, Python) for data manipulation and analysis.
- Strong analytical and problem-solving skills; ability to interpret data and translate insights into actionable business recommendations.
- Excellent communication and presentation skills to convey complex financial data to both technical and non-technical stakeholders.
- Flexible with shifts, including night shifts.
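To make the "Programming skills (SQL, Python) for data manipulation" requirement concrete, here is a minimal hedged sketch of the kind of KPI computation such a role automates. The record layout, field names, and figures are invented for illustration and do not come from the job posting:

```python
# Hypothetical monthly financials; names and figures are illustrative only.
records = [
    {"month": "2025-01", "revenue": 120_000.0, "cost": 90_000.0},
    {"month": "2025-02", "revenue": 150_000.0, "cost": 100_000.0},
]

def profit_margin(rows):
    """Compute profit margin (profit / revenue) per month, rounded to 3 places."""
    return {r["month"]: round((r["revenue"] - r["cost"]) / r["revenue"], 3)
            for r in rows}

print(profit_margin(records))  # {'2025-01': 0.25, '2025-02': 0.333}
```

In a real BI pipeline the same calculation would typically be pushed down into SQL or a tool like Power BI, with Python reserved for cleansing and ad-hoc analysis.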
Skills Required
Role: Business Intelligence – Senior Manager / AGM – Mumbai
Industry Type: ITES/BPO/KPO
Functional Area:
Required Education: B Com
Employment Type: Full Time, Permanent
Key Skills: BUSINESS INTELLIGENCE, DATA ANALYTICS, POWER BI, TABLEAU
Other Information
Job Code: GO/JC/389/2025
Recruiter Name: Marilakshmi S
Python has become one of the most popular programming languages in India, with a high demand for skilled professionals across various industries. Job seekers in India have a plethora of opportunities in the field of Python development. Let's delve into the key aspects of the Python job market in India:
The average salary range for Python professionals in India varies based on experience levels. Entry-level positions can expect a salary between INR 3-6 lakhs per annum, while experienced professionals can earn between INR 8-20 lakhs per annum.
In the field of Python development, a typical career path may include roles such as Junior Developer, Developer, Senior Developer, Team Lead, and eventually progressing to roles like Tech Lead or Architect.
In addition to Python proficiency, employers often expect professionals to have skills in areas such as: - Data Structures and Algorithms - Object-Oriented Programming - Web Development frameworks (e.g., Django, Flask) - Database management (e.g., SQL, NoSQL) - Version control systems (e.g., Git)
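As a small self-contained illustration of the first two skill areas above (data structures and algorithms, and object-oriented programming), the example below is a generic sketch; the `Offer` class and the salary figures are invented for illustration:

```python
import heapq

class Offer:
    """A job offer with a salary in lakhs per annum (OOP: a simple class)."""
    def __init__(self, company, lpa):
        self.company = company
        self.lpa = lpa

def top_offers(offers, k):
    """Return the k highest-paying offers using a heap (O(n log k))."""
    return heapq.nlargest(k, offers, key=lambda o: o.lpa)

offers = [Offer("A", 6), Offer("B", 12), Offer("C", 9)]
print([o.company for o in top_offers(offers, 2)])  # ['B', 'C']
```

Interviewers commonly probe exactly this combination: knowing that `heapq.nlargest` beats sorting the whole list when k is small is the kind of complexity trade-off the "Data Structures and Algorithms" expectation refers to.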
Some commonly asked Python interview questions include:
- Explain the difference between the __str__ and __repr__ methods in Python. (medium)
- What is the purpose of the __init__ method in Python? (basic)
- What is the difference between the append() and extend() methods in Python lists? (basic)
- What is the significance of the __name__ variable in Python? (medium)
- What is the purpose of the pass statement in Python? (basic)

As you explore Python job opportunities in India, remember to brush up on your skills, prepare for interviews diligently, and apply confidently. The demand for Python professionals is on the rise, and this could be your stepping stone to a rewarding career in the tech industry. Good luck!
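A short snippet illustrating several of the interview topics above in one place (__init__, __str__ vs __repr__, append() vs extend(), __name__, and pass); the `Job` class is just an illustrative example:

```python
class Job:
    def __init__(self, title):   # __init__ initializes each new instance
        self.title = title

    def __str__(self):           # user-friendly text, used by print()/str()
        return self.title

    def __repr__(self):          # unambiguous text, used in containers/REPL
        return f"Job({self.title!r})"

job = Job("Python Developer")
print(str(job))    # Python Developer
print(repr(job))   # Job('Python Developer')

skills = ["Python"]
skills.append(["SQL", "Git"])   # append adds ONE element (here, a whole list)
more = ["Python"]
more.extend(["SQL", "Git"])     # extend adds EACH element of the iterable
print(skills)  # ['Python', ['SQL', 'Git']]
print(more)    # ['Python', 'SQL', 'Git']

# __name__ is "__main__" only when the file is run directly, not imported;
# pass is a no-op placeholder where a statement is syntactically required.
if __name__ == "__main__":
    pass
```

Working through tiny examples like this is a quick way to turn the question list above into answers you can demonstrate at a whiteboard.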