4.0 years
15 - 30 Lacs
Patna, Bihar, India
Remote
Experience: 4+ years
Salary: INR 1500000-3000000 / year (based on experience)
Expected Notice Period: 15 Days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full Time, Permanent position (Payroll and Compliance to be managed by NuStudio.AI)
(*Note: This is a requirement for one of Uplers' clients - an AI-first, API-powered Data Platform)

What do you need for this opportunity?
Must-have skills: Databricks, dbt, Delta Lake, Spark, Unity Catalog, AI, Airflow, GCP (BigQuery, Pub/Sub, Dataflow, Cloud Functions, Cloud Storage), PySpark, Databricks Workflows, ETL/ELT, AWS, Hadoop

An AI-first, API-powered Data Platform is looking for:
We're scaling our platform and seeking Data Engineers who are passionate about building high-performance data products and analytical pipelines in the cloud to power real-time AI systems. As a Data Engineer, you'll:
- Build scalable ETL/ELT and streaming data pipelines using GCP (BigQuery, Pub/Sub, Dataflow, Cloud Storage, Cloud Functions) and PySpark
- Orchestrate data workflows with Airflow, Cloud Functions, or Databricks Workflows
- Work across batch and real-time architectures that feed LLMs and AI/ML systems
- Own feature engineering pipelines that power production models and intelligent agents
- Collaborate with platform and ML teams to design observability, lineage, and cost-aware, performant solutions
- Bonus: experience with AWS, Databricks, or Hadoop (Delta Lake, Spark, dbt, Unity Catalog), or interest in building on them

Why Us?
- Building production-grade data & AI solutions
- Your pipelines directly impact mission-critical and client-facing interactions
- Lean team, no red tape: build, own, ship
- Remote-first with an async culture that respects your time
- Competitive comp and benefits

Our Stack: Python, SQL, GCP/Azure/AWS, Spark, Kafka, Airflow, Databricks, dbt, Kubernetes, LangChain, LLMs

How to apply for this opportunity?
Step 1: Click on Apply! and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support you through any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for those as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
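The batch ETL work this role describes can be reduced to a small sketch. The stdlib-only Python example below shows the extract-transform-load shape; in the posting's actual stack each stage would run on PySpark, Dataflow, or BigQuery, and the record fields here are invented for illustration:

```python
# Stdlib-only sketch of the extract -> transform -> load shape described
# above. In the posting's stack each stage would be PySpark, Dataflow,
# or BigQuery SQL; the record fields here are invented for illustration.
import json

def extract(raw_lines):
    """Parse newline-delimited JSON, skipping malformed rows."""
    for line in raw_lines:
        try:
            yield json.loads(line)
        except json.JSONDecodeError:
            continue  # real pipelines route these to a dead-letter sink

def transform(records):
    """Normalize fields and coerce types; drop rows missing keys."""
    for rec in records:
        if "user" in rec and "amount" in rec:
            yield {"user": rec["user"].strip().lower(),
                   "amount_inr": round(float(rec["amount"]), 2)}

def load(rows, target):
    """Append rows to an in-memory target (stand-in for a warehouse)."""
    target.extend(rows)
    return len(target)

raw = ['{"user": " Asha ", "amount": "120.5"}',
       'not json',
       '{"user": "ravi", "amount": 99}']
warehouse = []
load(transform(extract(raw)), warehouse)
print(warehouse)  # two clean rows; the malformed line is skipped
```

The generator chain mirrors how streaming stages compose: each step pulls from the previous one, so nothing is materialized until load time.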
Posted 1 week ago
4.0 - 6.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Data Science @ Dream Sports: Data Science at Dream Sports comprises seasoned data scientists striving to drive value with data across all our initiatives. The team has developed state-of-the-art solutions for forecasting and optimization, data-driven risk-prevention systems, causal inference, and recommender systems to enhance product and user experience. We are a team of Machine Learning Scientists and Research Scientists with a portfolio of projects ranging from production ML systems that we conceptualize, build, support, and innovate upon, to longer-term research projects with potential game-changing impact for Dream Sports. This is a unique opportunity for highly motivated candidates to work on real-world applications of machine learning in the sports industry, with access to state-of-the-art resources, infrastructure, and data streaming from 250 million users, and to contribute to our collaboration with the Columbia Dream Sports AI Innovation Center.

Your Role:
- Executing clean experiments rigorously against pertinent performance guardrails and analysing performance metrics to infer actionable findings
- Developing and maintaining services with proactive monitoring, incorporating industry best practices for optimal service quality and risk mitigation
- Breaking down complex projects into actionable tasks that adhere to set management practices and ensure stakeholder visibility
- Managing the end-to-end lifecycle of large-scale ML projects: data preparation, model training, deployment, monitoring, and upgrading of experiments
- Leveraging a strong foundation in ML, statistics, and deep learning to implement research-backed techniques for model development
- Staying abreast of the industry's best ML practices and developments to mentor and guide team members

Qualifiers:
- 4-6 years of experience in building, deploying, and maintaining ML solutions
- Extensive experience with Python, SQL, TensorFlow/PyTorch, and at least one distributed data framework (Spark/Ray/Dask)
- Working knowledge of machine learning, probability & statistics, and deep learning fundamentals
- Experience in designing end-to-end machine learning systems that work at scale

About Dream Sports: Dream Sports is India's leading sports technology company with 250 million users, housing brands such as Dream11, the world's largest fantasy sports platform; FanCode, a premier sports content & commerce platform; and DreamSetGo, a sports experiences platform. Dream Sports is based in Mumbai and has a workforce of close to 1,000 'Sportans'. Founded in 2008 by Harsh Jain and Bhavit Sheth, Dream Sports' vision is to 'Make Sports Better' for fans through the confluence of sports and technology. For more information: https://dreamsports.group/

Dream11 is the world's largest fantasy sports platform with 230 million users playing fantasy cricket, football, basketball & hockey. Dream11 is the flagship brand of Dream Sports, India's leading sports technology company, and has partnerships with several national & international sports bodies and cricketers.
Posted 1 week ago
6.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Position: Sr. Data Operations
Years of Experience: 6-8 years
Job Location: S.B. Road, Pune; remote for other locations

The Position
We are seeking a seasoned engineer with a passion for changing the way millions of people save energy. You'll work within the Deliver and Operate team to build and improve our platforms, deliver flexible and creative solutions to our utility partners and end users, and help us achieve our ambitious goals for our business and the planet. We are seeking a highly skilled and detail-oriented Software Engineer II for the Data Operations team to maintain our data infrastructure, pipelines, and workflows. You will play a key role in ensuring the smooth ingestion, transformation, validation, and delivery of data across systems. This role is ideal for someone with a strong understanding of data engineering and operational best practices who thrives in high-availability environments.

Responsibilities & Skills
You should:
- Monitor and maintain data pipelines and ETL processes to ensure reliability and performance.
- Automate routine data operations tasks and optimize workflows for scalability and efficiency.
- Troubleshoot and resolve data-related issues, ensuring data quality and integrity.
- Collaborate with data engineering, analytics, and DevOps teams to support data infrastructure.
- Implement monitoring, alerting, and logging systems for data pipelines.
- Maintain and improve data governance, access controls, and compliance with data policies.
- Support deployment and configuration of data tools, services, and platforms.
- Participate in the on-call rotation and incident response for data system outages or failures.

Required Skills:
- 5+ years of experience in data operations, data engineering, or a related role.
- Strong SQL skills and experience with relational databases (e.g., PostgreSQL, MySQL).
- Proficiency with data pipeline tools (e.g., Apache Airflow).
- Experience with cloud platforms (AWS, GCP) and cloud-based data services (e.g., Redshift, BigQuery).
- Familiarity with scripting languages such as Python, Bash, or shell.
- Knowledge of version control (e.g., Git) and CI/CD workflows.

Qualifications
- Bachelor's degree in Computer Science, Engineering, Data Science, or a related field.
- Experience with data observability tools (e.g., Splunk, Datadog).
- Background in DevOps or SRE with a focus on data systems.
- Exposure to infrastructure-as-code (e.g., Terraform, CloudFormation).
- Knowledge of streaming data platforms (e.g., Kafka, Spark Streaming).
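As an illustration of the data-quality and monitoring duties above, here is a minimal, stdlib-only validation gate of the kind a Data Operations engineer wires into a pipeline. Real deployments typically lean on Airflow checks or dedicated tools like Great Expectations; the field names and rules below are invented:

```python
# Stdlib-only data-quality gate: validate a batch against per-field
# rules and report failures rather than silently loading bad rows.
# Field names and rules are invented for illustration; production
# setups often use Airflow checks, Great Expectations, or dbt tests.

RULES = {
    "id":    lambda v: isinstance(v, int) and v > 0,
    "email": lambda v: isinstance(v, str) and "@" in v,
    "kwh":   lambda v: isinstance(v, (int, float)) and v >= 0,
}

def validate_batch(rows):
    """Split rows into (good, failures); failures are (row_index, field)."""
    good, failures = [], []
    for i, row in enumerate(rows):
        bad_fields = [f for f, ok in RULES.items() if not ok(row.get(f))]
        if bad_fields:
            failures.extend((i, f) for f in bad_fields)
        else:
            good.append(row)
    return good, failures

batch = [{"id": 1, "email": "a@example.com", "kwh": 12.5},
         {"id": -3, "email": "not-an-email", "kwh": 4.0}]
good, failures = validate_batch(batch)
print(len(good), failures)  # the second row fails the id and email rules
```

Routing failures to an alerting channel instead of raising keeps the pipeline running while surfacing the bad rows for triage.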
Posted 1 week ago
1.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Roles and Responsibilities: ● Become a guardian of the SkilloVilla brand voice, ensuring our tone is consistent, inspiring, and clear across all written materials. ● Conduct in-depth research on industry-related topics and identify customer needs to develop original, impactful content for performance marketing ads, blogs, articles, social media, etc. ● Own our social media calendar, creating and scheduling compelling posts with influencers that spark conversation and build our community. ● Assist the marketing team in creating persuasive content for various campaigns and new initiatives. ● Leverage AI tools responsibly to assist in content ideation and creation, ensuring all output is high-quality and aligns with our brand standards. ● Proofread and edit all content meticulously to ensure grammatical accuracy, clarity, and correctness. ● Monitor and analyse content performance, using key metrics to measure impact and recommend new content strategies. Our Ideal Candidate Has: ● At least 1 year of experience in content creation or digital marketing. ● A compelling portfolio of content writing or copywriting experience that showcases your creativity and storytelling ability with a solid understanding of SEO best practices. ● Excellent written and verbal communication skills, with a keen eye for grammar and detail. ● Hands-on experience with AI content creation tools and an analytical mindset to make data-informed decisions. ● The ability to work autonomously as a self-starter, juggling competing priorities in a fast-paced environment. ● A Bachelor's degree in Communications, Marketing, English, Journalism, or a related field. ● A basic understanding of Data Analytics concepts and owning a social media account with decent followers is a plus. Compensation & Benefits: ● CTC range: ₹5,00,000 to ₹5,50,000 per annum ● A vibrant startup environment with significant learning and growth opportunities. ● The opportunity to work in a mission-driven company making a real impact.
Posted 1 week ago
2.0 years
0 Lacs
India
On-site
The Role
We are hiring an AI/ML Developer (India) to join our India team in support of a large global client! You will be responsible for developing, deploying, and maintaining AI and machine learning models. Your expertise in Python, cloud services, databases, and big data technologies will be instrumental in creating scalable and efficient AI applications.

What You Will Be Doing
• Develop, train, and deploy machine learning models for predictive analytics, classification, and clustering.
• Implement AI-based solutions using frameworks such as TensorFlow, PyTorch, and Scikit-learn.
• Work with cloud platforms including AWS (SageMaker, Lambda, S3), Azure, and Google Cloud (Vertex AI).
• Integrate and fine-tune Hugging Face transformer models (e.g., BERT, GPT) for NLP tasks such as text classification, summarization, and sentiment analysis.
• Develop AI automation solutions, including chatbot implementations using Microsoft Teams and Azure AI.
• Work with big data technologies such as Apache Spark and Snowflake for large-scale data processing and analytics.
• Design and optimize ETL pipelines for data quality management, transformation, and validation.
• Utilize SQL, MySQL, PostgreSQL, and MongoDB for database management and query optimization.
• Create interactive data visualizations using Tableau and Power BI to drive business insights.
• Work with Large Language Models (LLMs) for AI-driven applications, including fine-tuning, training, and deploying models for conversational AI, text generation, and summarization.
• Develop and implement agentic AI systems, enabling autonomous decision-making AI agents that can adapt, learn, and optimize tasks in real time.

What You Bring Along
• 2+ years of experience applying AI to practical uses.
• Strong programming skills in Python and SQL, and experience with ML frameworks such as TensorFlow, PyTorch, or Scikit-learn.
• Knowledge of basic algorithms and object-oriented and functional design principles.
• Proficiency in data analytics libraries like Pandas, NumPy, Matplotlib, and Seaborn.
• Hands-on experience with cloud platforms such as AWS, Azure, and Google Cloud.
• Experience with big data processing using Apache Spark and Snowflake.
• Knowledge of NLP and AI model implementations using Hugging Face and cloud-based AI services.
• Strong understanding of database management, query optimization, and data warehousing.
• Experience with data visualization tools such as Tableau and Power BI.
• Ability to work in a collaborative environment and adapt to new AI technologies.
• Strong analytical and problem-solving skills.

Education:
• Bachelor's degree in Computer Science, Data Science, AI/ML, or a related field.
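As a toy illustration of the train-then-predict workflow for classification mentioned above, here is a nearest-centroid classifier in plain Python. Production models would use the frameworks listed in the posting (Scikit-learn, TensorFlow, PyTorch); the two-feature data here is invented:

```python
# Toy nearest-centroid classifier showing the train -> predict workflow
# in plain Python. Production work would use Scikit-learn, TensorFlow,
# or PyTorch as listed above; the two-feature data here is invented.
from collections import defaultdict
from math import dist

def train(samples):
    """samples: list of (feature_tuple, label). Returns label -> centroid."""
    groups = defaultdict(list)
    for features, label in samples:
        groups[label].append(features)
    return {label: tuple(sum(col) / len(col) for col in zip(*vecs))
            for label, vecs in groups.items()}

def predict(centroids, features):
    """Return the label whose centroid is nearest (Euclidean distance)."""
    return min(centroids, key=lambda label: dist(centroids[label], features))

data = [((1.0, 1.0), "low"), ((1.2, 0.8), "low"),
        ((8.0, 9.0), "high"), ((9.0, 8.5), "high")]
model = train(data)
print(predict(model, (1.1, 0.9)))  # lands in the "low" cluster
```

The same two-phase shape (fit on labeled data, then score new points) carries over directly to `sklearn.neighbors.NearestCentroid` and beyond.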
Posted 1 week ago
4.0 - 7.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Skills: Big Data, PySpark, Hive, Spark optimization
Good to have: GCP
Experience: 4 to 7 years
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Do you thrive in a dynamic environment and have a passion for tackling complex challenges in risk management? If your answer is yes, then we have good news for you 😄

We are currently helping our Big 4 client's Market Risk and Quantitative Risk teams in Mumbai, Bangalore, and Pune. Opportunities exist for experienced professionals in Market Risk Business Analysis, Quantitative Development (Python), and Big Data Engineering (Java/Spark/Hadoop). Join a globally recognized firm and contribute to impactful projects in a dynamic environment.

Who We Are
At Risk Inn, we bridge the gap between talented professionals in risk management and forward-thinking organizations worldwide via our initiative, the Global Careers Club (GCC). Our mission extends beyond conventional hiring, connecting professionals from the US, India, Europe, Africa, and beyond. We foster an inclusive environment that empowers professionals in their career journeys, providing tailored support, unique opportunities, and a dedicated community. We help organizations find individuals equipped with not only the technical expertise but also the mindset required to excel in a dynamic financial landscape.

The Opportunities
We have three exciting openings for individuals with a strong foundation in market risk, quantitative analysis, and programming:
- Market Risk Business Analyst (4-8 years' experience): You will be responsible for in-depth analysis of market risk exposures, Basel 2.5 and FRTB implementation, and supporting the development and implementation of risk management frameworks. A strong understanding of market risk concepts, Basel regulations, and FRTB is preferred.
- Python Developer (4-8 years' experience): You will leverage your Python expertise to develop and maintain robust risk management tools and applications. Experience working with risk data and a passion for quantitative analysis are a plus.
- Java/Spark on Big Data - Hadoop Developer (5-8 years' experience): You will play a key role in building and maintaining big data infrastructure using Java, Spark, and Hadoop technologies. Experience with server-side development and a strong understanding of big data concepts are essential.

What You Will Bring
- A Master's degree (MBA, FRM, or a relevant quantitative field) is preferred.
- 4-8 years of experience in market risk analysis, quantitative analysis, or risk technology, depending on the position.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration skills.
- Proficiency in relevant programming languages (Python, Java, etc.) for the chosen role.
- A passion for learning and staying abreast of the latest developments in market risk management.

What You Will Gain
- Industry expertise: Gain exposure to cutting-edge market risk practices and work alongside industry leaders.
- Meaningful work: Contribute to projects that have a significant impact on our clients' financial stability.
- Career growth: Develop your skills and expertise in a supportive and collaborative environment.
- Global network: Be part of a world-class organization with a global presence.

Ready to take the next step in your career? We encourage you to apply if you are a highly motivated individual with a passion for market risk and quantitative analysis. Please submit your resume to umar@riskinn.com

Apply to make a real difference in the world of financial risk management!
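The quantitative tooling the Python Developer role describes can be illustrated with a classic market-risk calculation: one-day Value-at-Risk by historical simulation. This is a hedged sketch only; index and sign conventions vary by desk, and the P&L series below is invented:

```python
# One-day Value-at-Risk by historical simulation: sort past P&L and
# read off the loss at the chosen confidence level. Index and sign
# conventions differ across desks; the P&L series below is invented.
def historical_var(pnl_history, confidence=0.99):
    """Return VaR as a positive loss number at the given confidence."""
    ordered = sorted(pnl_history)               # worst P&L first
    idx = int((1 - confidence) * len(ordered))  # tail observation
    return max(0.0, -ordered[idx])

pnl = list(range(-50, 50))  # 100 fake daily P&L observations
print(historical_var(pnl))        # 99% VaR from this toy series
print(historical_var(pnl, 0.95))  # 95% VaR is a smaller loss
```

FRTB's expected-shortfall measure averages the tail beyond this quantile instead of reading off a single point, but the sorted-history machinery is the same.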
Posted 1 week ago
2.0 - 4.0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
As a Graphic Designer, you're not just creating visuals – you're crafting experiences. At Pittie Group, you'll be the visual storyteller, shaping how our products are perceived and embraced. Collaborate with the Product team to define compelling product positioning, then bring those concepts to life across a dynamic range of platforms.

Imagine:
- Conceptualizing campaigns: From initial spark to impactful execution, you'll own the visual narrative.
- Building brand presence: Your designs will breathe life into our brand across social media, websites, performance marketing, and even traditional offline channels.
- Crafting compelling narratives: Transform sales pitches and corporate presentations into engaging visual journeys.
- Designing immersive experiences: Contribute to the visual landscape of our events and experiential marketing initiatives.

Your Toolkit:
- Master of design: Your expertise spans the spectrum of digital and (occasionally) print, from social media graphics and website banners to presentations, brochures, and event signage.
- Brand champion: You'll be the guardian of our brand identity, ensuring consistency in color palettes, typography, and overall style.
- Collaborative spirit: Partner with Marketing, Sales, and Technology teams, seamlessly integrating feedback to refine and elevate your designs.
- Tech-savvy artist: Embrace the power of AI tools, digital optimization techniques, and collaborative platforms like Slack, Notion, Behance, Google Workspace, Canva, Gamma, and even ChatGPT extensions.
- Motion graphics enthusiast (bonus!): Bring an extra dimension to your creations with simple animations and video loops for social media.

Qualifications:
- Bachelor's degree in Graphic Design or a related field
- 2-4 years of experience in graphic design
- Proficiency in Adobe Creative Suite
- Strong communication, conceptual thinking, typography, and design skills
- Portfolio of work
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Company: Qualcomm India Private Limited
Job Area: Engineering Group > Software Engineering

General Summary
As a leading technology innovator, Qualcomm pushes the boundaries of what's possible to enable next-generation experiences and drives digital transformation to help create a smarter, connected future for all. As a Qualcomm Software Engineer, you will design, develop, create, modify, and validate embedded and cloud-edge software, applications, and/or specialized utility programs that launch cutting-edge, world-class products that meet and exceed customer needs. Qualcomm Software Engineers collaborate with systems, hardware, architecture, test engineers, and other teams to design system-level software solutions and obtain information on performance requirements and interfaces.

Minimum Qualifications
Bachelor's degree in Engineering, Information Systems, Computer Science, or a related field.

Senior Engineer
Job Title: Senior Machine Learning & Data Engineer
We are looking for a highly skilled and experienced Machine Learning & Data Engineer to join our team. This hybrid role blends the responsibilities of a data engineer and a machine learning engineer, with a strong emphasis on Python development. You will be instrumental in designing scalable data pipelines, building and deploying ML/NLP models, and enabling data-driven decision-making across the organization.

Key Responsibilities
Data Engineering & Infrastructure
- Design and implement robust ETL pipelines and data integration workflows using SQL, NoSQL, and big data technologies (e.g., Spark, Hadoop).
- Optimize data storage and retrieval using relational and non-relational databases (e.g., PostgreSQL, MongoDB, Cassandra).
- Ensure data quality, validation, and governance across systems.
- Develop and maintain data models and documentation for data flows and architecture.

Machine Learning & NLP
- Build, fine-tune, and deploy ML/NLP models using frameworks like TensorFlow, PyTorch, and Scikit-learn.
- Apply advanced NLP techniques including Transformers, BERT, and LLM fine-tuning.
- Implement Retrieval-Augmented Generation (RAG) pipelines using LangChain, LlamaIndex, and vector databases (e.g., FAISS, Milvus).
- Operationalize ML models using APIs, model registries (e.g., Hugging Face), and cloud services (e.g., SageMaker, Azure ML).

Python Development
- Develop scalable backend services using Python frameworks such as FastAPI, Flask, or Django.
- Automate data workflows and model training pipelines using Python libraries (e.g., Pandas, NumPy, SQLAlchemy).
- Collaborate with cross-functional teams to integrate ML solutions into production systems.

Collaboration & Communication
- Work closely with data scientists, analysts, and software engineers in Agile/Scrum teams.
- Translate business requirements into technical solutions.
- Maintain clean, well-documented code and contribute to knowledge sharing.

Required Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Proven experience in both data engineering and machine learning roles.
- Strong Python programming skills and experience with modern Python libraries and frameworks.
- Deep understanding of ML/NLP concepts and practical experience with LLMs and RAG architectures.
- Proficiency in SQL and experience with both SQL and NoSQL databases.
- Experience with big data tools (e.g., Spark, PySpark) and cloud platforms (AWS, Azure).
- Familiarity with data visualization tools like Power BI or Tableau.
- Excellent problem-solving, communication, and collaboration skills.

Engineer
Job Title: Automation Engineer

Job Description
We are seeking a skilled and experienced Automation Engineer to join our team. As a C#/Python developer, you will play a pivotal role in developing and deploying advanced solutions to drive our product test automation. You will collaborate closely with testers, product managers, and stakeholders to ensure the successful implementation and operation of automation solutions.
The ideal candidate will have a strong background in API development with C# and Python, with experience in deploying scalable solutions.

Responsibilities
- Design, develop, and maintain core APIs, mainly using C#.
- Collaborate with cross-functional teams to understand requirements and implement API solutions.
- Create and execute unit tests for APIs to ensure software quality.
- Identify, analyze, and troubleshoot issues in API development and testing.
- Continuously improve and optimize API development processes.
- Document API specifications, procedures, and results.
- Stay updated with the latest industry trends and technologies in API development.

Requirements
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience in developing APIs and scripts/apps using C#; knowledge of Python is a plus.
- Experience using Visual Studio for development.
- Experience in the wireless domain is a plus.
- Strong understanding of software testing principles and methodologies.
- Proficiency in the C# programming language.
- Experience with test automation tools and best practices.
- Familiarity with CI/CD pipelines and version control systems (e.g., Perforce).
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork skills.

Applicants: Qualcomm is an equal opportunity employer. If you are an individual with a disability and need an accommodation during the application/hiring process, rest assured that Qualcomm is committed to providing an accessible process. You may e-mail disability-accomodations@qualcomm.com or call Qualcomm's toll-free number found here. Upon request, Qualcomm will provide reasonable accommodations to support individuals with disabilities to be able to participate in the hiring process. Qualcomm is also committed to making our workplace accessible for individuals with disabilities. (Keep in mind that this email address is used to provide reasonable accommodations for individuals with disabilities. We will not respond here to requests for updates on applications or resume inquiries.)

Qualcomm expects its employees to abide by all applicable policies and procedures, including but not limited to security and other requirements regarding protection of Company confidential information and other confidential and/or proprietary information, to the extent those requirements are permissible under applicable law.

To all Staffing and Recruiting Agencies: Our Careers Site is only for individuals seeking a job at Qualcomm. Staffing and recruiting agencies and individuals being represented by an agency are not authorized to use this site or to submit profiles, applications, or resumes, and any such submissions will be considered unsolicited. Qualcomm does not accept unsolicited resumes or applications from agencies. Please do not forward resumes to our jobs alias, Qualcomm employees, or any other company location. Qualcomm is not responsible for any fees related to unsolicited resumes/applications.

If you would like more information about this role, please contact Qualcomm Careers.
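The RAG pipelines mentioned in the Machine Learning & NLP responsibilities above can be illustrated by their retrieval step. In this stdlib-only sketch, bag-of-words cosine similarity stands in for a real embedding model and vector database (FAISS, Milvus); the corpus and query are invented:

```python
# Retrieval step of a RAG pipeline, reduced to bag-of-words cosine
# similarity over a toy corpus. A real pipeline would embed text with
# a model and query a vector database (e.g., FAISS or Milvus) before
# passing the hits to an LLM; corpus and query here are invented.
from collections import Counter
from math import sqrt

def cosine(a, b):
    """Cosine similarity of two token-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = (sqrt(sum(v * v for v in a.values()))
            * sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def retrieve(query, docs, k=1):
    """Return the k docs most similar to the query."""
    q = Counter(query.lower().split())
    ranked = sorted(docs, key=lambda d: cosine(q, Counter(d.lower().split())),
                    reverse=True)
    return ranked[:k]

corpus = ["spark handles batch processing",
          "kafka streams real-time events",
          "airflow schedules data workflows"]
print(retrieve("which tool schedules workflows", corpus))
```

In a full RAG loop the retrieved passages would be injected into the LLM prompt as grounding context before generation.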
Posted 1 week ago
7.0 years
0 Lacs
Delhi, India
On-site
We are looking for a skilled Scala Developer with at least 7 years of professional experience in building scalable, high-performance backend applications. The ideal candidate should have a strong grasp of functional programming, data processing frameworks, and cloud-based environments.

Key Responsibilities:
- Design, develop, test, and deploy backend services and APIs using Scala.
- Collaborate with cross-functional teams including product managers, frontend developers, and QA engineers.
- Optimize and maintain existing codebases, ensuring performance, scalability, and reliability.
- Write clean, well-documented, and testable code following best practices.
- Work with tools and technologies like Akka, the Play Framework, and Kafka.
- Participate in code reviews, knowledge sharing, and mentoring junior developers.
- Integrate with SQL/NoSQL databases and third-party APIs.
- Build and maintain data pipelines using Spark or similar tools (if required).
Posted 1 week ago
10.0 years
0 Lacs
Delhi, India
On-site
YOE: 10 to 15 years
Skills required: Java, Python, HDFS, YARN, MapReduce, Hive, Kafka, Spark, Airflow, Presto, HLD, LLD, SQL, NoSQL, MongoDB, etc.
Preference: Tier 1 colleges/universities

Role & Responsibilities
- Lead and mentor a team of data engineers, ensuring high performance and career growth.
- Architect and optimize scalable data infrastructure, ensuring high availability and reliability.
- Drive the development and implementation of data governance frameworks and best practices.
- Work closely with cross-functional teams to define and execute a data roadmap.
- Optimize data processing workflows for performance and cost efficiency.
- Ensure data security, compliance, and quality across all data platforms.
- Foster a culture of innovation and technical excellence within the data team.

Ideal Candidate
- Candidates from Tier 1 colleges preferred.
- MUST have experience in product startups, having implemented data engineering systems from an early stage in the company.
- MUST have 10+ years of experience in software/data engineering, with at least 3+ years in a leadership role.
- MUST have expertise in backend development with programming languages such as Java, PHP, Python, Node.js, GoLang, JavaScript, HTML, and CSS.
- MUST be proficient in SQL, Python, and Scala for data processing and analytics.
- Strong understanding of cloud platforms (AWS, GCP, or Azure) and their data services.
- MUST have a strong foundation and expertise in HLD and LLD, as well as design patterns, preferably using Spring Boot or Google Guice.
- MUST have experience in big data technologies such as Spark, Hadoop, Kafka, and distributed computing frameworks.
- Hands-on experience with data warehousing solutions such as Snowflake, Redshift, or BigQuery.
- Deep knowledge of data governance, security, and compliance (GDPR, SOC 2, etc.).
- Experience with NoSQL databases like Redis, Cassandra, MongoDB, and TiDB.
- Familiarity with automation and DevOps tools like Jenkins, Ansible, Docker, Kubernetes, Chef, Grafana, and ELK.
- Proven ability to drive technical strategy and align it with business objectives.
- Strong leadership, communication, and stakeholder management skills.

Preferred Qualifications:
- Experience in machine learning infrastructure or MLOps is a plus.
- Exposure to real-time data processing and analytics.
- Interest in data structures, algorithm analysis and design, multicore programming, and scalable architecture.
- Prior experience in a SaaS or high-growth tech company.
Posted 1 week ago
8.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Engineering at Innovaccer
With every line of code, we accelerate our customers' success, turning complex challenges into innovative solutions. Collaboratively, we transform each data point we gather into valuable insights for our customers. Join us and be part of a team that's turning dreams of better healthcare into reality, one line of code at a time. Together, we're shaping the future and making a meaningful impact on the world.

About the Role
We're on a mission to completely change the way healthcare works by building the most powerful Healthcare Intelligence Platform (Gravity) ever made. Using an AI-first approach, our goal is to turn complicated health data into real-time insights that help hospitals, clinics, pharmaceutical companies, and researchers make faster, smarter decisions.

We're building a unified platform from the ground up, specifically for healthcare. This platform will bring together everything from:
- Collecting data from different systems (Data Acquisition)
- Combining and cleaning it (Integration, Data Quality)
- Managing patient records and provider info (Master Data Management)
- Tagging and organizing it (Data Classification & Governance)
- Running analytics and building AI models (Analytics, AI Studio)
- Creating custom healthcare apps (App Marketplace)
- Using AI as a built-in assistant (AI as BI + agent-first approach)

This platform will let healthcare teams build solutions quickly, without starting from scratch each time. For example, they'll be able to:
- Track and manage kidney disease patients across different hospitals
- Speed up clinical trials by analyzing real-world patient data
- Help pharmacies manage their stock better with predictive supply chain tools
- Detect early signs of diseases like diabetes or cancer with machine learning
- Ensure regulatory compliance automatically through built-in checks

This is a huge, complex, and high-impact challenge, and we're looking for a Staff Engineer to help lead the way.
In this role, you'll:
- Design and build scalable, secure, and reliable systems
- Create core features like data quality checks, metadata management, data lineage tracking, and privacy/compliance layers
- Work closely with other engineers, product managers, and healthcare experts to bring the platform to life

If you're passionate about using technology to make a real difference in the world and enjoy solving big engineering problems, we'd love to connect.

A Day in the Life
- Architect, design, and build scalable data tools and frameworks.
- Collaborate with cross-functional teams to ensure data compliance, security, and usability.
- Lead initiatives around metadata management, data lineage, and data cataloging.
- Define and evangelize standards and best practices across data engineering teams.
- Own the end-to-end lifecycle of tooling, from prototyping to production deployment.
- Mentor and guide junior engineers and contribute to technical leadership across the organization.
- Drive innovation in privacy-by-design, regulatory compliance (e.g., HIPAA), and data observability solutions.

What You Need
- 8+ years of experience in software engineering, with strong experience building distributed systems.
- Proficiency in backend development (Python, Java, Scala, or Go) and familiarity with RESTful API design.
- Expertise in modern data stacks: Kafka, Spark, Airflow, Snowflake, etc.
- Experience with open-source data governance frameworks like Apache Atlas, Amundsen, or DataHub is a big plus.
- Familiarity with cloud platforms (AWS, Azure, GCP) and their native data governance offerings.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

Here's What We Offer
- Generous leave: Enjoy generous leave benefits of up to 40 days.
- Parental leave: Leverage one of the industry's best parental leave policies to spend time with your new addition.
- Sabbatical: Want to focus on skill development, pursue an academic career, or just take a break? We've got you covered.
- Health Insurance: We offer comprehensive health insurance to support you and your family, covering medical expenses related to illness, disease, or injury, and extending support to the family members who matter most.
- Care Program: Whether it’s a celebration or a time of need, we’ve got you covered with care vouchers to mark major life events. Through our Care Vouchers program, employees receive thoughtful gestures for significant personal milestones and moments of need.
- Financial Assistance: Life happens, and when it does, we’re here to help. Our financial assistance policy offers support through salary advances and personal loans for genuine personal needs, ensuring help is there when you need it most.

Innovaccer is an equal-opportunity employer. We celebrate diversity, and we are committed to fostering an inclusive and diverse workplace where all employees, regardless of race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, marital status, or veteran status, feel valued and empowered.

Disclaimer: Innovaccer does not charge fees or require payment from individuals or agencies for securing employment with us. We do not guarantee job spots or engage in any financial transactions related to employment. If you encounter any posts or requests asking for payment or personal information, we strongly advise you to report them immediately to our HR department at px@innovaccer.com. Additionally, please exercise caution and verify the authenticity of any requests before disclosing personal and confidential information, including bank account details.

About Innovaccer

Innovaccer activates the flow of healthcare data, empowering providers, payers, and government organizations to deliver intelligent and connected experiences that advance health outcomes.
The Healthcare Intelligence Cloud equips every stakeholder in the patient journey to turn fragmented data into proactive, coordinated actions that elevate the quality of care and drive operational performance. Leading healthcare organizations like CommonSpirit Health, Atlantic Health, and Banner Health trust Innovaccer to integrate a system of intelligence into their existing infrastructure, extending the human touch in healthcare. For more information, visit www.innovaccer.com. Check us out on YouTube, Glassdoor, LinkedIn, Instagram, and the Web.
Posted 1 week ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Welcome to Warner Bros. Discovery… the stuff dreams are made of.

Who We Are…

When we say, “the stuff dreams are made of,” we’re not just referring to the world of wizards, dragons and superheroes, or even to the wonders of Planet Earth. Behind WBD’s vast portfolio of iconic content and beloved brands, are the storytellers bringing our characters to life, the creators bringing them to your living rooms and the dreamers creating what’s next… From brilliant creatives, to technology trailblazers, across the globe, WBD offers career-defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best selves. Here you are supported, here you are celebrated, here you can thrive.

Your New Role

Join one of the world’s largest and most innovative Media & Entertainment companies as a QA Engineer - AI/ML, where you’ll ensure the quality and reliability of cutting-edge AI and machine learning solutions that are transforming the industry. Our AI/ML-powered systems play a crucial role in optimizing content production, personalization, and consumer engagement, and your contributions will be instrumental in delivering seamless, scalable, and efficient solutions to our global audience.

As a QA Engineer specializing in AI/ML, you’ll work closely with data scientists, software engineers, and product teams to rigorously test AI models, data pipelines, and machine learning platforms. Your role will involve defining and implementing robust quality assurance practices for complex AI systems, ensuring they meet the highest standards of performance, scalability, and security. If you are passionate about AI, testing, and driving quality excellence in a dynamic environment, this is your opportunity to make an impact.

Quality Assurance for AI/ML Models and Pipelines
- Develop and execute comprehensive test plans, test cases, and test scripts to validate AI/ML models and data pipelines.
- Perform rigorous testing of machine learning models to ensure accuracy, reliability, and robustness under various scenarios.
- Validate data preprocessing, feature engineering, and model training pipelines for correctness and consistency.
- Identify and address performance bottlenecks in AI systems, ensuring scalability for large datasets and real-time applications.
- Collaborate with data scientists to validate model outputs and metrics against business requirements.

Automation Testing and Tool Development
- Design and implement automated test frameworks and tools tailored for AI/ML workflows.
- Automate testing of model deployments, APIs, and data pipelines using industry-standard tools and frameworks.
- Create scripts to simulate edge cases, stress conditions, and user interactions for AI systems.
- Build monitoring tools to assess AI model drift, data inconsistencies, and system performance post-deployment.
- Continuously enhance automation coverage and testing efficiency through innovative practices.

Collaboration and Cross-Functional Engagement
- Work closely with software engineers, data engineers, and product managers to align QA strategies with project goals.
- Participate in code reviews, design discussions, and sprint planning to incorporate QA perspectives early in the development lifecycle.
- Provide actionable feedback and insights to development teams to resolve issues and improve system quality.
- Support end-to-end integration testing of AI/ML solutions across multiple platforms and systems.
- Act as a quality advocate, promoting best practices for testing and validation within the organization.

Governance, Compliance, and Reporting
- Ensure compliance with data privacy, security, and ethical AI standards during testing and deployment.
- Develop and maintain comprehensive documentation for QA processes, test cases, and system validations.
- Monitor and report on key QA metrics, including defect rates, coverage, and system reliability.
- Support regulatory audits and reviews by providing required testing documentation and evidence.
- Stay up-to-date with industry trends, tools, and practices in QA for AI/ML systems.

Continuous Improvement and Innovation
- Research and adopt emerging technologies and frameworks for AI/ML testing and validation.
- Drive continuous improvement initiatives to enhance the efficiency and effectiveness of QA processes.
- Proactively identify and resolve quality gaps in AI/ML workflows, ensuring a seamless user experience.
- Contribute to building a culture of quality and accountability within the AI/ML team.
- Mentor junior team members on QA best practices and technical skills.

Qualifications & Experiences

Academic Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Software Engineering, Data Science, or a related field.
- Certifications in software testing (e.g., ISTQB, CSTE) or machine learning (e.g., AWS ML, TensorFlow Developer) are a plus.

Professional Experience
- 3+ years of experience in QA engineering, with at least 1 year focused on testing AI/ML systems.
- Proven expertise in testing machine learning models, data pipelines, and cloud-based AI solutions.
- Hands-on experience with test automation tools and frameworks such as Selenium, PyTest, JUnit, or similar.
- Familiarity with CI/CD pipelines and tools like Jenkins, GitHub Actions, or Azure DevOps.
- Solid understanding of software testing methodologies, including functional, regression, performance, and stress testing.

Technical Skills
- Proficiency in Python or Java, with experience in writing test scripts for AI/ML applications.
- Knowledge of AI/ML frameworks like TensorFlow, PyTorch, or Scikit-learn.
- Experience with testing tools specific to AI/ML, such as DeepChecks or Alibi.
- Familiarity with data querying languages (SQL) and data processing tools (Pandas, Spark).
- Understanding of cloud platforms (AWS, GCP, Azure) and their AI/ML services.

Other Skills
- Strong analytical and problem-solving skills, with a detail-oriented approach.
- Excellent communication and collaboration skills, with the ability to work effectively in a cross-functional team.
- Passion for quality and a commitment to delivering exceptional results.
- Ability to adapt to a fast-paced and dynamic environment while maintaining a focus on continuous improvement.
- Self-motivated and proactive, with a drive to stay updated on industry trends and best practices.

How We Get Things Done…

This last bit is probably the most important! Here at WBD, our guiding principles are the core values by which we operate and are central to how we get things done. You can find them at www.wbd.com/guiding-principles/ along with some insights from the team on what they mean and how they show up in their day to day. We hope they resonate with you and look forward to discussing them during your interview.

Championing Inclusion at WBD

Warner Bros. Discovery embraces the opportunity to build a workforce that reflects a wide array of perspectives, backgrounds and experiences. Being an equal opportunity employer means that we take seriously our responsibility to consider qualified candidates on the basis of merit, regardless of sex, gender identity, ethnicity, age, sexual orientation, religion or belief, marital status, pregnancy, parenthood, disability or any other category protected by law. If you’re a qualified candidate with a disability and you require adjustments or accommodations during the job application and/or recruitment process, please visit our accessibility page for instructions to submit your request.
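One of the responsibilities in this posting is building monitoring tools to assess AI model drift post-deployment. As a minimal, purely illustrative sketch (the feature names, sample data, and 3-standard-error threshold below are invented for the example, not part of any real stack), a basic drift check can compare live feature means against a training-time baseline:

```python
import math

def mean_shift_drift(baseline, live, threshold=3.0):
    """Flag drift when a feature's live mean moves more than `threshold`
    standard errors away from its baseline mean.
    Returns {feature: (z_score, drifted_flag)}."""
    report = {}
    for feature, base_values in baseline.items():
        live_values = live.get(feature, [])
        if not live_values:
            continue  # feature absent from live traffic: nothing to compare
        n = len(base_values)
        base_mean = sum(base_values) / n
        # Sample variance of the baseline distribution
        base_var = sum((x - base_mean) ** 2 for x in base_values) / max(n - 1, 1)
        # Standard error of the live mean under the baseline distribution
        stderr = math.sqrt(base_var / len(live_values)) or 1e-12
        live_mean = sum(live_values) / len(live_values)
        z = abs(live_mean - base_mean) / stderr
        report[feature] = (z, z > threshold)
    return report

# Toy example: "age" stays stable, "score" shifts sharply upward.
baseline = {"age": [30, 32, 31, 29, 33, 30],
            "score": [0.50, 0.52, 0.48, 0.50, 0.51, 0.49]}
live = {"age": [31, 30, 32, 29],
        "score": [0.90, 0.88, 0.92, 0.91]}
report = mean_shift_drift(baseline, live)
```

Real drift monitoring would typically use distribution-level tests (e.g., population stability index or Kolmogorov-Smirnov) via a library such as the DeepChecks tool the posting mentions; this sketch only shows the shape of the check.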
Posted 1 week ago
1.0 years
0 Lacs
Jaipur, Rajasthan, India
On-site
Job Information

Employee Type: Full Time
Location: Lal Kothi, Jaipur
Experience: 1+ year(s) of experience
Skills: Kotlin, Java, Jetpack
Salary: Up to ₹3 LPA + Laptop Reimbursement + Incentives
Date Posted: 2025-07-31

Job Description

Greetings from Zucol Group of Companies! We are a group of creative and visionary tech geeks who strongly believe in the spark of technology and use it to provide the best experiences to our customers. We provide services like academic services, ERP solutions, app development, and many more. We have customers from all around the globe, as we deal with both international and domestic clients. We believe in maintaining strong relationships with our clients and employees while delivering the results they expect, and we work effectively and efficiently to provide high-standard services that earn our customers' trust.

We are looking for a skilled Android Developer; details are as follows:

Roles And Responsibilities
- Develop and maintain Android apps from scratch.
- Own modules/features and lead independently.
- Collaborate with designers, backend, and QA teams.
- Perform code reviews and ensure clean, scalable code.
- Integrate REST APIs, SDKs, and manage Play Store activities.
- Fix bugs, optimize performance, and document code.
- Work in Agile setups (Jira, Trello, ClickUp).

What We're Looking For
- Proficiency in Kotlin and/or Java.
- Hands-on experience with Jetpack, MVVM/MVP, Retrofit, and Firebase.
- Strong debugging, testing, and Git knowledge.
- Experience in building and publishing 2–3 live apps.
- Good knowledge of Dependency Injection (Hilt/Dagger), Room, and Coroutines.

Apply Now
Posted 1 week ago
5.0 years
15 - 20 Lacs
Thiruvananthapuram, Kerala
On-site
Job Description:

Designation: Senior Full Stack Developer (Python + Angular + GCP/AWS/Azure)
Qualification: Any UG/PG Degree / Computer / Engineering Graduates
Experience: Min. 5+ Years
Gender: Male / Female
Job Location: Trivandrum / Kochi (Kerala)
Job Type: Full Time | Day Shift | Permanent Job | Sat & Sun Week Off
Working Time: 12:01 PM to 9:00 PM
Project: European client | Shift: Mid Shift (12:01 PM to 9:00 PM) | WFO
Salary: Rs. 15,00,000 to 20,00,000 per annum

Introduction

We are looking for a Senior Full Stack (Python & Angular) Developer who will take ownership of building and maintaining complex backend systems, APIs, and applications using Python, with Angular on the frontend. Profiles with BFSI payment system integration experience are desired.

Responsibilities include:
- Design, develop, and maintain backend applications, APIs, and services using Python.
- Write clean, maintainable, and scalable code following industry standards and best practices.
- Optimize application performance and ensure high availability and scalability.
- Review code and mentor junior developers to ensure code quality and foster knowledge sharing.
- Implement unit and integration tests to ensure application robustness.
- Set up and manage CI/CD pipelines using tools like Jenkins, GitLab CI, or CircleCI.
- Collaborate with DevOps to deploy applications on cloud platforms, preferably Google Cloud Platform (GCP).
- Design and build cloud-native applications using APIs, containers, and Kubernetes.
- Leverage GCP services to develop scalable and efficient solutions.
- Ensure application security, manage access controls, and comply with data privacy regulations.
- Work closely with frontend developers, DevOps engineers, and product managers for seamless project delivery.
- Design, manage, and optimize relational and NoSQL databases (PostgreSQL, MySQL, MongoDB).
- Monitor application performance using tools like Prometheus, Grafana, or Datadog.
- Build dynamic, responsive UIs using Angular and JavaScript.
- Develop and maintain reusable Angular components in collaboration with UX/UI teams.

Primary Skills:
- 5+ years of experience as a Python developer, with a focus on product development (BE + FE development).
- Hands-on experience in Angular.
- Proven experience in designing and deploying scalable applications and microservices.
- App integration experience is preferred.
- Python: FastAPI (Flask/Django)
- API development (RESTful services)
- Cloud platforms: Google Cloud Platform (GCP) preferred
- Familiarity with database management systems (PostgreSQL, MySQL, MongoDB) and ORMs (e.g., SQLAlchemy, Django ORM)
- Knowledge of CI/CD pipelines: Jenkins, GitLab CI, CircleCI
- Frontend development: JavaScript, Angular
- Code versioning: Git
- Testing: unit and integration testing
- Strong understanding of security principles, authentication (OAuth2, JWT), and data protection.

Secondary Skills:
- Monitoring tools: Prometheus, Grafana, Datadog
- Security and compliance standards: GDPR, PCI, SOC 2
- DevOps collaboration
- UX/UI collaboration for Angular components
- Experience with asynchronous programming (e.g., asyncio, aiohttp).
- Experience with big data technologies like Spark or Hadoop.
- Experience with machine learning libraries (e.g., TensorFlow, PyTorch) is a plus.

Job Types: Full-time, Permanent
Pay: ₹1,500,000.00 - ₹2,000,000.00 per year
Benefits:
- Health insurance
- Leave encashment
- Paid sick time
- Paid time off
- Provident Fund
Schedule:
- Day shift
- Monday to Friday
Supplemental Pay:
- Performance bonus
- Yearly bonus
Work Location: In person
Posted 1 week ago
5.0 - 12.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Data Software Engineer

Chennai & Coimbatore | Walk-in on 2 Aug 25 | Hybrid Role

- 5–12 years of experience in Big Data and related technologies
- Expert-level understanding of distributed computing principles
- Expert-level knowledge of and experience with Apache Spark
- Hands-on programming with Python
- Proficiency with Hadoop v2, MapReduce, HDFS, Sqoop
- Experience building stream-processing systems using technologies such as Apache Storm or Spark Streaming
- Good understanding of Big Data querying tools such as Hive and Impala
- Experience with integration of data from multiple data sources such as RDBMS (SQL Server, Oracle), ERP, and files
- Good understanding of SQL queries, joins, stored procedures, and relational schemas
- Experience with NoSQL databases such as HBase, Cassandra, and MongoDB
- Knowledge of ETL techniques and frameworks
- Performance tuning of Spark jobs
- Experience with Azure Databricks
- Ability to lead a team efficiently
- Experience with designing and implementing Big Data solutions
- Practitioner of Agile methodology
Posted 1 week ago
0 years
0 Lacs
Delhi, India
On-site
Job Title: Azure Data Engineer
Location: Noida Sec-132

Job Description:
1. Strong experience in Azure: Azure Data Factory, Azure Data Lake, Azure Databricks
2. Good at Cosmos DB and Azure SQL Data Warehouse/Synapse
3. Excellent in data ingestion (batch and real-time processing)
4. Good understanding of Synapse workspace and Synapse Analytics
5. Good hands-on experience with PySpark or Scala Spark
6. Good hands-on experience with Delta Lake and Spark Streaming
7. Good understanding of Azure DevOps and Azure infrastructure concepts
8. At least one end-to-end hands-on project implementation as an architect
9. Expert and persuasive communication skills (verbal and written)
10. Expert in presentation and skilled at managing multiple clients
11. Good at Python / shell scripting
12. Good to have: Azure or other cloud certifications
Posted 1 week ago
2.0 years
0 Lacs
Greater Bengaluru Area
On-site
About the Company

6thStreet.com is an omnichannel fashion & lifestyle destination that offers 1400+ fashion & beauty brands in the UAE, KSA, Kuwait, Oman, Bahrain & Qatar. Customers can shop the latest on-trend outfits, shoes, bags, beauty essentials and accessories from international brands such as Tommy Hilfiger, Calvin Klein, Hugo, Marks & Spencers, Dune London, Charles & Keith, Aldo, Crocs, Birkenstock, Skechers, Levi’s, Nike, Adidas, Loreal and Inglot amongst many more. 6thStreet.com recently opened GCC’s first phygital store at Dubai Hills Mall; an innovative tech-led space which combines the best of both online & offline shopping with online browsing & smart fitting rooms.

Overview

The ML Engineer will extract insights and build models that will drive key business decisions. The candidate will work closely with other data scientists, software engineers and product managers to design, build, optimize and deploy machine learning systems and solutions. This role is ideal for someone with a strong analytical mindset, a passion for data, and a desire to grow in a fast-paced e-commerce environment.
Necessary Skills

- Python: Proficiency in Python, with knowledge of popular libraries like pandas, NumPy, SciPy, scikit-learn, TensorFlow, PyTorch
- SQL: Strong ability to write and optimize complex SQL queries to extract and manipulate large datasets from relational databases
- Data Analysis & Visualization: Ability to work with large datasets, extract meaningful insights, and leverage data visualization tools and libraries
- Data Wrangling & Preprocessing: Expertise in cleaning and transforming raw data into structured formats
- Statistical Analysis: A solid understanding of descriptive and inferential statistics, including hypothesis testing and probability theory
- Machine Learning & Deep Learning: Familiarity with supervised and unsupervised learning algorithms such as regression, tree-based methods, clustering, boosting, and bagging methodologies
- Machine Learning Workflows: Feature engineering, model training, model optimization, validation, and evaluation
- ML Deployment: Deploying machine learning models to production environments, ensuring they meet scalability, reliability, and performance requirements
- DevOps: Git, CI/CD pipelines, dockerization, model versioning (MLflow), monitoring platforms
- Cloud Platforms: Experience with cloud platforms like AWS, Google Cloud, or Azure for deploying models
- Problem-Solving & Analytical Thinking: Ability to approach complex problems methodically and implement robust solutions
- Collaboration & Communication: Strong ability to work with cross-functional teams and communicate technical concepts to non-technical stakeholders
- Adaptability & Learning: Willingness to quickly learn new tools, technologies, and algorithms
- Attention to Detail: Ability to carefully test and validate models, ensuring they work as intended in production

Good to have:
- Familiarity with big data technologies such as Spark or Hadoop
- Object-oriented programming (OOP)
- Knowledge of data privacy and security practices when working with sensitive data
- Experience working with big data tools (e.g., Apache Kafka, Apache Flink) for streaming data processing
- Familiarity with feature stores like Feast
- Experience working with e-commerce data

Responsibilities
- Design and implement machine learning models, algorithms, and systems
- Build and maintain end-to-end machine learning pipelines: model training, validation, and deployment
- Experiment with different algorithms and approaches to optimize model performance
- Collaborate with software engineers, product managers, and analysts to build scalable, production-ready solutions
- Communicate complex technical concepts to non-technical stakeholders
- Stay updated with the latest advancements in machine learning and deep learning
- Evaluate and experiment with new tools, libraries, and algorithms that could improve model performance
- Collaborate on proof-of-concept (POC) projects to validate new approaches and techniques

Benefits
- Full-time role
- Competitive salary
- Company employee discounts across all brands
- Medical & health insurance
- Collaborative work environment
- Good-vibes work culture

Qualifications
- Bachelor's degree or equivalent experience in a quantitative field (Statistics, Mathematics, Computer Science, Engineering, etc.)
- At least 2 years of experience in quantitative analytics or data modeling and development
- Deep understanding of predictive modeling, machine learning, clustering and classification techniques, and algorithms
- Fluency in a programming language (Python, C, C++, Java, SQL)
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About Us:
Planet Spark is reshaping the EdTech landscape by equipping kids and young adults with future-ready skills like public speaking, and more. We're on a mission to spark curiosity, creativity, and confidence in learners worldwide. If you're passionate about meaningful impact, growth, and innovation—you're in the right place.

Location: Gurgaon (On-site)
Experience Level: Entry to Early Career (Freshers welcome!)
Shift Options: Domestic | Middle East | International
Working Days: 5 days/week (Wednesday & Thursday off) | Weekend availability required
Target Joiners: Any (Bachelor’s or Master’s)

🔥 What You'll Be Owning (Your Impact):
- Lead Activation: Engage daily with high-intent leads through dynamic channels—calls, video consults, and more.
- Sales Funnel Pro: Own the full sales journey—from first hello to successful enrollment.
- Consultative Selling: Host personalized video consultations with parents/adult learners, pitch trial sessions, and resolve concerns with clarity and empathy.
- Target Slayer: Consistently crush weekly revenue goals and contribute directly to Planet Spark’s growth engine.
- Client Success: Ensure a smooth onboarding experience and transition for every new learner.
- Upskill Mindset: Participate in hands-on training, mentorship, and feedback loops to constantly refine your game.

💡 Why Join Sales at Planet Spark?
- Only Warm Leads: Skip the cold calls—our leads already know us and have completed a demo session.
- High-Performance Culture: Be part of a fast-paced, energetic team that celebrates success and rewards hustle.
- Career Fast-Track: Unlock rapid promotions, performance bonuses, and leadership paths.
- Top-Notch Training: Experience immersive onboarding, live role-plays, and access to ongoing L&D programs.
- Rewards & Recognition: Weekly shoutouts, cash bonuses, and exclusive events to celebrate your wins.
- Make Real Impact: Help shape the minds of tomorrow while building a powerhouse career today.

🎯 What You Bring to the Table:
- Communication Powerhouse: You can build trust and articulate ideas clearly in both spoken and written formats.
- Sales-Driven: You know how to influence decisions, navigate objections, and close deals with confidence.
- Empathy First: You genuinely care about clients’ goals and tailor your approach to meet them.
- Goal-Oriented: You’re self-driven, proactive, and hungry for results.
- Tech Fluent: Comfortable using CRMs, video platforms, and productivity tools.

✨ What’s in It for You?
- 💼 High-growth sales career with serious earning potential
- 🌱 Continuous upskilling in EdTech, sales, and communication
- 🧘 Supportive culture that values growth and well-being
- 🎯 Opportunity to work at the cutting edge of education innovation
Posted 1 week ago
0 years
0 Lacs
Greater Kolkata Area
On-site
About Us:
Planet Spark is reshaping the EdTech landscape by equipping kids and young adults with future-ready skills like public speaking, and more. We're on a mission to spark curiosity, creativity, and confidence in learners worldwide. If you're passionate about meaningful impact, growth, and innovation—you're in the right place.

Location: Gurgaon (On-site)
Experience Level: Entry to Early Career (Freshers welcome!)
Shift Options: Domestic | Middle East | International
Working Days: 5 days/week (Wednesday & Thursday off) | Weekend availability required
Target Joiners: Any (Bachelor’s or Master’s)

🔥 What You'll Be Owning (Your Impact):
- Lead Activation: Engage daily with high-intent leads through dynamic channels—calls, video consults, and more.
- Sales Funnel Pro: Own the full sales journey—from first hello to successful enrollment.
- Consultative Selling: Host personalized video consultations with parents/adult learners, pitch trial sessions, and resolve concerns with clarity and empathy.
- Target Slayer: Consistently crush weekly revenue goals and contribute directly to Planet Spark’s growth engine.
- Client Success: Ensure a smooth onboarding experience and transition for every new learner.
- Upskill Mindset: Participate in hands-on training, mentorship, and feedback loops to constantly refine your game.

💡 Why Join Sales at Planet Spark?
- Only Warm Leads: Skip the cold calls—our leads already know us and have completed a demo session.
- High-Performance Culture: Be part of a fast-paced, energetic team that celebrates success and rewards hustle.
- Career Fast-Track: Unlock rapid promotions, performance bonuses, and leadership paths.
- Top-Notch Training: Experience immersive onboarding, live role-plays, and access to ongoing L&D programs.
- Rewards & Recognition: Weekly shoutouts, cash bonuses, and exclusive events to celebrate your wins.
- Make Real Impact: Help shape the minds of tomorrow while building a powerhouse career today.

🎯 What You Bring to the Table:
- Communication Powerhouse: You can build trust and articulate ideas clearly in both spoken and written formats.
- Sales-Driven: You know how to influence decisions, navigate objections, and close deals with confidence.
- Empathy First: You genuinely care about clients’ goals and tailor your approach to meet them.
- Goal-Oriented: You’re self-driven, proactive, and hungry for results.
- Tech Fluent: Comfortable using CRMs, video platforms, and productivity tools.

✨ What’s in It for You?
- 💼 High-growth sales career with serious earning potential
- 🌱 Continuous upskilling in EdTech, sales, and communication
- 🧘 Supportive culture that values growth and well-being
- 🎯 Opportunity to work at the cutting edge of education innovation
Posted 1 week ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We are seeking a visionary AI Architect to lead the design and integration of cutting-edge AI systems, including Generative AI, Large Language Models (LLMs), multi-agent orchestration, and retrieval-augmented generation (RAG) frameworks. This role demands a strong technical foundation in machine learning, deep learning, and AI infrastructure, along with hands-on experience building scalable, production-grade AI systems in the cloud. The ideal candidate combines architectural leadership with hands-on proficiency in modern AI frameworks and can translate complex business goals into innovative, AI-driven technical solutions.

Primary Stack & Tools:
• Languages: Python, SQL, Bash
• ML/AI Frameworks: PyTorch, TensorFlow, Scikit-learn, Hugging Face Transformers
• GenAI & LLM Tooling: OpenAI APIs, LangChain, LlamaIndex, Cohere, Claude, Azure OpenAI
• Agentic & Multi-Agent Frameworks: LangGraph, CrewAI, Agno, AutoGen
• Search & Retrieval: FAISS, Pinecone, Weaviate, Elasticsearch
• Cloud Platforms: AWS, GCP, Azure (preferred: Vertex AI, SageMaker, Bedrock)
• MLOps & DevOps: MLflow, Kubeflow, Docker, Kubernetes, CI/CD pipelines, Terraform, FastAPI
• Data Tools: Snowflake, BigQuery, Spark, Airflow

Key Responsibilities:
• Architect scalable and secure AI systems leveraging LLMs, GenAI, and multi-agent frameworks to support diverse enterprise use cases (e.g., automation, personalization, intelligent search).
• Design and oversee the implementation of retrieval-augmented generation (RAG) pipelines integrating vector databases, LLMs, and proprietary knowledge bases.
• Build robust agentic workflows using tools such as LangGraph, CrewAI, or Agno, enabling autonomous task execution, planning, memory, and tool use.
• Collaborate with product, engineering, and data teams to translate business requirements into architectural blueprints and technical roadmaps.
• Define and enforce AI/ML infrastructure best practices, including security, scalability, observability, and model governance.
• Manage the technical roadmap and sprint cadence for a team of 3–5 AI engineers; coach them on best practices.
• Lead AI solution design reviews and ensure alignment with compliance, ethics, and responsible AI standards.
• Evaluate emerging GenAI and agentic tools; run proofs of concept and guide build-vs-buy decisions.

Qualifications:
• 10+ years of experience in AI/ML engineering or data science, with 3+ years in AI architecture or system design.
• Proven experience designing and deploying LLM-based solutions at scale, including fine-tuning, prompt engineering, and RAG-based systems.
• Strong understanding of agentic AI design principles, multi-agent orchestration, and tool-augmented LLMs.
• Proficiency with cloud-native ML/AI services and infrastructure design across AWS, GCP, or Azure.
• Deep expertise in model lifecycle management, MLOps, and deployment workflows (batch, real-time, streaming).
• Familiarity with data governance, AI ethics, and security considerations in production-grade systems.
• Excellent communication and leadership skills, with the ability to influence technical and business stakeholders.
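The RAG pipelines this role owns follow a simple loop: embed documents, retrieve the passages most similar to a query, and assemble them into the prompt sent to an LLM. A minimal sketch of that retrieval step, using a toy bag-of-words similarity in place of real embeddings and a vector store such as FAISS or Pinecone (all data and function names here are hypothetical illustrations, not part of the role's actual stack):

```python
import math
from collections import Counter

def embed(text):
    """Toy 'embedding': a bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=2):
    """Return the top-k documents most similar to the query."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query, docs):
    """Assemble the augmented prompt that would be sent to an LLM."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

knowledge_base = [
    "Invoices are processed within 30 days of receipt.",
    "The support team is reachable via the internal portal.",
    "Travel expenses require manager approval before booking.",
]
print(build_prompt("when are invoices processed", knowledge_base))
```

A production pipeline would swap `embed` for a learned embedding model and `retrieve` for an approximate-nearest-neighbor lookup in the vector database; the prompt-assembly step stays essentially the same.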
Posted 1 week ago
6.0 years
0 Lacs
Udaipur, Rajasthan, India
On-site
Role: Senior Data Engineer
Experience: 4–6 Yrs
Location: Udaipur, Jaipur

Job Description:
We are looking for a highly skilled and experienced Data Engineer with 4–6 years of hands-on experience designing and implementing robust, scalable data pipelines and infrastructure. The ideal candidate is proficient in SQL and Python and has a strong understanding of modern data engineering practices. You will play a key role in building and optimizing data systems, enabling data accessibility and analytics across the organization, and collaborating closely with cross-functional teams including Data Science, Product, and Engineering.

Key Responsibilities:
• Design, develop, and maintain scalable ETL/ELT data pipelines using SQL and Python
• Collaborate with data analysts, data scientists, and product teams to understand data needs
• Optimize queries and data models for performance and reliability
• Integrate data from various sources, including APIs, internal databases, and third-party systems
• Monitor and troubleshoot data pipelines to ensure data quality and integrity
• Document processes, data flows, and system architecture
• Participate in code reviews and contribute to a culture of continuous improvement

Required Skills:
• 4–6 years of experience in data engineering, data architecture, or backend development with a focus on data
• Strong command of SQL for data transformation and performance tuning
• Experience with Python and related data tooling (e.g., pandas, Spark, ADF)
• Solid understanding of ETL/ELT processes and data pipeline orchestration
• Proficiency with RDBMS (e.g., PostgreSQL, MySQL, SQL Server)
• Experience with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery)
• Familiarity with version control (Git), CI/CD workflows, and containerized environments (Docker, Kubernetes)
• Basic programming skills
• Excellent problem-solving skills and a passion for clean, efficient data systems

Preferred Skills:
• Experience with cloud platforms (AWS, Azure, GCP) and services such as S3, Glue, Dataflow, etc.
• Exposure to enterprise solutions (e.g., Databricks, Synapse)
• Knowledge of big data technologies (e.g., Spark, Kafka, Hadoop)
• Background in real-time data streaming and event-driven architectures
• Understanding of data governance, security, and compliance best practices
• Prior experience working in an agile development environment

Educational Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field.

Visit us:
https://kadellabs.com/
https://in.linkedin.com/company/kadel-labs
https://www.glassdoor.co.in/Overview/Working-at-Kadel-Labs-EI_IE4991279.11,21.htm
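The extract-transform-load cycle at the heart of this role can be sketched in a few lines. A minimal, self-contained example using Python's built-in sqlite3 as a stand-in for a real source database and warehouse (table names, columns, and cleaning rules are invented for illustration):

```python
import sqlite3

# In-memory database standing in for separate source and warehouse systems.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount TEXT, country TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, " 120.50 ", "in"), (2, "80", "IN"), (3, None, "us")],
)

def extract(conn):
    """Pull raw rows from the source table."""
    return conn.execute("SELECT id, amount, country FROM raw_orders").fetchall()

def transform(rows):
    """Clean amounts, normalize country codes, drop rows missing an amount."""
    out = []
    for oid, amount, country in rows:
        if amount is None:
            continue  # data-quality rule: reject rows without an amount
        out.append((oid, float(amount.strip()), country.upper()))
    return out

def load(conn, rows):
    """Write cleaned rows into the warehouse table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (id INTEGER, amount REAL, country TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

load(conn, transform(extract(conn)))
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 200.5
```

In practice each stage would target different systems (APIs or OLTP databases on the extract side, Snowflake/Redshift/BigQuery on the load side) and be scheduled by an orchestrator, but the shape of the pipeline is the same.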
Posted 1 week ago
6.0 years
0 Lacs
Jaipur, Rajasthan, India
On-site
JOB DESCRIPTION: DATA ENGINEER (Databricks & AWS)

Overview:
As a Data Engineer, you will work with multiple teams to deliver solutions on the AWS Cloud using core cloud data engineering tools such as Databricks on AWS, AWS Glue, Amazon Redshift, Athena, and other Big Data-related technologies. This role focuses on building the next generation of application-level data platforms and improving recent implementations. Hands-on experience with Apache Spark (PySpark, SparkSQL), Delta Lake, Iceberg, and Databricks is essential.

Locations: Jaipur, Pune, Hyderabad, Bangalore, Noida.

Responsibilities:
• Define, design, develop, and test software components/applications using AWS-native data services: Databricks on AWS, AWS Glue, Amazon S3, Amazon Redshift, Athena, AWS Lambda, and Secrets Manager.
• Build and maintain ETL/ELT pipelines for both batch and streaming data.
• Work with structured and unstructured datasets at scale.
• Apply data modeling principles and advanced SQL techniques.
• Implement and manage pipelines using Apache Spark (PySpark, SparkSQL) and Delta Lake/Iceberg formats.
• Collaborate with product teams to understand requirements and deliver optimized data solutions.
• Use CI/CD pipelines with dbx and AWS for continuous delivery and deployment of Databricks code.
• Work independently with minimal supervision and strong ownership of deliverables.

Must Have:
• 6+ years of experience in Data Engineering on AWS Cloud.
• Hands-on expertise in:
  o Apache Spark (PySpark, SparkSQL)
  o Delta Lake / Iceberg formats
  o Databricks on AWS
  o AWS Glue, Amazon Athena, Amazon Redshift
• Strong SQL skills and performance-tuning experience on large datasets.
• Good understanding of CI/CD pipelines, especially using dbx and AWS tools.
• Experience with environment setup, cluster management, user roles, and authentication in Databricks.
• Databricks Certified Data Engineer – Professional certification (mandatory).

Good To Have:
• Experience migrating ETL pipelines from on-premise or other clouds to AWS Databricks.
• Experience with Databricks ML or Spark 3.x upgrades.
• Familiarity with Airflow, Step Functions, or other orchestration tools.
• Experience integrating Databricks with AWS services in a secured, production-ready environment.
• Experience with monitoring and cost optimization in AWS.

Key Skills:
• Languages: Python, SQL, PySpark
• Big Data Tools: Apache Spark, Delta Lake, Iceberg
• Platform: Databricks on AWS
• AWS Services: AWS Glue, Athena, Redshift, Lambda, S3, Secrets Manager
• Version Control & CI/CD: Git, dbx, AWS CodePipeline/CodeBuild
• Other: Data Modeling, ETL Methodology, Performance Optimization
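Much of the Delta Lake/Iceberg pipeline work described above reduces to upserts, which Delta Lake expresses as a `MERGE INTO` statement. A plain-Python sketch of those merge semantics, with no Spark or Databricks dependency and hypothetical row data, just to show the matched/not-matched branches:

```python
def merge_into(target, updates, key="id"):
    """Upsert semantics of MERGE INTO: update matched rows, insert the rest."""
    merged = {row[key]: dict(row) for row in target}
    for row in updates:
        if row[key] in merged:
            merged[row[key]].update(row)   # WHEN MATCHED THEN UPDATE
        else:
            merged[row[key]] = dict(row)   # WHEN NOT MATCHED THEN INSERT
    return sorted(merged.values(), key=lambda r: r[key])

target = [
    {"id": 1, "status": "new", "amount": 100},
    {"id": 2, "status": "new", "amount": 250},
]
updates = [
    {"id": 2, "status": "shipped"},            # existing row: update in place
    {"id": 3, "status": "new", "amount": 75},  # new row: insert
]
result = merge_into(target, updates)
```

On Databricks the same logic is a single SQL statement (`MERGE INTO target USING updates ON target.id = updates.id WHEN MATCHED THEN UPDATE ... WHEN NOT MATCHED THEN INSERT ...`) or the equivalent `DeltaTable.merge` call, executed transactionally against the Delta table.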
Posted 1 week ago
7.0 years
0 Lacs
India
Remote
Job Title: Scala Developer
Experience: 7+ Years
Location: Remote
Employment Type: Full-Time

Job Summary:
We are seeking an experienced Scala Developer with 7+ years of expertise building scalable, high-performance backend applications. The ideal candidate has a solid understanding of functional programming principles, distributed systems, and cloud-native environments. You will play a key role in designing and implementing robust backend services and collaborating closely with cross-functional teams.

Key Responsibilities:
• Design, develop, test, and deploy backend services and APIs using Scala.
• Collaborate with product managers, frontend developers, and QA teams to deliver high-quality software.
• Optimize and maintain existing codebases with a focus on performance, scalability, and reliability.
• Write clean, maintainable, and testable code, adhering to coding best practices and standards.
• Use tools and frameworks such as Akka, the Play Framework, Kafka, and other Scala ecosystem technologies.
• Conduct code reviews, share knowledge, and mentor junior team members.
• Work with SQL/NoSQL databases and integrate third-party APIs.
• Build and maintain data processing pipelines using Spark or similar tools (if required).

Key Skills and Qualifications:
• 7+ years of experience in backend development with Scala.
• Strong knowledge of functional programming concepts.
• Hands-on experience with Akka, Play Framework, Kafka, and Spark (preferred).
• Proficiency working with SQL and NoSQL databases.
• Familiarity with cloud platforms (AWS, GCP, or Azure).
• Solid understanding of API design, distributed systems, and microservices architecture.
• Strong problem-solving and debugging skills.
• Excellent collaboration and communication abilities.
Posted 1 week ago