
17315 Spark Jobs - Page 13

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Skills: Database programming (SQL/PL-SQL/T-SQL), ETL (data pipelines, data preparation), Analytics (BI tools)

Roles & Responsibilities:
• Implement some of the world's largest big data analytics projects using the Kyvos platform
• Preparation of data for BI modeling using Spark, Hive, SQL and other ETL/ELT tooling
• OLAP data modelling
• Tuning of models for the fastest, sub-second query performance from business intelligence tools
• Communicate with customer stakeholders for busin
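The listing pairs Spark/Hive data preparation with OLAP modelling for sub-second BI query performance. A rough sketch of the pre-aggregation idea behind that (hypothetical data and field names; a real Kyvos/Spark deployment builds cubes over far larger fact tables):

```python
from collections import defaultdict

# Hypothetical fact rows; in practice these would come from Spark/Hive tables.
rows = [
    {"region": "North", "product": "A", "sales": 120.0},
    {"region": "North", "product": "B", "sales": 80.0},
    {"region": "South", "product": "A", "sales": 200.0},
    {"region": "South", "product": "A", "sales": 50.0},
]

def rollup(rows, dims):
    """Pre-aggregate a measure along a tuple of dimensions (an OLAP cube cell)."""
    cube = defaultdict(float)
    for r in rows:
        key = tuple(r[d] for d in dims)
        cube[key] += r["sales"]
    return dict(cube)

by_region = rollup(rows, ["region"])
# A BI query now reads the pre-computed cell instead of rescanning raw rows.
print(by_region[("South",)])  # 250.0
```

Serving dashboards from pre-built cells rather than raw scans is what makes sub-second response times feasible at scale.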

Posted 2 days ago

Apply

2.5 years

0 Lacs

Mumbai Metropolitan Region

On-site

JD - Senior Influencer Marketing Executive

About Slidein Media:
We are a leading influencer marketing firm. At our agency, marketing isn't just a job—it's an art form. We're all about creating next-level campaigns that turn heads, spark conversations, and break through the noise. From partnering with top-tier influencers to collaborating with innovative brands, we're in the business of building brands that people actually care about.

Job Summary:
The Senior Influencer Marketing Executive is responsible for planning, implementing, and managing influencer marketing strategies to enhance brand awareness, engage target audiences, and drive business results. This role involves identifying and building relationships with influencers, creating and executing campaigns, analysing performance metrics, and providing exceptional client servicing, including handling client details, briefing clients and influencers on campaign progress, and ensuring client satisfaction.

Roles and Responsibilities:
• Identify and build relationships with relevant influencers across various niches.
• Plan, execute, and manage influencer marketing campaigns, ensuring alignment with client goals.
• Handle client details, providing regular updates and detailed campaign reports.
• Maintain strong, long-term relationships with clients and influencers.
• Monitor campaign deliverables, timelines, brand briefs and budgets for successful execution.
• Negotiate compensation and terms with influencers for cost-effectiveness.
• Stay informed about industry trends and identify new influencer partnership opportunities.
• Ensure client satisfaction through exceptional communication and service.
Experience: 2.5+ years
Location: Mumbai (Malad West)
Interested candidates can share their resume at priyanka.kundaikar@slideinmedia.com / connect@slideinmedia.com.
If you love turning creative ideas into viral sensations, managing projects with ninja-level precision, and working with a team that's as passionate as you are about driving results—this is the place for you. We're all about timelines, budgets, and hitting the ground running (but we promise, it never gets boring).

Posted 2 days ago

Apply

2.0 - 4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

At ZoomInfo, we encourage creativity, value innovation, demand teamwork, expect accountability and cherish results. We value your take-charge, take-initiative, get-stuff-done attitude and will help you unlock your growth potential. One great choice can change everything. Thrive with us at ZoomInfo.

About the role:
As a Data Services Analyst II you will report to a Data Services Manager and will be responsible for analyzing business information from the perspective of marketing and sales professionals in order to ensure that ZoomInfo continues to deliver the highest quality data products to our customers. You will demonstrate the value of ZoomInfo data through problem-solving, knowledge sharing and data deliverables. We are looking for a data whizz who can communicate effectively, solve complex data problems, and bring a strong understanding of the value of our data.

What You'll Do:
Data Analysis
• Apply quantitative analysis and data visualization to tell the story behind the numbers, all while supporting data-driven decision making.
• Use technical skills, problem solving and business knowledge to deliver custom datasets to clients that meet or exceed their expectations.
• Implement proactive improvements to processes and methods for gathering and aggregating data. Find creative solutions to problems when limited information is available.
Business Operations
• Understand all aspects of ZoomInfo data, including all of our applications and tools.
• Create and maintain documentation on internal and client-facing business processes.
• Drive internal process improvement to better service client needs.
• Identify opportunities to reduce manual tasks through automation and create operational efficiencies.
Client Management
• Define business requirements and document rules and logic for use in client implementations.
• Understand and solve qualitative problems and present or explain solutions to an audience using top-quality, audience-appropriate communication.
• Enable clients to maximize the benefits of their ZoomInfo partnership through best practices, innovative thinking and process improvement.

What You Bring:
Experience: The ideal candidate will have 2-4 years of experience in a technology setting.
Education: A Bachelor's in a quantitative/analytical field (Mathematics, Statistics, Engineering, Computer Science, Economics).
Shift: Night shift (5PM IST to 2AM IST / 7PM IST to 4AM IST).
Mandatory skills: Expert in SQL, Python, Microsoft Excel (formulas, pivot tables) and data analysis/visualization tools.
Preferred: Tableau, Spark, Snowflake or similar technologies and tools.
• Must have a proven track record in technology delivery, process improvement, data governance and/or client services.
• Proven ability to work and interact in a fast-paced environment, with strong multitasking, organizational and time management skills.
• Highly resourceful with a go-getter attitude.
• Highly organized with careful attention to detail.
• Excellent communication skills.

About us:
ZoomInfo (NASDAQ: GTM) is the Go-To-Market Intelligence Platform that empowers businesses to grow faster with AI-ready insights, trusted data, and advanced automation. Its solutions provide more than 35,000 companies worldwide with a complete view of their customers, making every seller their best seller. ZoomInfo may use a software-based assessment as part of the recruitment process. More information about this tool, including the results of the most recent bias audit, is available here. ZoomInfo is proud to be an equal opportunity employer, hiring based on qualifications, merit, and business needs, and does not discriminate based on protected status.
We welcome all applicants and are committed to providing equal employment opportunities regardless of sex, race, age, color, national origin, sexual orientation, gender identity, marital status, disability status, religion, protected military or veteran status, medical condition, or any other characteristic protected by applicable law. We also consider qualified candidates with criminal histories in accordance with legal requirements. For Massachusetts Applicants: It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability. ZoomInfo does not administer lie detector tests to applicants in any location.
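The mandatory skills above centre on SQL and Python for delivering custom datasets. A minimal sketch of that workflow using Python's built-in sqlite3 module (hypothetical table and client brief; production data lives in far larger warehouses):

```python
import sqlite3

# Hypothetical company records; real B2B data is far richer.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE companies (name TEXT, industry TEXT, employees INTEGER)")
conn.executemany(
    "INSERT INTO companies VALUES (?, ?, ?)",
    [("Acme", "Software", 500), ("Globex", "Retail", 120), ("Initech", "Software", 45)],
)

# A client brief like "software firms with 100+ employees" becomes a
# parameterised query that produces the custom dataset.
cur = conn.execute(
    "SELECT name FROM companies WHERE industry = ? AND employees >= ? ORDER BY name",
    ("Software", 100),
)
result = [row[0] for row in cur]
print(result)  # ['Acme']
```

Parameterised queries keep the brief-to-dataset step repeatable and safe to rerun as client requirements change.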

Posted 2 days ago

Apply

12.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Visa is a world leader in payments and technology, with over 259 billion payments transactions flowing safely between consumers, merchants, financial institutions, and government entities in more than 200 countries and territories each year. Our mission is to connect the world through the most innovative, convenient, reliable, and secure payments network, enabling individuals, businesses, and economies to thrive while driven by a common purpose – to uplift everyone, everywhere by being the best way to pay and be paid. Make an impact with a purpose-driven industry leader. Join us today and experience Life at Visa.

Job Description

Team Summary
Visa Consulting and Analytics (VCA) drives tangible, impactful and financial results for Visa's network clients, including both financial services and merchants. Drawing on our expertise in strategy consulting, data analytics, brand management, marketing, operations and macroeconomics, Visa Consulting and Analytics solves the most strategic problems for our clients. The India & South Asia (INSA) Consulting Market team within Visa Consulting & Analytics provides consulting and solution services for Visa's largest issuers in India, Sri Lanka, Bangladesh, Nepal, Bhutan & Maldives. We apply deep expertise in the payments industry to provide solutions that assist clients with their key business priorities, drive growth and improve profitability. The VCA team provides a comprehensive range of consulting services to deliver solutions that address unique challenges in areas such as improving profitability, strategic growth, customer experience, digital payments and managing risk. The individual will be part of the VCA Data Science geographic team cluster for the India and South Asia (INSA) markets and will be responsible for the sale and delivery of data science and analytics based solutions to Visa clients.
What the Director, Data Science, Visa Consulting & Analytics does at Visa:
The Director, Data Science at Visa Consulting & Analytics (VCA) blends technical expertise with business acumen to deliver impactful, data-driven solutions to Visa's clients, shaping the future of payments through analytics and innovation. This role combines hands-on modeling with strategic leadership, leading the adoption of Generative AI (Gen AI) and Agentic AI into Visa's offerings. This is an onsite role based out of Mumbai. The role will require travel.

Key Responsibilities

Commercial Acumen / Business Development
• Collaborate with internal and external clients to comprehend their strategic business inquiries, leading project scoping and design to effectively address those questions by leveraging Visa's data.
• Drive revenue outcomes for VCA, particularly focusing on data science offerings such as ML model solutions, data collaboration, and managed service verticals within data science.

Technical Leadership
• Design, develop, and implement advanced analytics and machine learning models to solve complex business challenges for Visa's clients, leveraging VisaNet data as well as client data.
• Drive the integration and adoption of Gen AI and Agentic AI technologies within Visa's data science offerings.
• Ensure the quality, performance, and scalability of data-driven solutions.

Strategic Business Impact
• Translate client needs and business challenges into actionable data science projects that deliver measurable value.
• Collaborate with cross-functional teams including Consulting, Sales, Product, and Data Engineering to align analytics solutions with business objectives.
• Present insights and recommendations to both technical and non-technical stakeholders.

Team Leadership & Development
• Mentor and manage a team of data scientists and analysts, fostering a culture of innovation, collaboration, and continuous learning.
• Set priorities, provide technical direction, and oversee the end-to-end delivery of analytics projects.

Innovation & Best Practices
• Stay abreast of emerging trends in AI and data science, particularly in Gen AI and Agentic AI.
• Champion the adoption of new methodologies and tools to enhance Visa's analytics capabilities and value to clients.
• Represent VCA as a thought leader in internal and external forums.

This is a hybrid position. Expectation of days in office will be confirmed by your Hiring Manager.

Qualifications

Basic Qualifications:
• Advanced degree (MS/PhD) in Computer Science, Statistics, Mathematics, Engineering, or a related field from a Tier-1 institute, e.g. IIT, ISI, DSE, IISc, etc.
• 12+ years of experience in data science, analytics, or related fields, including 3+ years in a leadership/management role.
• Proven track record of building and leading high-performing data science teams.
• Expertise in statistical analysis, machine learning, data mining, and predictive modeling.
• Proficiency in programming languages such as Python, R, or Scala, and experience with ML frameworks (e.g., scikit-learn, TensorFlow, PyTorch).
• Excellent communication, presentation, and stakeholder management skills.

Preferred Qualifications:
• Exposure/prior work experience in the payments and/or banking industry.
• Experience in the consulting space or a matrix team structure.
• Familiarity with cloud platforms (AWS, Azure, GCP) and big data technologies (Spark, Hadoop).
• Publication or conference experience in the data science/AI community.

Additional Information
Visa is an EEO Employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability or protected veteran status. Visa will also consider for employment qualified applicants with criminal histories in a manner consistent with EEOC guidelines and applicable local law.
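The role calls for predictive modeling on payments data. One common, generic starting point for such models is recency/frequency/monetary (RFM) feature engineering; the sketch below uses invented transactions and is not Visa's actual methodology:

```python
from datetime import date

# Hypothetical card transactions: (cardholder id, transaction date, amount).
txns = [
    ("c1", date(2024, 1, 5), 40.0),
    ("c1", date(2024, 3, 1), 60.0),
    ("c2", date(2024, 2, 20), 15.0),
]

def rfm_features(txns, today):
    """Recency/frequency/monetary features per cardholder, a common
    starting point for spend and churn models on transaction data."""
    feats = {}
    for cid, d, amt in txns:
        f = feats.setdefault(cid, {"recency": None, "frequency": 0, "monetary": 0.0})
        f["frequency"] += 1
        f["monetary"] += amt
        days = (today - d).days  # days since this transaction
        if f["recency"] is None or days < f["recency"]:
            f["recency"] = days  # keep the most recent transaction's age
    return feats

feats = rfm_features(txns, date(2024, 4, 1))
print(feats["c1"])  # {'recency': 31, 'frequency': 2, 'monetary': 100.0}
```

Features like these would then feed whatever ML framework the team uses (e.g., scikit-learn or PyTorch, as the listing mentions).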

Posted 2 days ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Overview
We are seeking a Platform Architect with expertise in Informatica PowerCenter and Informatica Intelligent Cloud Services (IICS) to design, implement, and optimize enterprise-level data integration platforms. The ideal candidate will have a strong background in ETL/ELT architecture, cloud data integration, and platform modernization, ensuring scalability, security, and performance across on-prem and cloud environments.

Responsibilities

Platform Engineering & Administration
• Oversee installation, configuration, and optimization of PowerCenter and IICS environments.
• Manage platform scalability, performance tuning, and troubleshooting.
• Implement data governance, security, and compliance (e.g., role-based access, encryption, data lineage tracking).
• Optimize connectivity and integrations with various sources (databases, APIs, cloud storage, SaaS apps).

Cloud & Modernization Initiatives
• Architect and implement IICS-based data pipelines for real-time and batch processing.
• Migrate existing PowerCenter workflows to IICS, leveraging serverless and cloud-native features.
• Ensure seamless integration with cloud platforms (AWS, Azure, GCP) and modern data lakes/warehouses (Snowflake, Redshift, BigQuery).

Qualifications
• 4 years of experience in data integration and ETL/ELT architecture.
• Expert-level knowledge of Informatica PowerCenter and IICS (Cloud Data Integration, API & Application Integration, Data Quality).
• Hands-on experience with cloud platforms (AWS, Azure, GCP) and modern data platforms (Snowflake, Databricks, Redshift, BigQuery).
• Strong SQL, database tuning, and performance optimization skills.
• Deep understanding of data governance, security, and compliance best practices.
• Experience with automation, DevOps (CI/CD), and Infrastructure-as-Code (IaC) tools for data platforms.
• Excellent communication, leadership, and stakeholder management skills.

Preferred Qualifications
• Informatica certifications (IICS, PowerCenter, Data Governance).
• Proficiency in PowerCenter to IDMC conversions.
• Understanding of real-time streaming (Kafka, Spark Streaming).
• Knowledge of API-based integration and event-driven architectures.
• Familiarity with machine learning and AI-driven data processing.
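Whether built in PowerCenter or rebuilt in IICS, these pipelines keep the same extract-transform-load shape with embedded data-quality rules. A toy sketch of that structure in plain Python (hypothetical source data and rules; real mappings are configured in the Informatica tools, not hand-coded):

```python
# extract -> transform -> load as composable steps, the shape a
# PowerCenter mapping keeps when it is migrated to an IICS pipeline.
def extract():
    # Stand-in for a database/API source.
    return [{"id": 1, "email": " A@X.COM "}, {"id": 2, "email": None}]

def transform(records):
    # Cleanse fields and drop rows failing a data-quality rule (non-null email).
    out = []
    for r in records:
        if r["email"] is None:
            continue  # in a real tool, rejected rows route to an error target
        out.append({"id": r["id"], "email": r["email"].strip().lower()})
    return out

def load(records, target):
    # Stand-in for writing to a warehouse table.
    target.extend(records)
    return len(records)

target = []
loaded = load(transform(extract()), target)
print(loaded, target)  # 1 [{'id': 1, 'email': 'a@x.com'}]
```

Keeping each stage a pure step makes the mapping easy to test and to re-target at a cloud warehouse during modernization.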

Posted 2 days ago

Apply

0.0 - 1.0 years

0 Lacs

India

On-site

A few weeks ago, an event organizer stumbled upon our Email Campaigns tool. They set up a campaign, hit send… and sold out their event in 48 hours. No support. No onboarding. No playbook. And that's when we realized something big:
🧠 The product works. Now it's time to tell its story to the world.
🚀 So now we're hiring a Product Marketing Intern at AllEvents. And not just any intern — someone crazy enough to turn features into the talk of the town.

We're looking for: Marketing Intern with Tech Knowledge
Experience: 0-1 year
- A storyteller who loves simplifying the complex
- A growth-hacker mindset who's hungry to experiment
- A builder who wants to own features beyond the launch

As a Product Marketing Ninja, you'll:
- Craft compelling product narratives that spark action
- Design onboarding experiences and in-app communication
- Run experiments across landing pages, email flows, and campaigns
- Collaborate directly with the product and marketing team to shape how our tools are experienced

🎓 This is a perfect fit for someone who:
- Thinks like a builder, writes like a storyteller
- Loves working across teams and turning ideas into experiments
- Wants to make a real impact — not just learn, but ship

Posted 2 days ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

What you'll do
• Design, develop, and operate high scale applications across the full engineering stack.
• Design, develop, test, deploy, maintain, and improve software.
• Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
• Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset.
• Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
• Participate in a tight-knit, globally distributed engineering team.
• Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality.
• Research, create, and develop software applications to extend and improve on Equifax Solutions.
• Manage sole project priorities, deadlines, and deliverables.
• Collaborate on scalability issues involving access to data and information.
• Actively participate in Sprint planning, Sprint retrospectives, and other team activities.

What experience you need
• Bachelor's degree or equivalent experience
• 5+ years of software engineering experience
• 5+ years of experience writing, debugging, and troubleshooting code in Java & SQL
• 2+ years of experience with cloud technology: GCP, AWS, or Azure
• 2+ years of experience designing and developing cloud-native solutions
• 2+ years of experience designing and developing microservices using Java, Spring Boot, GCP SDKs, GKE/Kubernetes
• 3+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs

What could set you apart?
• Knowledge or experience with Apache Beam for stream and batch data processing.
• Familiarity with big data tools and technologies like Apache Kafka, Hadoop, or Spark.
• Experience with containerization and orchestration tools (e.g., Docker, Kubernetes).
• Exposure to data visualization tools or platforms.
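The "set you apart" items mention Apache Beam for stream and batch processing. The core of Beam-style fixed windowing can be sketched in plain Python (invented events; Beam itself distributes this work across workers):

```python
from collections import defaultdict

# Timestamped events as (epoch seconds, key); hypothetical sample data.
events = [(0, "login"), (30, "login"), (65, "error"), (70, "login"), (125, "error")]

def fixed_windows(events, size):
    """Assign each event to a fixed-size window and count per (window, key),
    the core of a Beam-style windowed aggregation, here in plain Python."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // size) * size  # floor to the window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

print(fixed_windows(events, 60))
# {(0, 'login'): 2, (60, 'error'): 1, (60, 'login'): 1, (120, 'error'): 1}
```

The same logic applies to batch and streaming inputs, which is exactly the unification Beam's model provides.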

Posted 2 days ago

Apply

5.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Apache Spark
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable enough to meet the demands of the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Apache Spark.
- Strong understanding of data pipeline architecture and design.
- Experience with ETL processes and data integration techniques.
- Familiarity with data warehousing concepts and technologies.
- Knowledge of data quality frameworks and best practices.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Apache Spark.
- This position is based in Chennai.
- A 15 years full time education is required.
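The listing emphasizes data quality frameworks alongside Spark pipelines. A minimal illustration of the kind of quality gate a pipeline might run before loading (hypothetical rules and rows; real frameworks such as Deequ or Great Expectations are far richer):

```python
def check_quality(rows, required, unique_key):
    """Minimal data-quality gate: required fields must be non-null and the
    key column must be unique. Returns a list of human-readable violations."""
    errors = []
    seen = set()
    for i, row in enumerate(rows):
        for col in required:
            if row.get(col) is None:
                errors.append(f"row {i}: null {col}")
        k = row.get(unique_key)
        if k in seen:
            errors.append(f"row {i}: duplicate {unique_key}={k}")
        seen.add(k)
    return errors

rows = [{"id": 1, "amount": 10}, {"id": 1, "amount": None}]
print(check_quality(rows, required=["id", "amount"], unique_key="id"))
# ['row 1: null amount', 'row 1: duplicate id=1']
```

A pipeline would run such checks after transform and before load, failing or quarantining the batch when violations appear.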

Posted 2 days ago

Apply

175.0 years

0 Lacs

Gurugram, Haryana, India

On-site

At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express. Join Team Amex and let's lead the way together.

Business Overview:
The Credit and Fraud Risk (CFR) team helps drive profitable business growth by reducing the risk of fraud and maintaining the industry's lowest credit loss rates. It applies an array of tools and ever-evolving technology to detect and combat fraud, minimize the disruption of good spending and provide a world-class customer experience. The team leads efforts that leverage data and digital advancements to improve risk management as well as enable commerce and bring innovation.

A single decision can have many outcomes. And when that decision affects millions of cardmembers and merchants, it needs to be the right one. That's where the AiDa Product team comes in, part of the Credit & Fraud Risk (CFR) Global Data Science (GDS) CoE. The product specializes in powering a seamless, unified experience for its AI/ML users and responsibly leveraging enterprise data assets to support critical decisions for the company. As a part of this team, you'll have the opportunity to work with some of the best product owner and manager talent in the industry. You will solve real-world business problems while getting exposure to the industry's top leaders in AI/ML product management, decision science and technology.
If you are passionate about getting to know all areas of our business and can translate business needs into remarkable solutions that can impact millions, you should consider a career in Product teams.

Job Responsibilities
· Contribute to defining and articulating a long-term AI product strategy and roadmap with clearly defined business metrics and outcomes.
· Solve complicated business problems through prioritization and ownership of products and solutions to meet business objectives.
· Prioritize and manage product backlogs by balancing the requirements of partners and stakeholders. Evaluate prospective features in the AI Products pipeline against changing requirements in the direction of AI adoption.
· Contribute to all product lifecycle processes including market (external) research, competitive analysis, planning, positioning, roadmap development, requirements finalization and product development.
· Translate the product roadmap into well-defined requirements and acceptance test criteria.
· Drive end-to-end ML/AI product development with a team of engineers and designers. Transform MVPs into production-grade capabilities in collaboration with engineering teams.
· Contribute to the ideation and launch of innovative ML/AI products and capabilities. Innovate ways to evangelize the product to drive Amex-wide user adoption.
· (For Learn): Curate and deliver technical trainings in AI, Cloud, Hive and Spark for beginner to advanced level users.
· Create POCs for best-in-class, innovative AI-ML products with the potential to scale.

Qualifications and Skills Required:
· Strong quantitative, analytical, and structured problem-solving skills.
· Strong technical background in AI/ML, with a background in Python, SQL, data analytics and data visualization.
· Familiarity with the ML model development lifecycle (MDLC): feature selection and engineering, different ML model algorithm families (Decision Trees, boosting algorithms), optimization considerations for ML models, and deployment and serving challenges.
· Knowledge of Google Cloud Platform (GCP), BigQuery, and GCP AI/ML capabilities such as Vertex AI.
· Knowledge of big data platforms such as Hadoop and PySpark.
· Knowledge of designing and building big data tools and frameworks.
· Demonstrated creativity and self-sufficiency, along with strong interpersonal/collaborative skills and experience working in global teams.
· Understanding of the various ML model deployment systems and processes, with a basic knowledge of model regulatory and governance policies.
· Ability to prioritize well, communicate clearly and compellingly, and understand how to drive a high level of focus and excellence with strong, talented, opinionated engineering, UX and QA teams.
· Knowledge of notebook-based IDEs for performing AI/ML tasks, such as Jupyter, and of Airflow.
· Familiarity with product management tools such as Rally, JIRA, and Confluence.
· Excellent verbal and written communication skills.
· Undergraduate/Master's degree in Computer Science / Information Technology / Mathematics from institutes of global repute.

Primary Job Location: Gurugram. Hybrid – depending on business requirements.

We back you with benefits that support your holistic well-being so you can be and deliver your best.
This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
• Competitive base salaries
• Bonus incentives
• Support for financial well-being and retirement
• Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
• Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
• Generous paid parental leave policies (depending on your location)
• Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
• Free and confidential counseling support through our Healthy Minds program
• Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.

Posted 2 days ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Role: Data Engineer
Experience: 7+ Years
Mode: Hybrid

Key Responsibilities:
• Design and implement enterprise-grade Data Lake solutions using AWS (e.g., S3, Glue, Lake Formation).
• Define data architecture patterns, best practices, and frameworks for handling large-scale data ingestion, storage, compute and processing.
• Optimize cloud infrastructure for performance, scalability, and cost-effectiveness.
• Develop and maintain ETL pipelines using tools such as AWS Glue or similar platforms; manage CI/CD pipelines in DevOps.
• Create and manage robust Data Warehousing solutions using technologies such as Redshift.
• Ensure high data quality and integrity across all pipelines.
• Design and deploy dashboards and visualizations using tools like Tableau, Power BI, or Qlik.
• Collaborate with business stakeholders to define key metrics and deliver actionable insights.
• Implement best practices for data encryption, secure data transfer, and role-based access control.
• Lead audits and compliance certifications to maintain organizational standards.
• Work closely with cross-functional teams, including Data Scientists, Analysts, and DevOps engineers.
• Mentor junior team members and provide technical guidance for complex projects.
• Partner with stakeholders to define and align data strategies that meet business objectives.

Qualifications & Skills:
• Strong experience in building Data Lakes using the AWS cloud platform tech stack.
• Proficiency with AWS technologies such as S3, EC2, Glue/Lake Formation (or EMR), QuickSight, Redshift, Athena, Airflow (or Lambda + Step Functions + EventBridge), Data and IAM.
• Expertise in AWS tools for Data Lake storage, compute, security and data governance.
• Advanced skills in ETL processes, SQL (e.g., Cloud SQL, Aurora, Postgres), NoSQL databases (e.g., DynamoDB, MongoDB, Cassandra) and programming languages (e.g., Python, Spark, or Scala).
• Experience with real-time streaming applications, preferably in Spark, Kafka, or other streaming platforms.
• AWS data security: good understanding of concepts such as Lake Formation, IAM, service roles, encryption, KMS, and Secrets Manager.
• Hands-on experience with Data Warehousing solutions and modern architectures like Lakehouses or Delta Lake.
• Proficiency in visualization tools such as Tableau, Power BI, or Qlik.
• Strong problem-solving skills and the ability to debug and optimize applications for performance.
• Strong understanding of databases/SQL for database operations and data management.
• Familiarity with CI/CD pipelines and version control systems like Git.
• Strong understanding of Agile methodologies and working within scrum teams.

Preferred Qualifications:
• Bachelor of Engineering degree in Computer Science, Information Technology, or a related field.
• AWS Certified Solutions Architect – Associate (required).
• Experience with Agile/Scrum methodologies and design sprints.
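Several of the AWS skills above (Glue, Athena, Lake Formation) revolve around partitioned lake storage. A tiny sketch of Hive-style partition keys, the object layout that lets query engines prune partitions instead of scanning the whole lake (illustrative function and names, not an AWS API):

```python
from datetime import date

def partition_key(table, event_date, filename):
    """Build a Hive-style object key (table/dt=YYYY-MM-DD/file). Engines
    like Athena can then prune by the dt= partition in WHERE clauses."""
    return f"{table}/dt={event_date.isoformat()}/{filename}"

key = partition_key("orders", date(2024, 5, 1), "part-0000.parquet")
print(key)  # orders/dt=2024-05-01/part-0000.parquet
```

A query filtered to one `dt=` value then touches only that prefix, which is a major driver of both performance and cost on S3-backed lakes.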

Posted 2 days ago

Apply

5.0 years

0 Lacs

Delhi, India

On-site

Job Title: MLOps Engineer
Notice Period: Max 30 days
Location: Gurgaon/Bangalore/Noida/Pune

Job Description:
• 5+ years of prior experience in Data Engineering and MLOps.
• 3+ years of strong exposure to deploying and managing data science pipelines in production environments.
• Strong proficiency in the Python programming language.
• Experience with Spark/PySpark and distributed computing frameworks.
• Hands-on experience with CI/CD pipelines and automation tools.
• Experience deploying a use case in production leveraging Generative AI, involving prompt engineering and the RAG framework.
• Familiarity with Kafka or similar messaging systems.
• Strong problem-solving skills and the ability to iterate and experiment to optimize AI model behavior.
• Excellent problem-solving skills and attention to detail.
• Ability to communicate effectively with diverse clients/stakeholders.

Education Background: Bachelor's or master's degree in Computer Science, Engineering, or a related field. Tier I/II candidates preferred.
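The Generative AI requirement pairs prompt engineering with a RAG framework. The retrieve-then-generate shape can be sketched with a toy keyword retriever (invented documents; real systems use embedding search and an LLM for the generation step):

```python
# Hypothetical knowledge-base snippets standing in for a document store.
docs = {
    "refunds": "Refunds are processed within 5 business days.",
    "shipping": "Orders ship within 24 hours of payment.",
}

def retrieve(query, docs, k=1):
    """Rank documents by word overlap with the query and keep the top k.
    A production RAG system would use embeddings and a vector index here."""
    q = set(query.lower().split())
    def score(text):
        return len(q & set(text.lower().split()))
    return sorted(docs.values(), key=score, reverse=True)[:k]

def build_prompt(query, docs):
    """Prompt engineering step: ground the model in retrieved context only."""
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

prompt = build_prompt("how fast are refunds processed", docs)
print(prompt)
```

The prompt that reaches the LLM then contains only the retrieved passage, which is the mechanism RAG uses to keep answers grounded.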

Posted 2 days ago

Apply

10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About the role We’re looking for Senior Engineering Manager to lead our Data / AI Platform and MLOps teams at slice. In this role, you’ll be responsible for building and scaling a high-performing team that powers data infrastructure, real-time streaming, ML enablement, and data accessibility across the company. You'll partner closely with ML, product, platform, and analytics stakeholders to build robust systems that deliver high-quality, reliable data at scale. You will drive AI initiatives to centrally build AP platform and apps which can be leveraged by various functions like legal, CX, product in a secured manner This is a hands-on leadership role perfect for someone who enjoys solving deep technical problems while growing people and teams. What You Will Do Lead and grow the data platform pod focused on all aspects of data (batch + real-time processing, ML platform, AI tooling, Business reporting, Data products – enabling product experience through data) Maintain hands-on technical leadership - lead by example through code reviews, architecture decisions, and direct technical contribution Partner closely with product and business stakeholders to identify data-driven opportunities and translate business requirements into scalable data solutions Own the technical roadmap for our data platform including infra modernization, performance, scalability, and cost efficiency Drive the development of internal data products like self-serve data access, centralized query layers, and feature stores Build and scale ML infrastructure with MLOps best practices including automated pipelines, model monitoring, and real-time inference systems Lead AI platform development for hosting LLMs, building secure AI applications, and enabling self-service AI capabilities across the organization Implement enterprise AI governance including model security, access controls, and compliance frameworks for internal AI applications Collaborate with engineering leaders across backend, ML, and 
security to align on long-term data architecture. Establish and enforce best practices around data governance, access controls, and data quality. Ensure regulatory compliance with GDPR, PCI-DSS, and SOX through automated compliance monitoring and secure data pipelines. Implement real-time data processing for fraud detection and risk management with end-to-end encryption and audit trails. Coach engineers and team leads through regular 1:1s, feedback, and performance conversations.

What You Will Need: 10+ years of engineering experience, including 2+ years managing data or infra teams, with proven hands-on technical leadership. Strong stakeholder management skills, with experience translating business requirements into data solutions and identifying product enhancement opportunities. Strong technical background in data platforms, cloud infrastructure (preferably AWS), and distributed systems. Experience with tools like Apache Spark, Flink, EMR, Airflow, Trino/Presto, Kafka, and Kubeflow/Ray, plus the modern stack: dbt, Databricks, Snowflake, Terraform. Hands-on experience building AI/ML platforms, including MLOps tools, and experience with LLM hosting, model serving, and secure AI application development. Proven experience improving performance, cost, and observability in large-scale data systems. Expert-level cloud platform knowledge with container orchestration (Kubernetes, Docker) and Infrastructure-as-Code. Experience with real-time streaming architectures (Kafka, Redpanda, Kinesis). Understanding of AI/ML frameworks (TensorFlow, PyTorch), LLM hosting platforms, and secure AI application development patterns. Comfort working in fast-paced, product-led environments, with the ability to balance innovation and regulatory constraints. Bonus: experience with data security and compliance (PII/PCI handling), LLM infrastructure, and fintech regulations.

Life at slice: Life so good, you’d think we’re kidding: Competitive salaries. Period.
Extensive medical insurance that looks out for our employees and their dependents. We’ll love you and take care of you, our promise. Flexible working hours. Just don’t call us at 3 AM, we like our sleep schedule. Tailored vacation and leave policies so that you enjoy every important moment in your life. A reward system that celebrates hard work and milestones throughout the year. Expect a gift coming your way anytime you kill it here. Learning and upskilling opportunities. Seriously, not kidding. Good food, games, and a cool office to make you feel at home. An environment so good, you’ll forget the term “colleagues can’t be your friends”.

Posted 2 days ago

Apply

0 years

0 Lacs

India

Remote

Ignite the Future: Google Ads Specialist for NIO’s Global EV Campaign iProspect and Power NIO’s Electric Revolution! Role : Google Ads Specialist / Performance Marketing Partner Location : Fully Remote – Open to Global Talent Engagement : Project-Based or Monthly Retainer Industry : Electric Vehicles / Smart Mobility / Automotive Application : Submit CV and portfolio via LinkedIn Why This Opportunity? iProspect is partnering with NIO , a trailblazer in smart electric vehicles, to drive a high-octane digital growth campaign across multiple global markets. We’re on the hunt for a Google Ads Specialist or performance agency to fuel NIO’s expansion with electrifying campaigns that spark lead generation, test drive signups, and pre-orders. If you’re passionate about performance marketing and ready to accelerate a world-class EV brand, this is your chance to make an impact! What You’ll Do As our Google Ads Specialist, you’ll take the wheel of high-stakes campaigns, driving results for NIO’s cutting-edge EV lineup. Your mission includes: ⚡️ Launch High-Voltage Campaigns : Design, manage, and optimize Google Ads across Search, Performance Max, Display, and YouTube, targeting key markets like India, UAE, France, Singapore, Germany, and the USA. ⚡️ Target EV Enthusiasts : Conduct in-depth keyword and audience research to reach high-intent EV buyers. ⚡️ Maximize Performance : Optimize for ROAS, CPA, and conversions using smart bidding and advanced attribution strategies. ⚡️ Collaborate on Creativity : Partner with our creative team to craft compelling ad copy, landing pages, and visuals that convert. ⚡️ Deliver Actionable Insights : Provide clear, data-driven performance reports with recommendations to keep campaigns charging forward. Who We’re Looking For We need a performance marketing pro who thrives on delivering results. 
Here’s what we’re seeking: ✅ Proven Expertise : Hands-on experience managing Google Ads budgets of $5,000+/month, with a strong portfolio in e-commerce, mobility, or similar industries. ✅ Technical Mastery : Deep knowledge of Performance Max, smart bidding, and attribution models. ✅ Strategic Mindset : Ability to analyze data and optimize campaigns for maximum impact. ✅ Bonus Points : Experience in EV, clean tech, or automotive sectors, and fluency in additional languages beyond English. ✅ Communication Skills : Fluent in English, with the ability to collaborate seamlessly with global teams. Why You’ll Love This Project 💰 Competitive Compensation : Flexible pay structure (fixed fee or percentage of ad spend), with performance bonuses tied to KPIs like ROAS and lead quality. 🌍 Global Impact : Work on a high-profile campaign for a leading EV brand across multiple dynamic markets. 🚀 Creative Freedom : Collaborate with a world-class team to shape innovative campaigns for a forward-thinking industry. ⚙️ Cutting-Edge Tools : Access top-tier platforms to elevate your performance marketing game. Ready to Accelerate NIO’s Growth? We’re looking for bold, results-driven talent to join this electrifying project! To apply, send us: 📄 Your CV, highlighting experience with high-budget campaigns. 📊 A portfolio showcasing your best work in performance marketing. 📩 Submit via LinkedIn or the provided email address : hr@iprospectcreative.com About iProspect & NIO iProspect is a global growth partner for innovative brands, specializing in market expansion, creative strategy, and performance marketing. Our collaboration with NIO , a pioneer in smart electric vehicles, is redefining mobility through bold campaigns and cutting-edge technology. Let’s charge toward a sustainable future together! Apply now and let’s make sparks fly!

Posted 2 days ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Join us as a Solution Architect. This is an opportunity for an experienced Solution Architect to help us define the high-level technical architecture and design for a key data analytics and insights platform that powers the personalised customer engagement initiatives of the business. You’ll define and communicate a shared technical and architectural vision of end-to-end designs that may span multiple platforms and domains. Take on this exciting new challenge and hone your technical capabilities while advancing your career and building your network across the bank. We're offering this role at vice president level.

What you'll do: We’ll look to you to influence and promote collaboration across platform and domain teams on the solution delivery. Partnering with platform and domain teams, you’ll elaborate the solution and its interfaces, validating technology assumptions, evaluating implementation alternatives, and creating the continuous delivery pipeline. You’ll also provide analysis of options and deliver end-to-end solution designs using the relevant building blocks, as well as producing designs for features that allow frequent incremental delivery of customer value.
On Top Of This, You’ll Be: Owning the technical design and architecture development that aligns with bank-wide enterprise architecture principles, security standards, and regulatory requirements. Participating in activities to shape requirements, validating designs and prototypes to deliver change that aligns with the target architecture. Promoting adaptive design practices to drive collaboration of feature teams around a common technical vision using continuous feedback. Making recommendations on potential impacts of the latest technology and customer trends to existing and prospective customers. Engaging with the wider architecture community within the bank to ensure alignment with enterprise standards. Presenting solutions to governance boards and design review forums to secure approvals. Maintaining up-to-date architectural documentation to support audits and risk assessments.

The skills you'll need: As a Solution Architect, you’ll bring expert knowledge of application architecture, along with business, data, or infrastructure architecture, and working knowledge of industry architecture frameworks such as TOGAF or ArchiMate. You’ll also need an understanding of Agile and contemporary methodologies, with experience of working in Agile teams. A certification in cloud solutions such as AWS Solutions Architect is desirable, while an awareness of agentic AI application architectures using LLMs (such as OpenAI's models) and agentic frameworks like LangGraph and CrewAI will be advantageous.
Furthermore, You’ll Need: Strong experience in solution design, enterprise architecture patterns, and cloud-native applications, including the ability to produce multiple views to highlight different architectural concerns. Familiarity with big data processing in the banking industry. Hands-on experience with AWS services, including but not limited to S3, Lambda, EMR, DynamoDB, and API Gateway. An understanding of big data processing using frameworks or platforms like Spark, EMR, Kafka, Apache Flink, or similar. Knowledge of real-time data processing, event-driven architectures, and microservices. A conceptual understanding of data modelling and analytics, machine learning, or deep-learning models. The ability to communicate complex technical concepts clearly to peers and leadership-level colleagues.
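The event-driven, serverless pattern named above (a Lambda function behind API Gateway) can be sketched in a few lines. This is an illustrative stub only: the handler signature follows AWS Lambda's Python convention, but the event shape and field names here are invented for the demo.

```python
import json

def lambda_handler(event, context):
    """Minimal AWS-Lambda-style handler: read a customer id from an
    API Gateway proxy event body (illustrative schema) and echo it back."""
    body = json.loads(event.get("body") or "{}")
    customer_id = body.get("customer_id")
    if customer_id is None:
        return {"statusCode": 400,
                "body": json.dumps({"error": "customer_id required"})}
    return {"statusCode": 200,
            "body": json.dumps({"customer_id": customer_id, "status": "ok"})}

# Local invocation with a fake API Gateway proxy event:
resp = lambda_handler({"body": json.dumps({"customer_id": 42})}, None)
print(resp["statusCode"])  # 200
```

Keeping the handler a pure function like this makes it easy to unit-test locally before deploying behind the gateway.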

Posted 2 days ago

Apply

8.0 - 12.0 years

0 Lacs

Vadodara, Gujarat, India

On-site

Responsibilities / Tasks Responsible for production planning and materials management with an aim to meet customer delivery schedule with optimum lead-time, inventory and utilization of workshop capacity. Define project schedule according to established manufacturing sequence and lead time. Periodic progress review, monitoring, tracking and updating projects progress as per plan. Proactively identify schedule and cost variations and take necessary actions. Organize and manage review meetings with internal stakeholders, group customers and manage correspondence. Identifying and resolving issues that arise during the project lifecycle. Monitor and align availability of inputs (drawing & materials) as per workshop loading plan. Advance planning and procurement of long-lead items to meet customer delivery schedule. Sub-contracting planning and procurement as per delivery schedule. Study build package/drawing set and define procurement strategy for all materials and accordingly define the material master in SAP. Study build package/drawing set and create multi-level manufacturing bill of material (BOM) in SAP according to procurement strategy and manufacturing sequence. Create Project, WBS structure and generate demands in SAP. Do material requirement planning (MRP) and generate purchase requisitions and planned orders. Release production orders for in-house manufacturing items. Allocation of available materials to project and utilization of inventory. Manage revision of build package and accordingly update the schedule, BOM, production orders and timely communicate to all stakeholders. Establish and monitor SAP parameters including safety stock and maintain optimum inventory of raw material and long lead items to achieve customer delivery requirements. Co-ordination with cross functions for smooth execution of assigned projects. Packing and dispatch planning and preparation of related documents. 
Contribute to various organization initiatives related to Lean, 5S, SOC, BBS, ISO, Digitalization, New Product Development, Lead Time Reduction etc. Maintain trustworthy relationships with all stakeholders and group customers. Experience and knowledge of SS equipment fabrication for Dairy, Pharma and Food applications. Your Profile / Qualifications Degree or Diploma in Mechanical/Fabrication/Production Engineering with 8 to 12 years of experience preferably in production planning in fabrication industries. Broad knowledge and understanding of production planning and materials management in project driven make to order manufacturing environment. Working knowledge of project planning software MS Project and SAP PP, PS & MM Modules. Should be familiar with operational excellence tools like Lean, 5S, Gemba, Kaizen and ISO 9001, 14001 & 45001. Should have the ability to manage assigned projects / tasks independently. Positive mindset, quick learner, team player and customer centric approach. Strong analytical and problem-solving skills. Strong communication skills in English. Did we spark your interest? Then please click apply above to access our guided application process.

Posted 2 days ago

Apply

0.0 - 2.0 years

0 - 0 Lacs

Delhi, Delhi

On-site

We're Hiring: IT Recruiter (2 to 5 Years Experience) Are you a tech-savvy recruiter with a passion for finding the right talent in a fast-paced IT world? We’re looking for someone just like you! What You’ll Do: * Partner with hiring managers to understand job requirements and team dynamics * Source & screen candidates via LinkedIn, portals, referrals, and internal databases * Conduct initial technical assessments for role suitability * Build strong pipelines across key tech domains: Programming: Java, Python, .NET, JavaScript, Node.js, React, Angular Cloud: AWS, Azure, GCP DevOps: Jenkins, Docker, Kubernetes, Terraform, Ansible Data: SQL, NoSQL, Hadoop, Spark, Power BI, Tableau ERP/CRM: SAP, Salesforce Testing: Manual, Automation, Selenium, API Others: Finacle, Murex, Oracle, Unix, PLSQL * Coordinate interviews & ensure smooth candidate experience * Maintain ATS records accurately * Share market insights with hiring managers * Constantly refine sourcing strategies based on trends and data What We’re Looking For: * Bachelor’s degree (technical background a plus) * 2 to 5 years of IT recruitment experience (corporate/agency) * Strong knowledge of tech stacks & IT hiring practices * Excellent communication & stakeholder management * A sharp eye for both technical and cultural fit * Proficiency in ATS, job portals, and LinkedIn Recruiter Apply Now: hr@virtueevarsity.com / 9958100227 Let’s connect and build something impactful together! Job Type: Permanent Pay: ₹30,000.00 - ₹40,000.00 per month Benefits: Health insurance Provident Fund Application Question(s): Work Location - Bangalore, Bhopal and Delhi Experience: IT Recruiter: 2 years (Required) Work Location: In person

Posted 2 days ago

Apply

7.0 years

0 Lacs

Gurgaon Rural, Haryana, India

On-site

Minimum of 7+ years of experience in the data analytics field. Proven experience with Azure/AWS Databricks in building and optimizing data pipelines, architectures, and datasets. Strong expertise in Scala or Python, PySpark, and SQL for data engineering tasks. Ability to troubleshoot and optimize complex queries on the Spark platform. Knowledge of structured and unstructured data design, modelling, access, and storage techniques. Experience designing and deploying data applications on cloud platforms such as Azure or AWS. Hands-on experience in performance tuning and optimizing code running in Databricks environments. Strong analytical and problem-solving skills, particularly within Big Data environments. Experience with Big Data management tools and technologies including Cloudera, Python, Hive, Scala, Data Warehouse, Data Lake, AWS, Azure. Technical and Professional Skills: Must Have: Excellent communication skills with the ability to interact directly with customers. Azure/AWS Databricks. Python / Scala / Spark / PySpark. Strong SQL and RDBMS expertise. HIVE / HBase / Impala / Parquet. Sqoop, Kafka, Flume. Airflow.
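The SQL and query-tuning work this role describes runs on Spark/Databricks, but the shape of a typical analytics aggregation can be shown with the stdlib sqlite3 module standing in for the cluster engine. The table and column names are made up for illustration.

```python
import sqlite3

# sqlite3 stands in for Spark SQL here; "events" is a hypothetical table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO events VALUES (?, ?)",
                 [(1, 10.0), (1, 5.0), (2, 7.5)])

# Total spend per user: the kind of aggregation a tuned Spark job would
# run over billions of rows instead of three.
totals = conn.execute(
    "SELECT user_id, SUM(amount) FROM events GROUP BY user_id ORDER BY user_id"
).fetchall()
print(totals)  # [(1, 15.0), (2, 7.5)]
```

On a real cluster the optimization work is about how this query executes (partitioning, shuffles, caching), not the SQL itself, which is why reading the engine's query plan is usually the first troubleshooting step.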

Posted 2 days ago

Apply

5.0 - 7.0 years

25 - 28 Lacs

Pune, Maharashtra, India

On-site

Job Description We are looking for a Big Data Engineer who will build and manage Big Data pipelines that handle the huge structured data sets we use as input to accurately generate analytics at scale for our valued customers. The primary focus will be on choosing optimal solutions for these purposes, then implementing, maintaining, and monitoring them. You will also be responsible for integrating them with the architecture used across the company. Core Responsibilities Design, build, and maintain robust data pipelines (batch or streaming) that process and transform data from diverse sources. Ensure data quality, reliability, and availability across the pipeline lifecycle. Collaborate with product managers, architects, and engineering leads to define technical strategy. Participate in code reviews, testing, and deployment processes to maintain high standards. Own smaller components of the data platform or pipelines and take end-to-end responsibility. Continuously identify and resolve performance bottlenecks in data pipelines. Take initiative and proactively pick up new technologies, working as a senior individual contributor across the multiple products and features we have. Required Qualifications 5 to 7 years of experience in Big Data or data engineering roles. JVM-based languages like Java or Scala are preferred; for someone with solid Big Data experience, Python is also acceptable. Proven, demonstrated experience working with distributed Big Data tools and processing frameworks like Apache Spark or equivalent (for processing), Kafka or Flink (for streaming), and Airflow or equivalent (for orchestration). Familiarity with cloud platforms (e.g., AWS, GCP, or Azure), including services like S3, Glue, BigQuery, or EMR. Ability to write clean, efficient, and maintainable code. Good understanding of data structures, algorithms, and object-oriented programming.
Tooling & Ecosystem Use of version control (e.g., Git) and CI/CD tools. Experience with data orchestration tools (Airflow, Dagster, etc.). Understanding of file formats like Parquet, Avro, ORC, and JSON. Basic exposure to containerization (Docker) or infrastructure-as-code (Terraform is a plus). Skills: airflow,pipelines,data engineering,scala,ci,python,flink,aws,data orchestration,java,kafka,gcp,parquet,orc,azure,cd,dagster,ci/cd,git,avro,terraform,json,docker,apache spark,big data
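The batch extract-transform-load flow this posting centers on can be sketched framework-free. In production each stage would be a Spark or Flink job orchestrated by Airflow; here every function and record is a hypothetical stand-in so the flow stays runnable and testable without a cluster.

```python
# A toy batch pipeline: each stage is a plain function so the flow is
# testable locally. In production these would be distributed jobs.
def extract():
    # Pretend these rows came from S3 / Kafka / a warehouse table.
    return [
        {"user": "a", "amount": "10"},
        {"user": "b", "amount": "oops"},   # malformed record
        {"user": "a", "amount": "5"},
    ]

def transform(rows):
    # Keep only rows whose amount parses; cast types.
    clean = []
    for row in rows:
        try:
            clean.append({"user": row["user"], "amount": float(row["amount"])})
        except ValueError:
            continue  # a real pipeline would route this to a dead-letter sink
    return clean

def load(rows, sink):
    # Aggregate per user into the sink dict (stand-in for a warehouse table).
    for row in rows:
        sink[row["user"]] = sink.get(row["user"], 0.0) + row["amount"]
    return sink

warehouse = {}
load(transform(extract()), warehouse)
print(warehouse)  # {'a': 15.0}
```

Separating the stages like this is also what makes "owning a smaller component end-to-end" practical: each function can be tested, monitored, and optimized independently.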

Posted 2 days ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

About Us: Paytm is India's leading mobile payments and financial services distribution company. Pioneer of the mobile QR payments revolution in India, Paytm builds technologies that help small businesses with payments and commerce. Paytm’s mission is to serve half a billion Indians and bring them to the mainstream economy with the help of technology. Job Summary: Build systems for collection and transformation of complex data sets for use in production systems. Collaborate with engineers on building and maintaining back-end services. Implement data schema and data management improvements for scale and performance. Provide insights into key performance indicators for the product and customer usage. Serve as the team's authority on data infrastructure, privacy controls, and data security. Collaborate with appropriate stakeholders to understand user requirements. Support efforts for continuous improvement, metrics, and test automation. Maintain operations of the live service as issues arise, on a rotational, on-call basis. Verify whether the data architecture meets security and compliance requirements and expectations. Should be a fast learner, able to adapt at a rapid pace. Core skills: Java/Scala, SQL. Minimum Qualifications: Bachelor's degree in computer science, computer engineering, or a related field, or equivalent experience. 3+ years of progressive experience demonstrating strong architecture, programming, and engineering skills. Firm grasp of data structures and algorithms, with fluency in programming languages like Java, Python, and Scala. Strong SQL skills, with the ability to write complex queries. Strong experience with orchestration tools like Airflow. Demonstrated ability to lead, partner, and collaborate cross-functionally across many engineering organizations. Experience with streaming technologies such as Apache Spark, Kafka, and Flink. Backend experience including Apache Cassandra, MongoDB, and relational databases such as Oracle and PostgreSQL. Solid hands-on AWS/GCP experience (4+ years).
Strong communication and soft skills. Knowledge and/or experience with containerized environments, Kubernetes, and Docker. Experience implementing and maintaining highly scalable microservices using REST, Spring Boot, and gRPC. An appetite for trying new things and building rapid POCs. Key Responsibilities: Design, develop, and maintain scalable data pipelines to support data ingestion, processing, and storage. Implement data integration solutions to consolidate data from multiple sources into a centralized data warehouse or data lake. Collaborate with data scientists and analysts to understand data requirements and translate them into technical specifications. Ensure data quality and integrity by implementing robust data validation and cleansing processes. Optimize data pipelines for performance, scalability, and reliability. Develop and maintain ETL (Extract, Transform, Load) processes using tools such as Apache Spark, Apache NiFi, or similar technologies. Monitor and troubleshoot data pipeline issues, ensuring timely resolution and minimal downtime. Implement best practices for data management, security, and compliance. Document data engineering processes, workflows, and technical specifications. Stay up-to-date with industry trends and emerging technologies in data engineering and big data. Compensation: If you are the right fit, we believe in creating wealth for you. With an enviable 500 mn+ registered users, 25 mn+ merchants, and depth of data in our ecosystem, we are in a unique position to democratize credit for deserving consumers and merchants – and we are committed to it. India’s largest digital lending story is brewing here. It’s your opportunity to be a part of the story!
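The "robust data validation and cleansing" responsibility above can be illustrated with a small schema check. The field names and rules here are invented for the example; a real pipeline would drive them from a schema registry or data contract.

```python
# Minimal record validation: required fields, type coercion, and a
# rejects list for auditing. The schema is hypothetical.
REQUIRED = ("txn_id", "amount")

def validate(records):
    accepted, rejected = [], []
    for rec in records:
        # Required fields must be present and non-empty.
        if not all(rec.get(f) not in (None, "") for f in REQUIRED):
            rejected.append(rec)
            continue
        # Coerce amount to float; unparseable values are rejected, not dropped.
        try:
            rec = {**rec, "amount": float(rec["amount"])}
        except (TypeError, ValueError):
            rejected.append(rec)
            continue
        accepted.append(rec)
    return accepted, rejected

good, bad = validate([
    {"txn_id": "t1", "amount": "99.5"},
    {"txn_id": "t2"},                       # missing amount
    {"txn_id": "t3", "amount": "NaNish"},   # unparseable amount
])
print(len(good), len(bad))  # 1 2
```

Keeping the rejected records rather than silently dropping them is what makes the cleansing step auditable downstream.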

Posted 2 days ago

Apply

6.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

About the Role: This position requires someone to work on complex technical projects, closely with peers, in an innovative and fast-paced environment. For this role, we require someone with a strong product design sense who specializes in Hadoop and Spark technologies. Requirements: Minimum 6-8 years of experience in Big Data technologies. The position: Grow our analytics capabilities with faster, more reliable tools, handling petabytes of data every day. Brainstorm and create new platforms that can help in our quest to make data available to cluster users in all shapes and forms, with low latency and horizontal scalability. Diagnose and fix problems across the entire technical stack. Design and develop a real-time events pipeline for data ingestion for real-time dashboarding. Develop complex and efficient functions to transform raw data sources into powerful, reliable components of our data lake. Design and implement new components and various emerging technologies in the Hadoop ecosystem, and ensure the successful execution of various projects. Be a brand ambassador for Paytm – Stay Hungry, Stay Humble, Stay Relevant! Preferred Qualification: Bachelor's/Master's Degree in Computer Science or equivalent. Skills that will help you succeed in this role: Strong hands-on experience with Hadoop, MapReduce, Hive, Spark, PySpark, etc. Excellent programming/debugging skills in Python/Java/Scala. Experience with any scripting language such as Python, Bash, etc. Good to have: experience working with NoSQL databases like HBase and Cassandra. Hands-on programming experience with multithreaded applications. Good to have: experience with databases, SQL, and messaging queues like Kafka. Good to have: experience developing streaming applications, e.g. Spark Streaming, Flink, Storm, etc. Good to have: experience with AWS and cloud technologies such as S3. Experience with caching architectures like Redis, etc.
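The real-time events pipeline for dashboarding described above boils down to windowed aggregation over an event stream. Below is a stdlib sketch of a tumbling count window; in production this would be a Spark Structured Streaming or Flink window operator, and the events and timestamps here are invented for the demo.

```python
from collections import Counter

def tumbling_counts(events, window_s=60):
    """Count events per tumbling window. Each event is (epoch_seconds, key).
    A toy stand-in for Spark Streaming / Flink window operators."""
    counts = Counter()
    for ts, key in events:
        # Floor the timestamp to the start of its window.
        window_start = ts - (ts % window_s)
        counts[(window_start, key)] += 1
    return counts

# Four events spanning two one-minute windows:
events = [(0, "click"), (10, "click"), (65, "click"), (70, "view")]
counts = tumbling_counts(events)
print(counts[(0, "click")], counts[(60, "click")], counts[(60, "view")])  # 2 1 1
```

The per-window counts are exactly what a real-time dashboard would poll; the hard parts a streaming engine adds on top are late/out-of-order events, watermarks, and fault-tolerant state.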
Why join us: Because you get an opportunity to make a difference, and have a great time doing that. You are challenged and encouraged here to do stuff that is meaningful for you and for those we serve. You should work with us if you think seriously about what technology can do for people. We are successful, and our successes are rooted in our people's collective energy and unwavering focus on the customer, and that's how it will always be. To know more about the exciting work we do: https://paytm.com/blog/engineering/ Compensation: If you are the right fit, we believe in creating wealth for you. With an enviable 500 mn+ registered users, 21 mn+ merchants, and depth of data in our ecosystem, we are in a unique position to democratize credit for deserving consumers and merchants – and we are committed to it. India’s largest digital lending story is brewing here. It’s your opportunity to be a part of the story!

Posted 2 days ago

Apply

6.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Data Engineering – Technical Lead About Us: Paytm is India’s leading digital payments and financial services company, which is focused on driving consumers and merchants to its platform by offering them a variety of payment use cases. Paytm provides consumers with services like utility payments and money transfers, while empowering them to pay via Paytm Payment Instruments (PPI) like Paytm Wallet, Paytm UPI, Paytm Payments Bank Netbanking, Paytm FASTag and Paytm Postpaid - Buy Now, Pay Later. To merchants, Paytm offers acquiring devices like Soundbox, EDC, QR and Payment Gateway, where payment aggregation is done through PPI and also other banks’ financial instruments. To further enhance merchants’ business, Paytm offers merchants commerce services through advertising and the Paytm Mini app store. Operating on this platform leverage, the company then offers credit services such as merchant loans, personal loans and BNPL, sourced by its financial partners. About the Role: This position requires someone to work on complex technical projects, closely with peers, in an innovative and fast-paced environment. For this role, we require someone with a strong product design sense who specializes in Hadoop and Spark technologies. Requirements: Minimum 6+ years of experience in Big Data technologies. The position: Grow our analytics capabilities with faster, more reliable tools, handling petabytes of data every day. Brainstorm and create new platforms that can help in our quest to make data available to cluster users in all shapes and forms, with low latency and horizontal scalability. Diagnose and fix problems across the entire technical stack. Design and develop a real-time events pipeline for data ingestion for real-time dashboarding. Develop complex and efficient functions to transform raw data sources into powerful, reliable components of our data lake.
Design and implement new components and various emerging technologies in the Hadoop ecosystem, and ensure the successful execution of various projects. Be a brand ambassador for Paytm – Stay Hungry, Stay Humble, Stay Relevant! Skills that will help you succeed in this role: Strong hands-on experience with Hadoop, MapReduce, Hive, Spark, PySpark, etc. Excellent programming/debugging skills in Python/Scala. Experience with AWS services such as S3, EMR, Glue, Athena, etc. Experience with Kafka. Experience with SQL. Experience with Jira, Bitbucket, and Jenkins. Experience with any scripting language such as Python, Bash, etc. Good to have: experience working with NoSQL databases like HBase and Cassandra. Good to have: hands-on programming experience with multithreaded applications. Good to have: experience developing streaming applications, e.g. Spark Streaming, Flink, Storm, etc. Why join us: Because you get an opportunity to make a difference, and have a great time doing that. You are challenged and encouraged here to do stuff that is meaningful for you and for those we serve. You should work with us if you think seriously about what technology can do for people. We are successful, and our successes are rooted in our people's collective energy and unwavering focus on the customer, and that's how it will always be. Compensation: If you are the right fit, we believe in creating wealth for you. With an enviable 500 mn+ registered users, 21 mn+ merchants, and depth of data in our ecosystem, we are in a unique position to democratize credit for deserving consumers and merchants – and we are committed to it. India’s largest digital lending story is brewing here. It’s your opportunity to be a part of the story!

Posted 2 days ago

Apply

1.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Be the spark that brightens days and ignite your career with TTEC’s award-winning employment experience. As a Chat Customer Service Representative working on site in Ahmedabad, Gujarat (Opp. L.J. Group of Institutes, Off S.G. Highway, Makarba), you’ll be a part of bringing humanity to business. #experienceTTEC Apply in-person for immediate interview - Monday to Friday - 10:30 AM to 4:00 PM Interested in Relocating? Virtual interviews accepted as well What You’ll Be Doing Do you have a passion for helping others and giving them peace of mind? In this role, you'll work to resolve customer issues via chat services including chat, text, email, social media, direct messaging as well as other nonverbal platforms. Whether it’s getting answers for customers quickly, consulting on products with compassion or resolving their issues with a smile, you’ll be the difference between their customer experience being just average or an exceptional one You'll report to Team Lead. You’ll contribute to the success of the customer experience as well as the overall success of the team. 
During a Typical Day, You’ll Answer incoming communications from customers Connect and resolve issues with customers using written communication only What You Bring To The Role 1 year or more customer service experience – Freshers welcome to apply Great written communication skills including grammar and spelling High School Diploma Computer savvy Flexibility to work in a 24/7 environment What You Can Expect Knowledgeable, encouraging, supporting and present leadership Diverse and community minded organization Career-growth and lots of learning opportunities for aspiring minds And yes...all the competitive compensation, performance bonus opportunities, and benefits you'd expect and maybe a few that would pleasantly surprise you A Bit More About Your Role We’ll train you to be a subject matter expert in your field, so you can be confident in providing the highest level of service possible whether through voice, chat or email interactions. We trust you already have the necessary ingredient that can’t be taught – a caring and supportive nature that will shine through as you help customers. You’ll also have a chance to make great new friends within the TTEC community and grow your career in a dynamic, family-friendly atmosphere. About TTEC Our business is about making customers happy. That’s all we do. Since 1982, we’ve helped companies build engaged, pleased, profitable customer experiences powered by our combination of humanity and technology. On behalf of many of the world’s leading iconic and disruptive brands, we talk, message, text, and video chat with millions of customers every day. These exceptional customer experiences start with you.
TTEC is proud to be an equal opportunity employer where all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. TTEC embraces and is committed to building a diverse and inclusive workforce that respects and empowers the culture and perspectives within our global teams. We strive to reflect the communities we serve by not only delivering amazing service and technology, but also humanity. We make it a point to make sure all our employees feel valued and comfortable being their authentic selves at work. As a global company, we know diversity is our strength. It enables us to view projects and ideas from different vantage points and allows every individual to bring value to the table in their own unique way. Primary Location India-Gujarat-Ahmedabad Job _Customer Care Representative

Posted 2 days ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Greetings from BCforward INDIA TECHNOLOGIES PRIVATE LIMITED.

Contract To Hire (C2H) Role
Location: Gurgaon
Payroll: BCforward
Work Mode: Hybrid

JD Skills: Big Data; ETL - Big Data / Data Warehousing; GCP; Adobe Experience Manager (AEM)
Primary Skills: GCP, Adobe suite (AEP, CJA, CDP), SQL, Big Data, Python
Secondary Skills: Airflow, Hive, Spark, Unix shell scripting, data warehousing concepts

Please share your updated resume, PAN card soft copy, passport-size photo & UAN history. Interested applicants can send an updated resume to g.sreekanth@bcforward.com.
Note: Looking for immediate to 30-day joiners at most. All the best 👍

Posted 2 days ago

Apply

0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

About Us:
Planet Spark is reshaping the EdTech landscape by equipping kids and young adults with future-ready skills like public speaking and more. We're on a mission to spark curiosity, creativity, and confidence in learners worldwide. If you're passionate about meaningful impact, growth, and innovation—you're in the right place.

Location: Gurgaon (On-site)
Experience Level: Entry to Early Career (Freshers welcome!)
Shift Options: Domestic | Middle East | International
Working Days: 5 days/week (Wednesday & Thursday off) | Weekend availability required
Target Joiners: Any (Bachelor’s or Master’s)

🔥 What You'll Be Owning (Your Impact):
Lead Activation: Engage daily with high-intent leads through dynamic channels—calls, video consults, and more.
Sales Funnel Pro: Own the full sales journey—from first hello to successful enrollment.
Consultative Selling: Host personalized video consultations with parents/adult learners, pitch trial sessions, and resolve concerns with clarity and empathy.
Target Slayer: Consistently crush weekly revenue goals and contribute directly to Planet Spark’s growth engine.
Client Success: Ensure a smooth onboarding experience and transition for every new learner.
Upskill Mindset: Participate in hands-on training, mentorship, and feedback loops to constantly refine your game.

💡 Why Join Sales at Planet Spark?
Only Warm Leads: Skip the cold calls—our leads already know us and have completed a demo session.
High-Performance Culture: Be part of a fast-paced, energetic team that celebrates success and rewards hustle.
Career Fast-Track: Unlock rapid promotions, performance bonuses, and leadership paths.
Top-Notch Training: Experience immersive onboarding, live role-plays, and access to ongoing L&D programs.
Rewards & Recognition: Weekly shoutouts, cash bonuses, and exclusive events to celebrate your wins.
Make Real Impact: Help shape the minds of tomorrow while building a powerhouse career today.
🎯 What You Bring to the Table:
Communication Powerhouse: You can build trust and articulate ideas clearly in both spoken and written formats.
Sales-Driven: You know how to influence decisions, navigate objections, and close deals with confidence.
Empathy First: You genuinely care about clients’ goals and tailor your approach to meet them.
Goal-Oriented: You’re self-driven, proactive, and hungry for results.
Tech Fluent: Comfortable using CRMs, video platforms, and productivity tools.

✨ What’s in It for You?
💼 High-growth sales career with serious earning potential
🌱 Continuous upskilling in EdTech, sales, and communication
🧘 Supportive culture that values growth and well-being
🎯 Opportunity to work at the cutting edge of education innovation

Posted 2 days ago

Apply

10 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: Lead Platform Engineer – AWS Data Platform
Location: Hybrid – Hyderabad, Telangana
Experience: 10+ years
Employment Type: Full-Time

About the Role
Infoslab is hiring on behalf of our client, a leading healthcare technology company committed to transforming healthcare through data. We are seeking a Lead Platform Engineer to architect, implement, and lead the development of a secure, scalable, and cloud-native data platform on AWS. This role combines deep technical expertise with leadership responsibilities. You will build the foundation that supports critical business intelligence, analytics, and machine learning applications across the organization.

Key Responsibilities
Architect and build a highly available, cloud-native data platform using AWS services such as S3, Glue, Redshift, Lambda, and ECS.
Design reusable platform components and frameworks to support data engineering, analytics, and ML pipelines.
Build and maintain CI/CD pipelines, GitOps workflows, and infrastructure-as-code using Terraform.
Drive observability, operational monitoring, and incident response processes across environments.
Ensure platform security, compliance (HIPAA, SOC 2), and audit-readiness in partnership with InfoSec.
Lead and mentor a team of platform engineers, promoting best practices in DevOps and cloud infrastructure.
Collaborate with cross-functional teams to deliver reliable and scalable data platform capabilities.

Required Skills and Experience
10+ years of experience in platform engineering, DevOps, or infrastructure roles with a data focus.
3+ years in technical leadership or platform engineering management.
Deep experience with AWS services, including S3, Glue, Redshift, Lambda, ECS, and Athena.
Strong hands-on experience with Python or Scala, and automation tooling.
Proficiency with Terraform and CI/CD tools (GitHub Actions, Jenkins, etc.).
Advanced knowledge of Apache Spark for both batch and streaming workloads.
Proven track record of building secure, scalable, and compliant infrastructure.
Strong understanding of observability, reliability engineering, and infrastructure automation.

Preferred Qualifications
Experience with containerization and orchestration (Docker, Kubernetes).
Familiarity with Data Mesh principles or domain-driven data platform design.
Background in healthcare or other regulated industries.
Experience integrating data platforms with BI tools like Tableau or Looker.

Why Join
Contribute to a mission-driven client transforming healthcare through intelligent data platforms.
Lead high-impact platform initiatives that support diagnostics, research, and machine learning.
Work with modern engineering practices including IaC, GitOps, and serverless architectures.
Be part of a collaborative, hybrid work culture focused on innovation and technical excellence.
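For a flavor of the infrastructure-as-code work this role describes, here is a minimal Terraform sketch of an encrypted, versioned S3 bucket of the kind a data platform's raw zone might use. The bucket name, resource labels, and settings are illustrative assumptions, not taken from the posting; provider configuration is assumed elsewhere.

```hcl
# Hypothetical raw-zone bucket for a data platform (name is illustrative).
resource "aws_s3_bucket" "raw_zone" {
  bucket = "example-data-platform-raw"
}

# Default server-side encryption with KMS, a common baseline for
# compliance-sensitive (e.g. HIPAA) data at rest.
resource "aws_s3_bucket_server_side_encryption_configuration" "raw_zone" {
  bucket = aws_s3_bucket.raw_zone.id
  rule {
    apply_server_side_encryption_by_default {
      sse_algorithm = "aws:kms"
    }
  }
}

# Versioning helps with audit-readiness and accidental-overwrite recovery.
resource "aws_s3_bucket_versioning" "raw_zone" {
  bucket = aws_s3_bucket.raw_zone.id
  versioning_configuration {
    status = "Enabled"
  }
}
```

In practice a platform team would wrap resources like these in a reusable module so each data domain gets the same security defaults without copy-pasting.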

Posted 2 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies