0.0 - 4.0 years
0 Lacs
karnataka
On-site
As a Sales Operations Intern at Lifesight, you will play a crucial role in supporting our Sales Ops team by assisting in data enrichment, cleaning, and maintenance across various tools. This position offers a valuable opportunity to gain hands-on experience in sales operations, CRM management, and data optimization within a dynamic SaaS startup environment.

Your key responsibilities will include updating and verifying records in CRM and other sales tools to enhance data enrichment, as well as cleaning and standardizing data to ensure accuracy and usability. You will identify and eliminate duplicate, outdated, or incorrect data, while also supporting the Sales Ops team in maintaining and optimizing sales databases. Additionally, you will be involved in generating and analyzing reports to enhance data quality and collaborating with Sales, Marketing, and Operations teams to uphold data integrity.

We are seeking individuals who are currently pursuing or have recently completed a degree in Business, Marketing, Data Analytics, or a related field. Attention to detail, proficiency in Excel/Google Sheets, and an analytical mindset with a problem-solving approach are essential qualities for this role. Experience with CRM tools such as Salesforce, HubSpot, or similar platforms is advantageous. The ability to work independently, manage time effectively, and a keen interest in sales operations and SaaS business models are desirable traits we are looking for in potential candidates.

In this role, you will gain practical experience in sales operations and data management within a SaaS startup environment. You will be exposed to CRM tools and best practices in data enrichment, working closely with experienced Sales Ops professionals. By contributing to improving sales processes and insights, you will have the opportunity to make a real impact. Based on performance, there is potential for a full-time opportunity within the company.
If you are detail-oriented, eager to learn, and have a genuine interest in sales operations, we encourage you to apply and be part of our team at Lifesight.
Posted 1 week ago
3.0 - 8.0 years
3 - 7 Lacs
hyderabad, telangana, india
On-site
Tech Stalwart Solution Private Limited is looking for a Sr. Data Engineer to join our dynamic team and embark on a rewarding career journey.
Responsibilities:
- Liaising with coworkers and clients to clarify the requirements for each task.
- Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed.
- Reformulating existing frameworks to optimize their functioning.
- Testing such structures to ensure that they are fit for use.
- Preparing raw data for manipulation by data scientists.
- Detecting and correcting errors in your work.
- Ensuring that your work remains backed up and readily accessible to relevant coworkers.
- Remaining up to date with industry standards and technological advancements that will improve the quality of your outputs.
Posted 1 week ago
3.0 - 8.0 years
6 - 7 Lacs
chennai, tamil nadu, india
On-site
Tech Stalwart Solution Private Limited is looking for a Data Engineer to join our dynamic team and embark on a rewarding career journey.
Responsibilities:
- Liaising with coworkers and clients to clarify the requirements for each task.
- Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed.
- Reformulating existing frameworks to optimize their functioning.
- Testing such structures to ensure that they are fit for use.
- Preparing raw data for manipulation by data scientists.
- Detecting and correcting errors in your work.
- Ensuring that your work remains backed up and readily accessible to relevant coworkers.
- Remaining up to date with industry standards and technological advancements that will improve the quality of your outputs.
Posted 1 week ago
6.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
The Precision Medicine technology team at Amgen is dedicated to developing Data Searching, Cohort Building, and Knowledge Management tools to provide visibility to Amgen's extensive human datasets, projects, study histories, and scientific findings. The team focuses on managing multiomics data, clinical study subject measurements, images, and specimen inventory data. The PMED capabilities play a crucial role in Amgen's mission to accelerate discovery and speed to market of advanced precision medications.

As a Solution and Data Architect, you will be responsible for designing an enterprise analytics and data mastering solution using Databricks and Power BI. This role demands expertise in data architecture and analytics to create scalable, reliable, and high-performing solutions for research cohort-building and advanced research pipelines. The ideal candidate will have experience in creating unified repositories of human data from multiple sources. Collaboration with stakeholders from data engineering, business intelligence, and IT teams is essential to design and implement data models, integrate data sources, and ensure data governance and security best practices. The role requires a strong background in data warehousing, ETL, Databricks, Power BI, and enterprise data mastering.

Key Responsibilities:
- Architect scalable enterprise analytics solutions using Databricks, Power BI, and modern data tools.
- Utilize data virtualization, ETL, and semantic layers to balance unification, performance, and data transformation.
- Support development planning by aligning features with architectural direction.
- Participate in pilots and proofs-of-concept for new patterns.
- Document architectural direction, patterns, and standards.
- Train engineers and collaborators on architecture strategy and patterns.
- Collaborate with data engineers to build and optimize ETL pipelines.
- Design robust data models and processing layers for analytical processing and reporting.
- Implement data governance, security, and compliance best practices.
- Integrate data systems with enterprise applications for seamless data flow.
- Provide thought leadership on data architecture and advanced analytics.
- Develop and maintain optimized Power BI solutions.
- Serve as a subject matter expert on Power BI and Databricks.
- Define data requirements, architecture specifications, and project goals.
- Evaluate and adopt new technologies to enhance data solutions.

Basic Qualifications and Experience:
- Master's degree with 6-8 years of experience OR Bachelor's degree with 8-10 years of experience OR Diploma with 10-12 years of experience in data management and solution architecture.

Functional Skills:
Must-Have Skills:
- Hands-on experience with BI solutions and CDC ETL pipelines.
- Expertise in Power BI, Databricks, and cloud platforms.
- Strong communication and collaboration skills.
- Ability to assess business needs and align solutions.

Good-to-Have Skills:
- Experience with human healthcare data and clinical trial data management.

Professional Certifications:
- ITIL Foundation or relevant certifications (preferred).
- SAFe Agile Practitioner.
- Microsoft Certified: Data Analyst Associate or related certification.
- Databricks Certified Professional or similar certification.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Intellectual curiosity and self-motivation.
- Strong communication skills.
- Confidence as a technical leader.
- Ability to work effectively in global, virtual teams.
- Strong problem-solving and analytical skills.
Posted 2 weeks ago
7.0 - 12.0 years
12 - 16 Lacs
gurugram
Work from Office
Job Title: Senior Manager Data Analytics (Tableau & Database Management)
Location: Gurgaon, India

About the Role
We are seeking a highly skilled Senior Manager Data Analytics with strong expertise in Tableau, database management, data optimization, and dashboarding. The ideal candidate will have 5+ years of hands-on experience, excellent problem-solving skills, and a strong mathematical background. This role requires both technical proficiency and team management experience, as the person will lead a team of analysts to drive data insights and business impact.

Key Responsibilities
- Lead the end-to-end development of interactive Tableau dashboards and data visualization solutions.
- Manage and optimize databases, data flows, and ETL processes to ensure data accuracy, performance, and scalability.
- Partner with business stakeholders to gather requirements, design KPIs, and deliver actionable insights.
- Ensure data quality, governance, and optimization across all reporting layers.
- Provide leadership and mentorship to a team of analysts, fostering a culture of continuous learning and collaboration.
- Solve complex business problems using data-driven approaches and advanced mathematical/statistical techniques.
- Drive process improvements by identifying gaps, optimizing workflows, and implementing automation where possible.

Required Skills & Experience
- 5+ years of experience in data analytics with strong expertise in Tableau (dashboarding, visualization, calculations, advanced charts), with a minimum of 7 years of overall experience.
- Proven experience in managing large databases, SQL queries, ETL pipelines, and data optimization.
- Strong background in mathematics, statistics, and analytical problem solving.
- Prior team management experience with the ability to lead, coach, and motivate analysts.
- Hands-on experience with data modeling, performance tuning, and automation.
- Excellent communication skills with the ability to translate data insights into business impact.
Good to Have
- Experience with Python/R for analytics and automation.
- Contact Centre reporting.
- Knowledge of cloud platforms (AWS, GCP, or Azure) for data management.
- Exposure to BI tools apart from Tableau (Power BI, QlikView, etc.).

Education
Bachelor's or Master's degree in Mathematics, Statistics, Computer Science, Engineering, or a related field.

Contact Person
HR Supriya - 9289327281
Posted 2 weeks ago
10.0 - 12.0 years
0 Lacs
gurugram, haryana, india
On-site
Job Description
As a Senior Component Engineer, you will be responsible for the strategic management of mechanical components, ensuring their standardization, reuse, and data integrity throughout their lifecycle. This role requires a unique blend of deep mechanical component expertise, robust data architecture knowledge for Product Data Management (PDM) systems such as SAP and Teamcenter, and strong project management skills to lead digital initiatives. You will play a critical role in defining and implementing data models and workflows that support the efficient creation, management, and retrieval of product data. Your ability to lead cross-functional teams and drive digital transformation will be essential in optimizing our product data management strategy and enhancing our overall product development processes.

Essential Duties & Responsibilities
- Serve as the Data Architect for mechanical component data within our PDM/PLM systems (e.g., SAP, Teamcenter), designing, developing, and maintaining robust data models, schemas, and attributes for mechanical components.
- Ensure data quality, consistency, and traceability across the product lifecycle, and define and implement data governance policies and procedures for mechanical component data.
- Collaborate with IT and other engineering teams to integrate component data seamlessly across various enterprise systems (ERP, CAD, etc.).
- Identify opportunities for data optimization, automation, and digital solutions to improve data access, analysis, and utilization for mechanical components.
- Oversee data migration and synchronization efforts related to mechanical component information.
- Lead the development and execution of component reuse strategies for mechanical commodities to consolidate part usage and reduce complexity.
- Manage supply chain and environmental risks, cost, lead time, and site-specific requirements, ensuring accurate and timely updates to component rankings and status.
- Oversee the tactical execution of all mechanical commodity-related activities, including part qualification, supplier corrective action, alternate sourcing, and change management.
- Provide technical expertise and guidance on mechanical component design, analysis, and testing to engineering teams.
- Lead and manage digital solution projects related to product data management and lifecycle management, from concept to implementation.
- Define project scope, objectives, deliverables, and success criteria in collaboration with stakeholders across engineering, supply chain, and IT.
- Develop detailed project plans, timelines, resource allocation, and budget management for digital transformation initiatives.
- Implement and champion project management methodologies (e.g., Agile, Scrum) to ensure efficient and effective project execution.
- Foster a culture of continuous improvement, identifying opportunities to leverage digital technologies for enhanced component engineering processes.
- Possess a strong understanding of software development principles to effectively collaborate with development teams on digital solutions and understand technical constraints and possibilities.

Qualifications
Required Experience & Education
- Bachelor's Degree in Engineering (Manufacturing, Mechanical, Electrical, Industrial) or equivalent; minimum of ten (10) years of directly relevant experience required.
- Practical experience in capturing value for commodities such as sheet metal, metal machining, plastic machining, electromechanical, and hardware.
- Thorough understanding of mechanical commodities, manufacturing, materials, and processes required.
- Ability to read and interpret product BOMs and drawings.
- Solid experience in project management, leading complex initiatives from inception to completion, preferably in a digital transformation context.
- Familiarity with software development methodologies (e.g., Agile, Scrum) and a good understanding of the software development lifecycle.
- Strong analytical and problem-solving skills, with the ability to translate business requirements into technical solutions.
- Ability to demonstrate problem-solving and critical-thinking skills.
- Strong knowledge of Six Sigma and Lean principles.

Additional Details
Excellent communication and influencing skills required. This job has a full-time weekly schedule. Our pay ranges are determined by role, level, and location. Within the range, individual pay is determined by work location and additional factors, including job-related skills, experience, and relevant education or training. During the hiring process, a recruiter can share more about the specific pay range for a preferred location. Pay and benefit information by country are available at: https://careers.agilent.com/locations

Agilent Technologies Inc. is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability or any other protected categories under all applicable laws.

Travel Required: Occasional
Shift: Day
Duration: No End Date
Job Function: Manufacturing
Posted 3 weeks ago
2.0 - 6.0 years
0 Lacs
delhi
On-site
The Supplier Optimization Manager position at TravelBullz requires a highly analytical and commercially minded individual to lead API-driven supplier performance and commercial efficiency strategies. Your role will involve vendor analysis, real-time data optimization, and destination performance mapping to ensure TravelBullz maximizes the value from every API connection.

Your key responsibilities will include monitoring and optimizing supplier performance through real-time data analysis, focusing on Look-to-Book ratios, availability health, conversion rates, and error mapping. You will also be responsible for conducting API buying optimization by reviewing competitive pricing, content quality, and connectivity health. Furthermore, you will manage and continuously refine supplier mapping to ensure that the best-performing suppliers serve the highest-demand routes and destinations. Additionally, you will analyze best-selling destinations and inventory gaps to reallocate traffic and improve coverage. Collaboration with product, tech, and commercial teams is essential to implement supplier rule engines, fallback logic, and prioritization strategies. You will also lead supplier QBRs with actionable insights based on KPIs like fill-rate, uptime, and quote speed, and support negotiation of commercial terms aligned with performance metrics and strategic goals.

To qualify for this role, you should have a Bachelor's degree in Business, Supply Chain, Data Analytics, or a related field (Master's preferred) and at least 2 years of experience in travel tech, OTA, wholesaler, or API-based B2B optimization. A strong understanding of supplier API structures, cache/feed performance, and commercial logic is required. Experience in building or optimizing supplier distribution matrices, destination-supplier matching, or fallback strategies is also beneficial. Proficiency in Excel, BI tools (e.g., Power BI, Elastic), and fluency in English are essential.
Experience with rate caching, deduplication engines, or travel data platforms is a bonus. In return, TravelBullz offers a competitive salary with performance bonus, medical benefits, flexible working hours, regional exposure, cross-functional collaboration, and fast-track leadership potential in a growing digital enterprise. If you are interested in this opportunity, please share your resume at hr@travelbullz.com with the subject line "Supplier Optimization Manager".
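The Look-to-Book ratio mentioned in the listing is a standard travel-industry KPI: availability searches ("looks") divided by confirmed bookings, where a lower ratio means a supplier's API traffic converts more efficiently. A minimal sketch of how such a metric could drive supplier ranking is below; the supplier names, counters, and ranking rule are invented for illustration and are not TravelBullz internals.

```python
# Illustrative sketch: ranking supplier APIs by Look-to-Book ratio.
# All data and names are hypothetical.

def look_to_book(searches: int, bookings: int) -> float:
    """Look-to-Book ratio: availability searches per confirmed booking."""
    if bookings == 0:
        return float("inf")  # no bookings at all: worst possible ratio
    return searches / bookings

def rank_suppliers(stats: dict[str, dict[str, int]]) -> list[str]:
    """Rank suppliers by ascending Look-to-Book (lower = more efficient)."""
    return sorted(
        stats,
        key=lambda s: look_to_book(stats[s]["searches"], stats[s]["bookings"]),
    )

stats = {
    "supplier_a": {"searches": 50_000, "bookings": 100},  # L2B = 500
    "supplier_b": {"searches": 30_000, "bookings": 150},  # L2B = 200
}
print(rank_suppliers(stats))  # supplier_b ranks first
```

In practice the same ranking would also weigh fill-rate, uptime, and quote speed, as the listing notes; this sketch isolates only the ratio itself.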
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
maharashtra
On-site
You will be heading the Data Analytics and AI team within the Enterprise Technology Group with a primary purpose of leading and developing a robust team focused on driving the company towards a data-first and data-driven organization. Your role will involve delivering and maintaining smart data-driven solutions essential for growth and operational efficiency. Accelerating the adoption and implementation of AI solutions to enhance innovation and competitive advantage will be a key aspect. Managing all aspects of new data analytics and AI projects from inception to delivery, including day-to-day operations, will be crucial, with a strong emphasis on assessing business value, ROI, and cost reduction. The ideal candidate should hold a Bachelor's or Master's degree in Computer Science, Math, Quantitative Methods, or Information Systems, with over 10 years of overall technology experience, including substantial experience in data analytics and AI. Proficiency in programming languages such as Python, R, or similar, as well as a deep understanding of algorithms like linear regression, logistic regression, decision trees, random forest, boosting algorithms (e.g., XGBoost), k-means, hierarchical clustering, and principal component analysis is preferred. Familiarity with data science concepts, statistical and mathematical methods, and various Data & AI technologies/tools is essential. Located in Mumbai, this is a full-time office-based position within the Enterprise Technology Group, reporting to the Head of the same group. The key responsibilities will include leading and developing a strong data analytics and AI team, establishing best practices for data analytics and responsible AI initiatives, managing all stages of AI and LLM projects, collaborating with stakeholders to gather requirements, and implementing advanced models and algorithms to solve complex business problems. 
Staying updated with the latest advancements in AI and LLM technologies will also be a critical aspect of the role. Key behavioral strengths required for this position include excellent analytical, problem-solving, and decision-making skills, proven leadership abilities, effective communication and collaboration skills, and the capacity to work with diverse stakeholders. Joining CulverMax Entertainment Pvt Ltd will offer you the opportunity to work with some of India's leading entertainment channels and OTT platforms, alongside a progressive and inclusive work culture that celebrates diversity and innovation. As a part of the team, you will contribute to creating exceptional content and experiences while being recognized as an Employer of Choice in various prestigious awards and accolades.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
ahmedabad, gujarat
On-site
You are a creative and data-driven Content Strategist responsible for leading the planning, development, and execution of content across various platforms. Your role involves aligning content with brand goals, managing content calendars, and ensuring consistency in tone and messaging to drive engagement, reach, and conversions. Your key responsibilities will include developing and managing a comprehensive content strategy for web, social media, blogs, email, and other digital channels. This will involve conducting audience research, competitor analysis, and SEO audits to inform content planning. Collaboration with design, marketing, and product teams is essential to deliver cohesive messaging. You will also create and manage content calendars that align with campaigns, launches, and business goals. As a Content Strategist, you will supervise writers, freelancers, and content creators to ensure content quality and consistency. Tracking performance using tools like Google Analytics, SEMrush, or HubSpot will be crucial, enabling you to optimize content based on data-driven insights. Additionally, staying updated on industry trends and emerging content formats is necessary to enhance content strategy effectiveness. If you are interested in this role, please share your CV at info@xcelhrsolutions.com.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
You should have at least 4 years of experience in Power BI and Tableau, specializing in data modeling for reporting and data warehouses. Your expertise should include a thorough understanding of the BI process and excellent communication skills. Your responsibilities will involve Tableau Server management, including installation, configuration, and administration to maintain consistent availability and performance. You must showcase your ability to design, develop, and optimize data models for large datasets and complex reporting requirements. Strong analytical and debugging skills are essential to identify, analyze, and resolve issues within Power BI reports, SQL code, and data for accurate and efficient performance. Proficiency in DAX and Power Query, along with advanced knowledge of data modeling concepts, is required. Additionally, you should possess strong SQL skills for querying, troubleshooting, and data manipulation. Security implementation is crucial, as you will be responsible for managing user permissions and roles to ensure data security and compliance with organizational policies. A good understanding of ETL processes and in-depth knowledge of Power BI Service, Tableau Server, and Desktop are expected. Your familiarity with workspaces, datasets, dataflows, and security configurations will be beneficial in fulfilling your role effectively.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
As the leader of the data analytics and AI team, you will play a crucial role in driving our organization towards a data-first and data-driven approach. Your primary responsibility will be to deliver and maintain smart data-driven solutions that are essential for our growth and daily operations. By accelerating the adoption of AI solutions, you will enhance efficiency, foster innovation, and gain a competitive advantage for the company. Your role will involve overseeing all aspects of new data and AI projects, from their inception to delivery, with a strong emphasis on assessing business value, ROI, and cost reduction.

Your key responsibilities will include leading and developing a robust data analytics and AI team, establishing best practices and methodologies for data analytics and responsible AI initiatives, managing and overseeing AI and LLM projects, building relationships with business process owners, collaborating with internal and external stakeholders, developing advanced models and algorithms, addressing data mining performance issues, and staying updated on the latest advancements in AI and LLM technologies to improve project outcomes.

Preferred skills for this role include proficiency in programming languages such as Python, R, or similar, and an understanding of algorithms like linear regression, logistic regression, decision trees, random forest, boosting algorithms (e.g., XGBoost), clustering techniques, and principal component analysis. Additionally, familiarity with data science concepts including statistics, probability, data transformation, visualization, storytelling, and deep learning, as well as expertise in statistical and mathematical methods and data & AI technologies/tools like ML Ops, ETL, data lakes, graph databases, NLP, and data optimization, will be beneficial for success in this position.
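Of the algorithms the listing names, linear regression is the simplest to make concrete. For a single feature, ordinary least squares has the closed form slope = cov(x, y) / var(x), intercept = mean(y) − slope · mean(x). The sketch below is purely illustrative (plain Python, no library), not a description of any tooling used by the hiring company.

```python
# Minimal illustration of one-feature ordinary least squares using the
# closed-form solution slope = cov(x, y) / var(x).

def fit_ols(xs: list[float], ys: list[float]) -> tuple[float, float]:
    """Return (slope, intercept) minimizing squared error for y ~ slope*x + intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    slope = cov / var
    intercept = my - slope * mx
    return slope, intercept

# Perfectly linear data: y = 2x + 1, so OLS should recover those coefficients.
slope, intercept = fit_ols([1, 2, 3, 4], [3, 5, 7, 9])
print(slope, intercept)  # 2.0 1.0
```

Production work would of course reach for a library (e.g., scikit-learn) and handle multiple features, regularization, and validation; the point here is only the underlying formula.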
Posted 1 month ago
2.0 - 6.0 years
1 - 2 Lacs
Chennai
Work from Office
Skilled in tool selection, insert grades, and cutting data optimization
Strong in G-code, M-code & CNC programming
Proficient in 2-axis to multi-axis and sliding-head CNC lathes
Knowledge of lean, Kaizen & 5S practices
Create programs (Fanuc, Siemens)
Posted 2 months ago
7.0 - 11.0 years
0 Lacs
haryana
On-site
As a Data Engineering Manager at GoKwik, you will have the exciting opportunity to lead a team of data engineers and collaborate closely with product managers, data scientists, business intelligence teams, and SDEs to design and implement data-driven strategies across the organization. You will be responsible for designing the overall data architecture that drives valuable insights for the company. Your key responsibilities will include leading and guiding the data engineering team in developing optimal data strategies according to business needs, identifying and implementing process improvements to enhance data models, architectures, pipelines, and applications, ensuring data optimization processes, managing data governance, security, and analysis, as well as hiring and mentoring top talent within the team. Additionally, you will play a crucial role in managing data delivery through high-performing dashboards, visualizations, and reports, ensuring data quality and security across various product verticals, designing and launching new data models and pipelines, acting as a project manager for data projects, and fostering a data-driven culture within the team. To excel in this role, you should possess a Bachelor's/Master's degree in Computer Science, Mathematics, or relevant field, along with at least 7 years of experience in Data Engineering. Strong project management skills, proficiency in SQL and relational databases, experience in building data pipelines and architectures, familiarity with data transformation processes, and working knowledge of AWS cloud services are essential requirements for this role. We are seeking individuals who are independent, resourceful, analytical, and adept at problem-solving, with the ability to thrive in a fast-paced and dynamic environment. Excellent communication skills, both verbal and written, are crucial for effective collaboration within cross-functional teams. 
If you are looking to be part of a high-growth startup that values innovation, talent, and customer-centricity, and if you are passionate about tackling challenging problems and making a significant impact within an entrepreneurial setting, we invite you to join our team at GoKwik!
Posted 2 months ago
2.0 - 6.0 years
0 Lacs
kolkata, west bengal
On-site
You are a Data Engineer with 2 to 4 years of experience in Python and PL/SQL. Your primary responsibility is to design, develop, and maintain data pipelines, ETL processes, and database solutions. You will be working on ETL Development & Data Processing, where you will develop, optimize, and maintain ETL pipelines for data ingestion, transformation, and integration. You will handle structured and semi-structured data from various sources and implement data cleansing, validation, and enrichment processes using Python and PL/SQL. In Database Development & Optimization, you will write, debug, and optimize complex SQL queries, stored procedures, functions, and triggers in PL/SQL. Additionally, you will design and maintain database schemas, indexing strategies, and partitioning for performance optimization, ensuring data consistency, quality, and governance across all data sources. Your role also involves Data Engineering & Automation, where you will automate data workflows using Python scripts and scheduling tools like Airflow, Cron, or DBMS_JOB. You will optimize query performance, troubleshoot database-related performance issues, and monitor data pipelines for failures while implementing alerting mechanisms. Collaboration & Documentation are crucial aspects of your job. You will closely collaborate with Data Analysts, Architects, and Business teams to understand data requirements. Documenting ETL processes, database schemas, and data flow diagrams will be part of your responsibilities. You will also participate in code reviews, testing, and performance tuning activities. Your Technical Skills should include strong experience in Python for data processing (Pandas, NumPy, PySpark), expertise in PL/SQL, hands-on experience with ETL tools, and knowledge of relational and non-relational databases. Exposure to Cloud & Big Data technologies like AWS/GCP/Azure, Spark, or Snowflake will be advantageous. 
Soft skills such as problem-solving, effective communication, teamwork, and the ability to manage tasks independently are essential for this role. This is a Full-time, Permanent position with a Day shift schedule and an in-person work location.
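The cleansing, validation, and deduplication duties described in this listing can be sketched in a few lines. The record shape, field names, and rules below are invented for illustration; a real pipeline would apply the same pattern (normalize, validate, dedupe on a key) with pandas or PL/SQL as the listing describes.

```python
# Hedged sketch of an ETL cleansing step: normalize a key field, drop
# rows that fail validation, and deduplicate on the normalized key.
# Field names ("email", "id") are hypothetical examples.

def clean(records: list[dict]) -> list[dict]:
    seen: set[str] = set()
    out: list[dict] = []
    for r in records:
        email = (r.get("email") or "").strip().lower()  # normalization
        if "@" not in email:                             # validation
            continue
        if email in seen:                                # deduplication
            continue
        seen.add(email)
        out.append({**r, "email": email})
    return out

rows = [
    {"id": 1, "email": "A@x.com"},
    {"id": 2, "email": "a@x.com "},   # duplicate after normalization
    {"id": 3, "email": "bad-address"},  # fails validation
]
print(clean(rows))  # only the first record survives
```

The same logic maps directly onto a PL/SQL MERGE with a uniqueness constraint, or a pandas `drop_duplicates` after normalization.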
Posted 2 months ago
4.0 - 8.0 years
0 Lacs
noida, uttar pradesh
On-site
The ideal candidate for the Azure Data Engineer position will have 4-6 years of experience designing, implementing, and maintaining data solutions on Microsoft Azure. As an Azure Data Engineer at our organization, you will design efficient data architecture diagrams, develop and maintain data models, and ensure data integrity, quality, and security. You will also work on data processing and data integration, building data pipelines to support various business needs.

Your role will involve collaborating with product and project leaders to translate data needs into actionable projects, providing technical expertise on data warehousing and data modeling, and mentoring other developers to ensure compliance with company policies and best practices. You will be expected to maintain documentation, contribute to the company's knowledge database, and actively participate in team collaboration and problem-solving activities.

We are looking for a candidate with a Bachelor's degree in Computer Science, Information Technology, or a related field, along with proven experience as a Data Engineer focused on Microsoft Azure. Proficiency in SQL and experience with Azure data services such as Azure Data Factory, Azure SQL Database, Azure Databricks, and Azure Synapse Analytics are required. A strong understanding of data architecture, data modeling, data integration, ETL/ELT processes, and data security standards is essential, as are excellent problem-solving, collaboration, and communication skills.

As part of our team, you will work on exciting projects across industries such as high-tech, communication, media, healthcare, retail, and telecom, in a collaborative environment where you can expand your skills alongside a diverse team of talented individuals.

GlobalLogic prioritizes work-life balance and provides professional development opportunities, excellent benefits, and fun perks for its employees. Join us at GlobalLogic, a leader in digital engineering, where we help brands design and build innovative products, platforms, and digital experiences for the modern world. Headquartered in Silicon Valley, GlobalLogic operates design studios and engineering centers worldwide, serving customers in industries such as automotive, communications, financial services, healthcare, manufacturing, media and entertainment, semiconductor, and technology.
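The ETL/ELT and data-quality responsibilities described above can be illustrated with a small, tool-agnostic sketch. This is plain Python rather than Azure Data Factory or Databricks, and every function and field name here is hypothetical, chosen only to show the extract/transform/load pattern and a basic data-quality check:

```python
from datetime import date

def extract(rows):
    """Simulate pulling raw records from a source system."""
    return list(rows)

def transform(rows):
    """Standardize fields and drop records that fail a basic quality check."""
    clean = []
    for row in rows:
        email = (row.get("email") or "").strip().lower()
        if "@" not in email:
            continue  # reject records with an invalid email
        clean.append({"email": email, "loaded_on": date.today().isoformat()})
    return clean

def load(rows, warehouse):
    """Upsert into the target store, keyed on the standardized email."""
    for row in rows:
        warehouse[row["email"]] = row
    return warehouse

raw = [{"email": " Alice@Example.com "}, {"email": "not-an-email"}]
warehouse = load(transform(extract(raw)), {})
print(sorted(warehouse))  # ['alice@example.com']
```

In a real Azure pipeline the same three stages would typically map to a Data Factory copy activity, a Databricks or Synapse transformation step, and a load into Azure SQL Database.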
Posted 2 months ago
3.0 - 5.0 years
4 - 4 Lacs
Ahmedabad
Work from Office
Roles and Responsibilities:
• Supervise and guide a team of Lead Generation, Lead Acquisition, and Customer Experience executives.
• Define and track KPIs, KRAs, daily productivity, and conversion metrics.
• Oversee lead allocation, validation, deal pipeline movement, demo performance, and coordination with field sales teams.
• Monitor demos and closures, and ensure timely follow-ups on expiring subscriptions.
• Implement and standardize processes across teams as per SOPs.
• Identify gaps in lead qualification, feedback, and follow-up strategies, and work cross-functionally to fix them.
• Optimize usage of CRM and reporting tools, ensuring real-time data accuracy.
• Manage the escalation flow from Customer Experience Executives and resolve high-priority client issues.
• Prepare weekly and monthly performance dashboards across teams (leads shared, demos arranged, closures, conversion %).
• Analyze feedback, churn, and deal-loss reasons to suggest actionable insights.
• Maintain detailed logs for calls, demos, meetings, and feedback quality.
• Collaborate with the Sales Operations Manager to execute growth initiatives.
• Drive a performance-based culture with recognition, feedback, and consistent communication.
Posted 2 months ago
4.0 - 7.0 years
20 - 35 Lacs
Chennai
Remote
Role & responsibilities:
• Architect, deploy, and manage scalable cloud environments (AWS/GCP/DO) to support distributed data processing of terabyte-scale datasets and billions of records efficiently.
• Automate infrastructure provisioning, monitoring, and disaster recovery using tools like Terraform, Kubernetes, and Prometheus.
• Optimize CI/CD pipelines to ensure seamless deployment of web scraping workflows and infrastructure updates.
• Develop and maintain stealthy web scrapers using Puppeteer, Playwright, and headless Chromium browsers.
• Reverse-engineer bot-detection mechanisms (e.g., TLS fingerprinting, CAPTCHA solving) and implement evasion strategies.
• Monitor system health, troubleshoot bottlenecks, and ensure 99.99% uptime for data collection and processing pipelines.
• Implement security best practices for cloud infrastructure, including intrusion detection, data encryption, and compliance audits.
• Partner with data collection, ML, and SaaS teams to align infrastructure scalability with evolving data needs.

Preferred candidate profile:
• 4-7 years of experience in site reliability engineering and cloud infrastructure management.
• Proficiency in Python and JavaScript for scripting and automation.
• Hands-on experience with Puppeteer/Playwright, headless browsers, and anti-bot evasion techniques.
• Knowledge of networking protocols, TLS fingerprinting, and CAPTCHA-solving frameworks.
• Experience with monitoring and observability tools such as Grafana, Prometheus, and Elasticsearch, and with optimizing resource utilization in distributed systems.
• Experience with data lake architectures and optimizing storage using formats such as Parquet, Avro, or ORC.
• Strong proficiency in cloud platforms (AWS, GCP, or Azure) and containerization/orchestration (Docker, Kubernetes).
• Deep understanding of infrastructure-as-code tools (Terraform, Ansible).
• Deep experience designing resilient data systems, with a focus on fault tolerance, data replication, and disaster recovery strategies in distributed environments.
• Experience implementing observability frameworks, distributed tracing, and real-time monitoring tools.
• Excellent problem-solving abilities, with a collaborative mindset and strong communication skills.
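One resilience pattern this role touches, retrying a flaky operation with exponential backoff so that transient failures in a scraping or monitoring pipeline do not bring the job down, can be sketched in a few lines of Python. The function names here are hypothetical and the "fetch" is simulated; a real deployment would wrap an actual HTTP or browser call:

```python
import time

def with_backoff(fn, max_attempts=5, base_delay=0.01):
    """Retry fn with exponential backoff; re-raise after max_attempts failures."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))  # 0.01s, 0.02s, 0.04s, ...

calls = {"n": 0}

def flaky_fetch():
    """Simulated fetch that fails twice, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "page-content"

result = with_backoff(flaky_fetch)
print(result, calls["n"])  # page-content 3
```

The same idea scales up to the 99.99%-uptime goal in the posting: bounded retries with growing delays absorb short outages without hammering the upstream service.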
Posted 2 months ago
3 - 7 years
7 - 10 Lacs
Bengaluru
Remote
• Design and implement scalable, efficient, and high-performance data pipelines.
• Develop and optimize ETL/ELT workflows using modern tools and frameworks.
• Work with cloud platforms (AWS, Azure, GCP).
Detailed JD will be given later.
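The pipeline responsibilities above can be sketched with a minimal, framework-free composition pattern: each stage is a generator transformation, and the pipeline chains them lazily so records stream through without materializing intermediate lists. All names here are hypothetical, standing in for whatever ETL framework the role actually uses:

```python
def pipeline(source, *stages):
    """Push each record through a chain of transformation stages lazily."""
    stream = iter(source)
    for stage in stages:
        stream = stage(stream)
    return stream

def drop_nulls(rows):
    """Filter stage: discard missing records."""
    return (r for r in rows if r is not None)

def to_upper(rows):
    """Transform stage: normalize casing."""
    return (r.upper() for r in rows)

result = list(pipeline(["a", None, "b"], drop_nulls, to_upper))
print(result)  # ['A', 'B']
```

Production tools like Spark or Azure Data Factory express the same stage-chaining idea at cluster scale.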
Posted 4 months ago
8.0 - 13.0 years
7 - 17 Lacs
hyderabad, pune, bengaluru
Work from Office
Proficient in the Braze marketing platform with expertise in data model design and campaign orchestration. Strong background in lifecycle programs, personalization, and marketing data optimization, including list cleansing.
Posted Date not available