Job Title: Senior Epicor ERP Consultant
Location: Remote
Job Type: Full-Time
Required: 7+ Years in Epicor ERP
Company: BitsAtom Technology
Apply at: sonal@bitsatom.com

Job Summary:
We are seeking a highly experienced Epicor ERP professional with at least 7 years of hands-on experience implementing, customizing, and supporting Epicor ERP systems. The ideal candidate will be well-versed in Epicor modules such as Finance, Manufacturing, Supply Chain, and CRM, and will play a key role in optimizing our business processes through effective ERP solutions.

Key Responsibilities:
• Lead or support full-cycle Epicor ERP implementations, upgrades, and migrations.
• Customize Epicor ERP using BPMs, BAQs, Dashboards, and C#/.NET tools.
• Gather and analyze business requirements to design and deliver ERP solutions.
• Collaborate with cross-functional teams to enhance workflows and business processes.
• Provide ongoing technical and functional support for users across departments.
• Develop custom reports using SSRS or Crystal Reports.
• Manage system integrations with third-party tools and services.
• Maintain system documentation, change logs, and user manuals.
• Ensure data accuracy, integrity, and security across Epicor systems.
• Train users on new functionalities, best practices, and process changes.

Required Skills & Qualifications:
• Minimum 7 years of experience working with Epicor ERP (v10 or higher).
• Strong understanding of Epicor modules: Financials, Manufacturing, Supply Chain, Inventory, CRM, etc.
• Experience with BPMs, BAQs, SSRS, Dashboards, and Epicor Admin tools.
• Proficient in C#, .NET, SQL Server, and T-SQL scripting.
• Experience in data migration, EDI integration, and system customization.
• Excellent analytical, problem-solving, and communication skills.
• Ability to work independently and manage multiple projects.

Preferred Qualifications:
• Epicor Certification(s) in relevant modules.
• Experience with Epicor Kinetic is a strong plus.
• Familiarity with Power BI, REST APIs, or other reporting/BI tools.
• Background in manufacturing or distribution industries.

Why Join Us?
• Work with a team of experienced ERP professionals.
• Opportunity to lead impactful ERP initiatives.
• Remote flexibility and a growth-focused environment.
• Competitive compensation and benefits.
🎬 We're Hiring! Join Our Creative Team! 🚀

Are you a creative all-rounder with a passion for video editing, visual effects, and graphic design? We are looking for a Video & VFX Editor + Graphic Designer + General Designer to join our growing team. If you have the skills to create stunning videos, design engaging graphics, and bring ideas to life across various platforms, this role is for you!

Role: Video & Motion Graphic Editor

Responsibilities:
Video & VFX Editing:
• Edit and enhance high-quality video content, adding special effects, animations, and motion graphics
• Work with raw footage to create captivating videos, including color correction and sound editing
• Add dynamic visual effects to create professional-grade content
Graphic Design:
• Design eye-catching visuals for digital platforms (websites, social media, ads)
• Create logos, banners, infographics, and other digital assets to support brand identity
• Develop creative marketing materials such as flyers, brochures, and presentations

Key Requirements:
• Expertise in Adobe Creative Suite (Premiere Pro, After Effects, Photoshop, Illustrator, InDesign, etc.)
• Strong skills in video editing, VFX, motion graphics, and graphic design
• Experience designing for both digital and print formats
• Excellent time management skills and the ability to work on multiple projects at once
• Creative problem-solving with a keen eye for detail
• Good communication skills and the ability to collaborate with team members

Additional Requirements:
• A high-performance desktop setup will be provided by the company for remote work
• Work will be conducted remotely via TeamViewer, so comfort with remote desktop software is essential
• A high-speed internet connection is required for a smooth workflow

Payment: ₹20 - ₹33 per hour (depending on experience)
Shift: Preferably morning shift
Location: Remote (Work from anywhere!)

If you're excited to contribute your creativity across various media and be part of an innovative team, apply now! Send your portfolio and resume to aamxsolutions@gmail.com or DM for more details.
🎬 Motion Graphics Internship Opportunity – Join Our Creative Team! 🚀

Role: Video Editor + Motion Graphic Artist (with Graphic Design as an Added Advantage)
Duration: 3-Month Internship → Opportunity for Full-Time Role

Are you obsessed with motion design, VFX, and video editing? We're looking for a highly creative and technically skilled Motion Graphics Intern who can bring ideas to life using animation, visual effects, and sleek edits. If you're passionate about storytelling through motion, this internship is made for you!

🎯 Your Role:
• Animate scenes using text, icons, characters, and VFX
• Craft explainer videos, reels, ads, and content for brands
• Polish raw footage into smooth, visually striking final edits
• Collaborate with the creative team on concepts, storyboards, and design animations
• Graphic design skills are an added advantage (not mandatory)

🎨 What You'll Work On:
• Motion graphics for YouTube, Instagram, ads, and promos
• Visual storytelling with kinetic typography, transitions, and effects
• Light compositing, color grading, and sound syncing
• Supporting graphic content such as banners, thumbnails, and decks

🛠️ Tools You'll Use:
• Adobe After Effects (primary), Premiere Pro, Illustrator, Photoshop
• Access to premium AI tools to accelerate creativity

✅ Requirements:
• Solid understanding of motion design principles
• Prior experience or a portfolio of After Effects projects
• Bonus if you know basic sound design
• Strong attention to detail, pacing, and design composition
• Must be able to work remotely via remote desktop (TeamViewer)
• High-speed internet is essential

📅 Internship Details:
Stipend: ₹5,000 – ₹10,000 per month (based on skill & experience)
Timings: Morning Shift | 10:00 AM – 6:30 PM
Work Setup: Fully remote (the company provides a high-performance desktop accessed remotely)
Location: Work from anywhere

🌟 What You'll Get:
• Real-world experience working on high-impact creative projects
• Access to top-tier AI-powered creative tools
• Opportunity to earn a full-time role after the internship, based on performance
• A supportive and creative remote work culture

📩 Apply Now! Send your portfolio and resume to aamxsolutions@gmail.com or DM us for more details. Let your creativity move the world—literally! 🎞️✨
Key Responsibilities:
• Design, develop, and optimize scalable data pipelines using Databricks, PySpark, and Delta Lake.
• Work within a data mesh architecture to support decentralized data ownership and domain-oriented data products.
• Integrate and manage event-driven data pipelines using Azure Event Hubs, Apache Kafka, or similar technologies (a minimal ingest sketch follows this posting).
• Collaborate with Data Scientists and Analysts to support the ML lifecycle using tools like MLflow and Databricks Workflows.
• Implement secure and governed data access with tools like Unity Catalog.
• Design and maintain dashboards and reports using Power BI or other visualization tools.
• Automate infrastructure deployment and management using Terraform or other Infrastructure-as-Code (IaC) solutions.
• Ensure high data quality, reliability, and performance of all data systems.
• Collaborate cross-functionally with product, engineering, and business teams to gather requirements and deliver solutions.

Required Skills & Qualifications:
• 4+ years of hands-on experience in data engineering roles.
• Strong experience with Databricks, PySpark, and Delta Lake.
• Understanding of data mesh principles and domain-driven data ownership.
• Proficiency in working with event streaming platforms like Azure Event Hubs, Apache Kafka, or equivalents.
• Familiarity with MLflow, Databricks Workflows, and Unity Catalog.
• Experience using Power BI, Tableau, or other BI tools to create dashboards and reports.
• Knowledge of Terraform, ARM Templates, or other IaC tools.
• Strong SQL skills and experience working with large datasets and cloud data platforms (Azure preferred).
• Excellent problem-solving and communication skills.

Good to Have:
• Experience with Azure Data Factory, Synapse Analytics, or Data Lake Storage.
• Exposure to CI/CD pipelines in data workflows.
• Knowledge of data governance, security, and compliance in cloud environments.

Educational Qualification:
• Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.

Why Join Us?
• Work on cutting-edge data solutions and next-gen cloud platforms.
• Collaborate with experienced professionals in a fast-paced environment.
• Opportunities for continuous learning and development.
• Flexible work arrangements and competitive compensation.
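To illustrate the event-driven pipeline work described above, here is a minimal, hedged sketch of a Databricks Structured Streaming job that reads a Kafka topic (Azure Event Hubs exposes a Kafka-compatible endpoint) and appends the parsed events to a Delta table. The broker, topic, schema, and storage paths are hypothetical placeholders, not details from the posting.

```python
# Minimal event-driven ingest sketch for a Databricks/Spark environment.
# Broker address, topic name, schema, and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

event_schema = StructType([
    StructField("order_id", StringType()),
    StructField("status", StringType()),
    StructField("event_time", TimestampType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # hypothetical broker
    .option("subscribe", "orders")                       # hypothetical topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers bytes; cast to string and parse the JSON payload.
parsed = (
    raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# Append the parsed events to a bronze Delta table with checkpointing.
query = (
    parsed.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/orders")  # hypothetical path
    .outputMode("append")
    .start("/mnt/delta/bronze/orders")                        # hypothetical path
)
```

In practice the checkpoint location and target path would live in cloud storage governed by Unity Catalog, and the stream would be scheduled through Databricks Workflows.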
Job Title: Senior Data Engineer (Azure + Databricks)
Location: Remote / India
Job Type: Full-Time
Send Resume To: sonal@bitsatom.com
Company: BitsAtom Technologies
Experience Required: 10+ Years

Job Summary:
BitsAtom Technologies is seeking a Senior Data Engineer with extensive experience in Azure, Databricks, Python, and modern data pipeline architectures. You will play a key role in leading data initiatives, building scalable systems, and collaborating with cross-functional teams to drive analytics, reporting, and ML workloads.

Key Responsibilities:
• Design, build, and maintain scalable ETL/ELT pipelines using Azure Data Factory, Databricks, PySpark, and SQL.
• Engineer data workflows integrating structured/unstructured data from ADLS Gen2, Azure Synapse, SQL Server, and APIs.
• Optimize storage and compute performance in Databricks and Delta Lake environments (a minimal Delta upsert sketch follows this posting).
• Implement data modeling and transformation logic aligned with reporting and machine learning needs.
• Lead end-to-end solutioning in Azure with a focus on performance, reliability, and scalability.
• Collaborate with data scientists, analysts, and stakeholders to deliver validated and production-ready datasets.
• Develop and maintain CI/CD pipelines using Azure DevOps or GitHub Actions.
• Ensure compliance with data governance, privacy, and security standards.
• Monitor and troubleshoot data pipelines, ensuring high availability and performance.
• Mentor junior engineers and provide technical leadership.

Required Qualifications:
• Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent work experience).
• 7+ years of experience in data engineering, including:
  • Strong hands-on expertise in Databricks (Delta Lake, orchestration, notebooks).
  • In-depth experience with Azure Cloud Services (ADF, ADLS Gen2, Synapse, Key Vault).
  • High proficiency in Python, PySpark, and SQL.
  • Experience building lakehouse architectures and designing ETL/ELT pipelines.
• Excellent verbal and written communication skills.
• Ability to convey technical data concepts to non-technical stakeholders.

Preferred Skills & Technologies (especially relevant for Databricks-focused projects):
• Experience with data mesh architecture and distributed data ownership.
• Familiarity with event-driven architectures: Azure Event Hubs, Kafka.
• Exposure to Databricks Workflows, MLflow, Unity Catalog.
• Hands-on experience with Power BI or similar BI/visualization tools.
• Knowledge of Terraform or other Infrastructure-as-Code (IaC) tools.

Certifications (Preferred):
• Microsoft Certified: Azure Data Engineer Associate (DP-203)
• Databricks Certified Data Engineer

📩 To Apply: Send your resume to sonal@bitsatom.com
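As a companion to the Delta Lake responsibilities above, here is a minimal sketch of the kind of incremental upsert and table maintenance this role involves, using the Delta Lake Python API available on Databricks. The table paths and the customer_id key are hypothetical placeholders.

```python
# Minimal Delta Lake upsert sketch for a Databricks environment.
# Paths and column names are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("customer-upsert").getOrCreate()

# Newly landed records waiting to be merged into the curated table.
updates = spark.read.format("delta").load("/mnt/delta/staging/customers")  # hypothetical

target = DeltaTable.forPath(spark, "/mnt/delta/silver/customers")          # hypothetical

(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()      # update rows that changed
    .whenNotMatchedInsertAll()   # insert rows seen for the first time
    .execute()
)

# Periodic maintenance for read performance (Databricks SQL syntax).
spark.sql("OPTIMIZE delta.`/mnt/delta/silver/customers` ZORDER BY (customer_id)")
```

The MERGE pattern keeps the curated ("silver") table idempotent across reruns, which is what makes monitoring and replaying failed pipeline runs safe.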
Job Title: Data Engineer
Location: Remote
Job Type: Full-Time
Experience Required: 8+ Years
Company: BitsAtom Technologies
How to Apply: resourcing@bitsatom.com

Job Summary:
BitsAtom Technologies is seeking a skilled and experienced Data Engineer to join our growing team. The ideal candidate will bring 7+ years of experience in data engineering with strong expertise in Azure, SSIS, Databricks, Python, and data pipeline architecture. The role involves leading technical initiatives, building scalable and reliable data systems, and working closely with cross-functional teams to support business analytics and insights.

Key Responsibilities:
• Design, build, and maintain scalable ETL/ELT pipelines using Azure Data Factory, Databricks, PySpark, and SQL.
• Engineer data workflows integrating structured and unstructured data from Azure Data Lake, Synapse, SQL Server, and external APIs.
• Optimize data storage and compute performance in Databricks and Delta Lake environments.
• Implement data modeling and transformation logic aligned with analytics, reporting, and machine learning requirements.
• Lead end-to-end data solutioning on Azure with a focus on performance and availability.
• Collaborate with data scientists, business analysts, and stakeholders to gather requirements and deliver validated datasets.
• Develop CI/CD pipelines using Azure DevOps or GitHub Actions.
• Ensure compliance with data governance, security, and regulatory standards.
• Monitor, troubleshoot, and resolve issues in data pipelines (a minimal data-quality check sketch follows this posting).
• Mentor junior engineers and provide technical leadership.

Required Qualifications:
• Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent work experience).
• 7+ years of professional experience in data engineering.
• Deep expertise in Databricks (Delta Lake, orchestration, notebooks).
• Strong experience with Azure Cloud Services (ADF, ADLS Gen2, Key Vault).
• Proficiency in Python, PySpark, and SQL.
• Strong understanding of data lakehouses, ETL/ELT architecture, and orchestration.
• Excellent communication and collaboration skills.
• Ability to translate complex technical data concepts to business language.

Preferred Qualifications:
• Experience with data mesh architecture.
• Familiarity with event-driven systems (Azure Event Hubs, Kafka).
• Exposure to MLflow, Unity Catalog, or Databricks Workflows.
• Experience with Power BI or similar BI tools.
• Familiarity with Terraform or Infrastructure-as-Code tools.

Certifications (Preferred):
• Azure Data Engineer Associate (DP-203)
• Databricks Certified Data Engineer
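To make the monitoring and data-governance responsibilities above concrete, here is a minimal sketch of a PySpark data-quality gate that a pipeline might run before publishing a dataset. The Delta table path and the order_id key are hypothetical placeholders.

```python
# Minimal data-quality gate sketch: row-count, null-key, and duplicate-key
# checks on a curated Delta table. Path and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

df = spark.read.format("delta").load("/mnt/delta/silver/orders")  # hypothetical path

total_rows = df.count()
null_keys = df.filter(col("order_id").isNull()).count()
duplicate_keys = total_rows - df.dropDuplicates(["order_id"]).count()

failures = []
if total_rows == 0:
    failures.append("table is empty")
if null_keys > 0:
    failures.append(f"{null_keys} rows with null order_id")
if duplicate_keys > 0:
    failures.append(f"{duplicate_keys} duplicate order_id values")

if failures:
    # Failing loudly lets the orchestrator (e.g., ADF or Databricks Workflows)
    # mark the run as failed and alert the on-call engineer.
    raise ValueError("Data quality checks failed: " + "; ".join(failures))
```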
Job Title: Data Engineer (Snowflake + Databricks)
Location: Remote / India
Job Type: Full-Time
Company: BitsAtom Technologies
Experience Required: 7+ Years
Send Resume To: resourcing@bitsatom.com

Job Summary:
BitsAtom Technologies is seeking a Senior Data Engineer with extensive experience in Azure, Databricks, Snowflake, Python, and modern data pipeline architectures. You will play a key role in leading enterprise-grade data initiatives, building scalable systems, and collaborating with cross-functional teams to drive analytics, reporting, and machine learning workloads.

Key Responsibilities:
• Design, develop, and maintain scalable ETL/ELT pipelines using Azure Data Factory, Databricks, PySpark, and SQL.
• Engineer data workflows integrating structured and unstructured data from Azure Data Lake (ADLS Gen2), Azure Synapse, SQL Server, and external APIs.
• Implement data modeling and transformation logic to support analytics, reporting, and ML workloads.
• Collaborate with data scientists and business analysts to understand requirements and deliver clean, reliable, production-ready datasets.
• Optimize data storage and compute performance in Azure Databricks and Delta Lake environments.
• Develop and maintain CI/CD pipelines for data workflows using Azure DevOps or GitHub Actions.
• Monitor, troubleshoot, and resolve pipeline performance issues to ensure reliability and high availability.
• Apply best practices in data governance, security, and compliance across workflows.
• Lead end-to-end data solutioning in Azure with a focus on performance, reliability, and scalability.
• Mentor junior engineers and provide technical leadership within the team.

Required Qualifications:
• Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent work experience).
• 7+ years of experience in data engineering, including:
  • Strong hands-on expertise in Databricks (Delta Lake, orchestration, notebooks).
  • In-depth experience with Azure Cloud Services (ADF, ADLS Gen2, Synapse, Key Vault).
  • High proficiency in Python, PySpark, and SQL.
  • Experience building lakehouse architectures and designing ETL/ELT pipelines.
  • Strong background in data modeling and transformation logic.
• Excellent communication skills with the ability to convey technical concepts to non-technical stakeholders.

Preferred Skills & Technologies:
• Experience with Snowflake for scalable data warehousing (a minimal Databricks-to-Snowflake write sketch follows this posting).
• Experience with data mesh architectures and distributed data ownership.
• Familiarity with event-driven architectures: Azure Event Hubs, Kafka.
• Exposure to Databricks Workflows, MLflow, Unity Catalog.
• Hands-on experience with Power BI or other BI/visualization tools.
• Knowledge of Terraform or other Infrastructure-as-Code (IaC) tools.

Certifications (Preferred):
• Microsoft Certified: Azure Data Engineer Associate (DP-203)
• Databricks Certified Data Engineer

📩 To Apply: Send your resume to resourcing@bitsatom.com
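Since this posting pairs Databricks with Snowflake, here is a minimal sketch of publishing a curated Databricks DataFrame to a Snowflake table via the Spark-Snowflake connector bundled with Databricks runtimes. The account URL, credentials, database objects, and source path are hypothetical; in practice credentials would come from a Databricks secret scope or Azure Key Vault rather than literals.

```python
# Minimal Databricks-to-Snowflake write sketch. All connection values,
# paths, and table names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("to-snowflake").getOrCreate()

sf_options = {
    "sfUrl": "myaccount.snowflakecomputing.com",  # hypothetical account URL
    "sfUser": "etl_user",                          # hypothetical; use secrets in practice
    "sfPassword": "***",                           # hypothetical; use secrets in practice
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "ETL_WH",
}

curated = spark.read.format("delta").load("/mnt/delta/gold/daily_sales")  # hypothetical

(
    curated.write.format("snowflake")   # "net.snowflake.spark.snowflake" outside Databricks
    .options(**sf_options)
    .option("dbtable", "DAILY_SALES")
    .mode("append")
    .save()
)
```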
Playwright JavaScript Engineer with 7-8 years of experience in automation testing and test framework development. We are seeking someone who can deliver high-quality testing solutions using Playwright, JavaScript/TypeScript, and API testing tools to ensure robust and reliable software delivery.

Key Responsibilities:
• Design, develop, and maintain automated test frameworks using Playwright with JavaScript/TypeScript (a minimal test sketch follows this posting).
• Collaborate with developers, QA, and DevOps teams to ensure high-quality releases, including API automation testing using REST Assured, Postman, or Newman.
• Write, execute, and maintain automated test scripts for web applications and APIs.
• Perform functional, regression, and integration testing through automation suites.
• Debug, troubleshoot, and optimize test scripts for performance and reliability.
• Integrate automated tests into CI/CD pipelines (Jenkins, GitHub Actions, Azure DevOps, etc.).
• Participate in code reviews and drive best practices in test automation.
• Document test processes, frameworks, and results.

Required Skills:
• 4+ years of experience in software test automation with strong expertise in Playwright.
• Proficiency in JavaScript/TypeScript.
• Hands-on experience building scalable automation frameworks.
• Experience with CI/CD tools (Jenkins, GitHub Actions, Azure DevOps).
• Strong knowledge of Git and version control practices.
• Good problem-solving and debugging skills.
• Familiarity with Agile/Scrum methodologies.

Good to Have:
• Experience with other automation frameworks (Cypress, Selenium, Puppeteer).
• Strong skills in REST API testing with Postman, Newman, or REST Assured.
• Knowledge of cloud testing environments and containerization (Docker, Kubernetes).
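For a flavor of the test-automation work above, here is a minimal Playwright sketch. The role itself calls for the JavaScript/TypeScript API; this sketch uses Playwright's Python binding for consistency with the other examples in this document, and the same concepts (locators, auto-waiting, web-first assertions) carry over. The URL, field labels, and credentials are hypothetical.

```python
# Minimal Playwright end-to-end check (Python binding; the TS API mirrors it).
# Target URL, selectors, and credentials are hypothetical.
from playwright.sync_api import expect, sync_playwright

def test_login_flow() -> None:
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()

        page.goto("https://example.com/login")                 # hypothetical URL
        page.get_by_label("Email").fill("qa@example.com")      # hypothetical fields
        page.get_by_label("Password").fill("not-a-real-secret")
        page.get_by_role("button", name="Sign in").click()

        # Web-first assertion: retries until the heading appears or times out.
        expect(page.get_by_role("heading", name="Dashboard")).to_be_visible()

        browser.close()

if __name__ == "__main__":
    test_login_flow()
```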
Location: Offshore (India) / Onshore (USA)
Experience: 8+ years
Employment Type: Full-time

Overview
We are seeking an experienced Data Modeler with a strong background in healthcare data environments. The ideal candidate will have deep expertise in IBM UDMH, data modeling techniques, and healthcare data standards, and will work closely with business and technical stakeholders to ensure data structures align with organizational needs.

Required Skills & Experience
• 8+ years of experience as a Data Modeler or Data Architect.
• Strong hands-on experience with IBM UDMH in a healthcare environment.
• Expertise in ER modeling tools and methodologies.
• Proficiency in SQL with a solid understanding of relational, dimensional, and normalized modeling techniques (a generic star-schema sketch follows this posting).
• Familiarity with healthcare data standards (HL7, FHIR, ICD, CPT, etc.) is a plus.
• Knowledge of data warehousing concepts and ETL processes.
• Strong communication skills for collaboration with both business and technical stakeholders.

Nice-to-Have
• Experience with cloud data platforms (Azure, GCP).
• Exposure to big data ecosystems (Hadoop, Spark).
• Knowledge of data governance and Master Data Management (MDM) practices.

About the Work
Primary Focus Areas: Build and maintain robust healthcare data models using IBM UDMH, support enterprise data initiatives, and ensure alignment with industry standards.
Secondary Focus Areas: Support governance, collaborate on data integration projects, and contribute to cloud and big data modernization initiatives.

How to apply
Please submit your resume and a cover letter highlighting your relevant experience and skills to resourcing@bitsatom.com.
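As a small illustration of the dimensional modeling mentioned above, here is a generic, hypothetical encounter star schema expressed as SQL DDL and run against an in-memory SQLite database so the example stays self-contained. It is purely illustrative and is not the IBM UDMH model or any healthcare standard.

```python
# Minimal star-schema sketch (fact table plus two dimensions) run via SQLite.
# Entirely hypothetical; for illustration of dimensional modeling only.
import sqlite3

DDL = """
CREATE TABLE dim_patient (
    patient_key   INTEGER PRIMARY KEY,
    patient_id    TEXT NOT NULL,      -- source-system identifier
    birth_date    TEXT,
    gender        TEXT
);

CREATE TABLE dim_diagnosis (
    diagnosis_key INTEGER PRIMARY KEY,
    icd_code      TEXT NOT NULL,      -- e.g., an ICD-10 code
    description   TEXT
);

CREATE TABLE fact_encounter (
    encounter_key  INTEGER PRIMARY KEY,
    patient_key    INTEGER NOT NULL REFERENCES dim_patient(patient_key),
    diagnosis_key  INTEGER NOT NULL REFERENCES dim_diagnosis(diagnosis_key),
    encounter_date TEXT NOT NULL,
    total_charge   REAL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)
print("Star schema created:",
      [r[0] for r in conn.execute("SELECT name FROM sqlite_master WHERE type='table'")])
```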
We are seeking a highly skilled and experienced Node.js & Next.js Backend Engineer to join our dynamic team. The ideal candidate will have 5 to 9 years of experience in backend development, with a strong focus on Node.js and Next.js technologies.

Key Responsibilities:
• Develop and maintain scalable backend services using Node.js and Next.js.
• Collaborate with front-end developers to integrate user-facing elements with server-side logic.
• Ensure the performance, quality, and responsiveness of applications.
• Identify and resolve performance and scalability issues.
• Participate in code reviews and contribute to team knowledge sharing.
• Stay updated with the latest industry trends and technologies.

Required Skills and Qualifications:
• 5-9 years of experience in backend development.
• Strong proficiency in Node.js and Next.js.
• Experience with RESTful APIs and GraphQL.
• Familiarity with database technologies such as MongoDB, MySQL, or PostgreSQL.
• Understanding of security and data protection principles.
• Excellent problem-solving skills and attention to detail.
• Strong communication skills and the ability to work collaboratively in a team environment.

If you are passionate about backend development and have a keen interest in working with cutting-edge technologies, we would love to hear from you. Apply now to join our innovative team!
Location: Remote
Experience: 5+ years
Employment Type: Full-time
Shift: US Shift

Overview
We are seeking an experienced Apigee Integration Developer with strong expertise in designing, developing, and managing APIs and integration workflows using Google Cloud Apigee. The ideal candidate will have hands-on experience in API proxy creation, policy implementation, and connecting backend systems through secure and scalable integrations. You will collaborate closely with cross-functional teams to deliver seamless API solutions aligned with business and technical objectives.

Required Skills & Experience
• 5+ years of experience in API and integration development using Apigee Edge or Apigee X.
• Proficiency in designing and implementing API proxies, shared flows, and custom policies.
• Strong understanding of RESTful APIs, JSON, XML, and HTTP protocols.
• Expertise in API security standards such as OAuth 2.0, JWT, and API key management (a minimal client-side sketch follows this posting).
• Experience in API lifecycle management: design, development, testing, deployment, and monitoring.
• Hands-on knowledge of backend integrations using databases, microservices, and third-party APIs.
• Solid understanding of CI/CD pipelines, Git, and deployment automation for APIs.
• Strong debugging and performance optimization skills for high-traffic APIs.
• Excellent communication and collaboration skills for working with distributed teams across time zones.

Key Responsibilities
• Develop, deploy, and manage API proxies and integration flows using Apigee.
• Implement policies for security, transformation, traffic management, and mediation.
• Integrate APIs with backend services and ensure performance, scalability, and reliability.
• Configure analytics dashboards to monitor API usage, performance, and errors.
• Collaborate with architects, developers, and QA teams to ensure best practices in API design and governance.
• Participate in code reviews, design discussions, and continuous improvement initiatives.

Nice-to-Have
• Experience with Google Cloud Platform (GCP) and Apigee hybrid deployments.
• Exposure to other integration tools (MuleSoft, Dell Boomi, or Azure Logic Apps).
• Familiarity with containerization (Docker, Kubernetes) and microservices architecture.
• Knowledge of event-driven integrations using Pub/Sub, Kafka, or messaging queues.
• Apigee or API Management certification (Google Cloud or equivalent).

How to apply
Please submit your resume and a cover letter highlighting your relevant experience and skills to resourcing@bitsatom.com.
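To ground the OAuth 2.0 requirement above, here is a minimal consumer-side sketch of calling an API fronted by an Apigee proxy: fetch a token with the client-credentials grant, then call the proxied resource with a Bearer token. The hostnames, paths, and credentials are hypothetical, and the exact token endpoint depends on how the proxy's OAuth policy is configured.

```python
# Minimal client-side sketch of an OAuth 2.0 client-credentials call against a
# hypothetical Apigee-fronted API. Hosts, paths, and credentials are placeholders.
import requests

APIGEE_HOST = "https://api.example.com"         # hypothetical Apigee proxy host
CLIENT_ID = "my-app-key"                        # hypothetical Apigee app credentials
CLIENT_SECRET = "my-app-secret"

def get_access_token() -> str:
    resp = requests.post(
        f"{APIGEE_HOST}/oauth/token",           # hypothetical token proxy path
        data={"grant_type": "client_credentials"},
        auth=(CLIENT_ID, CLIENT_SECRET),
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def get_orders() -> dict:
    token = get_access_token()
    resp = requests.get(
        f"{APIGEE_HOST}/v1/orders",             # hypothetical API product path
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(get_orders())
```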
Type: Freelance (with potential for a full-time job opportunity)
Pay: Up to ₹60/hr
Location: Remote

About the Role:
We're looking for a creative and detail-oriented Video Editor & Motion Graphic Designer who can craft visually stunning, high-end videos with smooth transitions, dynamic animations, and strong storytelling impact. You will be responsible for editing videos, adding motion graphics, and designing visually appealing sequences that elevate brand communication across digital platforms.

Key Responsibilities:
• Edit and assemble raw footage into polished, engaging final videos.
• Design and integrate high-end motion graphics and visual effects to enhance storytelling.
• Add premium-quality transitions, text animations, and overlays.
• Collaborate with the creative and marketing teams to deliver brand-aligned content.
• Work efficiently to meet project deadlines while maintaining top-notch quality.

Requirements:
• Proven experience in video editing and motion graphics.
• Expertise in Adobe After Effects (mandatory).
• Strong skills in Premiere Pro, Photoshop, or similar Adobe tools.
• Understanding of pacing, composition, and cinematic aesthetics.
• Ability to create eye-catching transitions, typography animations, and dynamic visuals.
• A portfolio showcasing motion design and high-quality edited videos.

Preferred Skills (Bonus):
• Knowledge of color grading and sound design.
• Experience creating social media ads, reels, or brand films.
• Familiarity with 3D or VFX tools is a plus.

What We Offer:
• Freelance opportunity with flexible working hours.
• Chance to transition into a full-time creative role based on performance.
• Work on premium-level video projects with creative freedom.
• Collaborative and growth-oriented team culture.
Location: Gurugram
Employment Type: Full-Time Onsite
Experience Required: 6-8 Years

About the Role:
We are seeking a visionary and hands-on Senior AI Technical Lead to spearhead our Generative AI initiatives focusing on conversational bot development, prompt engineering, and scalable AI solutions. This role demands deep technical expertise, strategic thinking, and the ability to lead and mentor a high-performing team in a fast-paced, innovation-driven environment. You will be at the forefront of designing and deploying cutting-edge GenAI systems that redefine how users interact with intelligent agents.

Key Responsibilities:

AI & GenAI
• Design and implement scalable GenAI architectures for conversational bots and intelligent agents
• Define the workflows for agents and conversational bots to solve various business problems
• Architect solutions for multi-modal AI (text, image, voice) and retrieval-augmented generation (RAG); a minimal RAG prompt-assembly sketch follows this posting
• Integrate the bots with various channels, including WhatsApp, Instagram, websites, mobile, etc.

Prompt Engineering & Optimization
• Develop and refine advanced prompt strategies for LLMs to ensure high-quality, context-aware responses for bots and agents
• Build reusable prompt libraries and frameworks for internal use
• Evaluate prompt performance using metrics like coherence, relevance, and factuality

Architecture & Patterns
• Understand multilayered application architecture, microservices, APIs, and Kubernetes
• Understand integration with multiple channels and communication platforms, for example WhatsApp, Instagram, and Facebook
• Prior experience working on integration platforms, session management, data integration challenges, etc.
• Understand concepts around performance optimization across devices
• Understand the overall release process and support teams in building a seamless release process

Technical Leadership
• Lead a team of 10-12 engineers, data scientists, and designers through agile development cycles
• Drive the technical roadmap, sprint planning, and code reviews
• Foster a culture of innovation, experimentation, and continuous learning
• Drive delivery of applications, working collaboratively with various stakeholders

Training and Evaluation
• Work with various stakeholders to identify the data, versions, and ingestion of data into RAG
• Create automation pipelines to process data for ingestion into RAG databases
• Work on migration of data, validation, security, and encryption

Stakeholder Collaboration
• Work closely with product managers, UX designers, and business leaders to align AI capabilities with user needs
• Translate business requirements into technical specifications and deliverables

Required Skills & Qualifications:

Education
• Bachelor's or Master's degree in Computer Science, AI/ML, Data Science, or a related field

Technical Expertise
• 8–10 years of experience in AI/ML development, with at least 2–3 years in GenAI or LLM-based systems
• Proficiency in Python, LangChain, and Hugging Face Transformers
• Understanding of bot frameworks such as RASA, BotPress, or similar open- or closed-source frameworks
• Deep understanding of NLP, LLMs, embeddings, and vector databases
• Experience with cloud platforms (Azure) and containerization (Docker, Kubernetes)
• Strong understanding of programming languages such as Python

How to apply
Please submit your resume and a cover letter highlighting your relevant experience and skills to resourcing@bitsatom.com.
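To make the RAG and prompt-engineering responsibilities above concrete, here is a minimal, self-contained sketch of retrieval plus prompt assembly. The embed() function is a toy bag-of-words stand-in for a real embedding model (in practice an Azure OpenAI or Hugging Face encoder with a vector database would be used), and the documents and question are hypothetical.

```python
# Minimal RAG-style retrieval + prompt assembly sketch.
# embed() is a toy stand-in for a real dense embedding model.
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; replace with a real encoder in practice.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

DOCS = [  # hypothetical knowledge-base snippets
    "Orders placed before 2 pm ship the same business day.",
    "Refunds are processed within 5-7 business days of return receipt.",
    "Support is available on WhatsApp and Instagram from 9 am to 6 pm IST.",
]

def build_prompt(question: str, k: int = 2) -> str:
    # Rank documents by similarity to the question and keep the top k as context.
    q_vec = embed(question)
    ranked = sorted(DOCS, key=lambda d: cosine(q_vec, embed(d)), reverse=True)
    context = "\n".join(f"- {d}" for d in ranked[:k])
    return (
        "Answer the user using only the context below. "
        "If the answer is not in the context, say you don't know.\n\n"
        f"Context:\n{context}\n\nUser question: {question}\nAnswer:"
    )

if __name__ == "__main__":
    # The assembled prompt would be sent to an LLM; printing it keeps this runnable offline.
    print(build_prompt("How long do refunds take?"))
```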