Bengaluru, Karnataka, India
Not disclosed
On-site
Full Time
SAP Warehouse Management & Logistics Functional Consultant with 6 to 10 years of experience.

Required Skills:
- SAP functional expertise in MM/SD and Logistics (WMS/TMS/HUM)
- Good exposure to warehouse management and logistics configurations
- Proficiency in interpreting customer requirements and system issues and providing solutions
- Strong problem-solving skills and experience engaging vendors/partners to provide business continuity
- Proficiency in written and verbal English communication

Preferred Skills:
- Extensive work on handling unit management with RF process functions such as goods receipt, goods issue, scrapping, physical inventory, and stock type changes
- Experience interfacing robotic logistics solution tools with SAP
- Experience with bar code labels and 2D, 3D, and QR scan codes (good to have)
- Experience with smart forms, authoring functional specifications, and handling enhancements
- Good knowledge of batch management functions
- Knowledge of MM and other interacting SAP modules (good to have)
- Knowledge of data migration tools such as LSMW, LTMC, and mass upload programs
Noida, Uttar Pradesh, India
Not disclosed
On-site
Full Time
Hello,

Job Title: QA Automation Engineer
Location: Bangalore (hybrid)
Job Type: Full-time
Department: Quality Assurance / Engineering

About the Role:
We are seeking a skilled QA Automation Engineer to join our growing team. In this role, you will be responsible for designing, developing, and executing automated tests to ensure product quality across web services and UI applications. You will work closely with developers, product managers, and DevOps engineers to build and maintain a robust and scalable testing framework.

Key Responsibilities:
- Design and implement automated test cases using Java, Selenium, and Rest Assured.
- Develop and maintain API test scripts using Postman or Rest Assured for comprehensive coverage.
- Integrate automated tests into CI/CD pipelines (e.g., Jenkins, GitLab CI, CircleCI).
- Participate in sprint planning, requirements reviews, and design discussions.
- Analyze test results, identify bugs, and work with developers to resolve issues.
- Collaborate with cross-functional teams to define test strategies and ensure quality throughout the development lifecycle.
- Maintain documentation for test cases, scripts, and test plans.
- Continuously enhance the test automation framework for reliability and performance.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3+ years of experience in QA automation or software testing.
- Strong experience with Java and automation frameworks (TestNG, JUnit).
- Proficiency with Selenium WebDriver for UI automation.
- Hands-on experience with Rest Assured for REST API testing.
- Experience using Postman for API validation and test collections.
- Familiarity with integrating tests into CI/CD tools (e.g., Jenkins, GitHub Actions, GitLab).
- Good understanding of the software development lifecycle and Agile methodologies.
- Excellent analytical and problem-solving skills.
- Strong written and verbal communication skills.
Thanks & Regards,
Siddharth Gupta
Talent Acquisition Specialist
Mobile: +919220295723
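The Rest Assured and Postman work this role describes comes down to asserting on API responses: status codes, well-formed bodies, required fields. As a hypothetical sketch of that assertion step (the posting names Java and Rest Assured; this stand-in uses only the Python standard library, and the field names are invented):

```python
# Illustrative sketch of API-response validation, the core of an automated
# API test. Real test suites for this role would express the same checks
# with Rest Assured matchers in Java.
import json

def validate_response(status_code: int, body: str, required_fields: list[str]) -> list[str]:
    """Return a list of validation errors for an API response (empty list = pass)."""
    errors = []
    if status_code != 200:
        errors.append(f"expected status 200, got {status_code}")
    try:
        payload = json.loads(body)
    except json.JSONDecodeError:
        # A non-JSON body makes field checks meaningless, so stop here.
        return errors + ["body is not valid JSON"]
    for field in required_fields:
        if field not in payload:
            errors.append(f"missing required field: {field}")
    return errors
```

Collecting all errors rather than failing on the first one mirrors how soft assertions are used in UI/API suites: one run reports every violation at once.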
Pune, Maharashtra, India
Not disclosed
On-site
Full Time
Job Title: Data Engineer, EDM IV
Location: Bangalore (Hybrid) / Pan India
Shift: General Shift
Experience: 6 - 8 years of relevant experience

We are looking for an experienced Senior Data Engineer to join our Marketing Data Engineering team. This hybrid position in Bengaluru reports to the Manager, Data Engineering. The Data Engineer will expand and optimize our data and data pipeline architecture and optimize data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support Software Developers, Data Quality Engineers, Data Analysts, and Data Scientists on data initiatives and will ensure an optimal data delivery architecture is consistent throughout ongoing projects.

Responsibilities
- Create and maintain optimal data pipeline architecture.
- Assemble complex data sets that meet functional and non-functional requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, DBT, and AWS 'big data' technologies.
- Build analytics tools that use the data pipeline to provide actionable insights into employee experience, operational efficiency, and other key business performance metrics.
- Work with stakeholders to assist with data-related technical issues and support associated data infrastructure needs.
- Build processes supporting data transformation, data structures, metadata, dependency management, and workload management.
- Keep up to date with the latest feature sets and capabilities from public cloud providers (such as AWS and Azure) and find ways to apply them to help the team.
- Work with data scientists and analysts to strive for greater functionality in our data systems.

Minimum Qualifications
- 5+ years of experience in a Data Engineer role and a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field.
- 5+ years of hands-on experience with Snowflake.
- 5+ years of working in DBT, with knowledge of advanced DBT concepts such as macros and Jinja templating.
- Advanced working SQL experience with relational databases, query authoring, and familiarity with a variety of databases.
- Experience with scripting languages such as Python.
- Experience with big data tools such as PySpark.
- Experience with AWS cloud services commonly used for data engineering, including S3, EC2, Glue, Lambda, RDS, or Redshift.
- Experience working with APIs to pull and push data.
- Experience optimizing 'big data' pipelines, architectures, and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.

Preferred Qualifications
- Experience working with AWS CloudFormation templates.
- Familiarity with Agile and Scrum methodologies.
- Experience developing dashboards with PowerBI.
- Analytical skills for working with unstructured datasets.
- A successful history of extracting value from large, disconnected datasets.
- Experience working with agile, globally distributed teams.
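The extract-transform-load work described above can be sketched in miniature. This hypothetical example uses the standard library's sqlite3 as a stand-in warehouse (the posting names Snowflake, DBT, and AWS; the table and column names here are invented for illustration):

```python
# Minimal ELT sketch: land raw events, then materialize an aggregated
# analytics table from a SELECT, much as a DBT model would in Snowflake.
import sqlite3

def run_pipeline(conn: sqlite3.Connection) -> None:
    cur = conn.cursor()
    # Extract/load: raw events as delivered by an upstream source.
    cur.execute("CREATE TABLE raw_events (user_id TEXT, action TEXT, amount REAL)")
    cur.executemany(
        "INSERT INTO raw_events VALUES (?, ?, ?)",
        [("u1", "purchase", 10.0), ("u1", "purchase", 5.0), ("u2", "view", 0.0)],
    )
    # Transform: aggregate into an analytics-ready model.
    cur.execute("""
        CREATE TABLE purchases_by_user AS
        SELECT user_id, SUM(amount) AS total_spend, COUNT(*) AS n_purchases
        FROM raw_events
        WHERE action = 'purchase'
        GROUP BY user_id
    """)
    conn.commit()

conn = sqlite3.connect(":memory:")
run_pipeline(conn)
rows = conn.execute("SELECT * FROM purchases_by_user ORDER BY user_id").fetchall()
```

Keeping the transform as a single declarative SELECT is the design choice DBT formalizes: models are queries, and the tool handles materialization and dependency order.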
Noida, Uttar Pradesh
Not disclosed
On-site
Full Time
You will be a full-time hybrid SAP BPC Embedded Consultant based in Noida, with flexibility for some work from home. Your main responsibilities will include configuring, developing, and providing day-to-day support for SAP BPC solutions. This will involve designing and implementing planning and consolidation models, troubleshooting issues, collaborating with business stakeholders to gather requirements, and ensuring the integrity of the system. You will also be expected to train users and produce documentation for the implemented solutions.

To excel in this role, you should possess SAP BPC Embedded implementation and configuration skills, as well as experience with planning and consolidation models in SAP BPC. You should be proficient in troubleshooting and resolving SAP BPC-related issues, have a solid understanding of business processes and requirements gathering, and demonstrate excellent communication and collaboration skills. Knowledge of SAP BW, SAP HANA, or other related technologies will be advantageous. The role requires the ability to work independently in a hybrid work environment. A Bachelor's degree in Computer Science, Information Technology, or a related field is a prerequisite, and experience with multiple full-lifecycle SAP BPC implementations would be a valuable asset.
Noida, Uttar Pradesh
Not disclosed
On-site
Full Time
You should have extensive experience in UI system design and related concepts, along with skills in architecting applications and developing reusable components. Your expertise in Node.js should be strong, and you should be proficient in JavaScript (ES6+), TypeScript, React, Next.js, HTML, and CSS. It is important to be familiar with commonly used React libraries such as Redux, React Hooks, functional components, Axios, and React Spring. A solid understanding of RESTful APIs is necessary, as well as knowledge of modern authentication methods including JSON Web Tokens (JWT), OAuth, and Basic Authentication. You should be focused on optimizing performance for both new and existing applications. Experience with Docker and ECS/AWS is preferred for this role.
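Of the authentication methods the role lists, JWT is the most mechanical, and its structure can be shown from scratch. A minimal sketch of HS256 signing and verification, using only the Python standard library (production code would use a vetted JWT library, and the secret here is a placeholder):

```python
# Minimal illustration of JWT structure: header.payload.signature, each
# part base64url-encoded, with an HMAC-SHA256 signature over the first two.
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWTs use unpadded base64url encoding.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: bytes) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify_jwt(token: str, secret: bytes):
    header, body, sig = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    # Constant-time comparison avoids timing side channels.
    if not hmac.compare_digest(sig, expected):
        return None  # signature mismatch: reject the token
    padded = body + "=" * (-len(body) % 4)  # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(padded))
```

The key point for a frontend engineer: the payload is readable by anyone (it is encoded, not encrypted), so the signature is the only thing a server can trust.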
Delhi, India
Not disclosed
On-site
Full Time
Company Description
Quarks Technosoft is a leading digital engineering and enterprise modernization company powered by AI. We collaborate with leading innovators across various sectors, including Finance, Banking, Education, Telecom, E-commerce, and Media. Our diverse clientele comprises top-tier organizations with extensive global reach. Quarks offers bespoke solutions that drive cost efficiency and enable the integration of next-generation technologies. With a strong commitment to client success, we have achieved remarkable growth, maintaining high standards of quality and security. Our culture promotes innovation, diversity, and continuous learning, making Quarks a Great Place to Work with a high employee satisfaction rating.

Role Description
This is a full-time on-site role for a Salesforce CPQ Lead Developer based in Delhi, India. The Salesforce CPQ Lead Developer will be responsible for designing, developing, and implementing customized solutions within the Salesforce CPQ platform. Daily tasks include analyzing business requirements, creating technical specifications, coding, testing, and deploying solutions. The role also involves providing ongoing support and maintenance, troubleshooting issues, and ensuring the platform's performance and reliability. Collaboration with cross-functional teams to optimize Salesforce configurations and improve processes will also be a key part of the role.
Qualifications
- Proficiency in Salesforce CPQ development and customization
- Strong understanding of the Salesforce platform, Apex, Visualforce, and Lightning components
- Experience configuring product bundles, pricing rules, and approval processes
- Ability to analyze business requirements and translate them into technical specifications
- Excellent problem-solving and troubleshooting skills
- Effective communication and collaboration skills
- Salesforce CPQ Certification is a plus
- At least 6 years of experience in Salesforce CPQ development
- Bachelor's degree in Computer Science or a related field
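The pricing rules mentioned among the qualifications encode logic like tiered volume discounts on a bundle. A hypothetical illustration of that kind of rule, written in plain Python (in Salesforce CPQ this would be declarative configuration or Apex; the tiers and rates here are invented):

```python
# Sketch of a tiered volume-discount pricing rule: larger quantities
# earn a larger percentage off the extended price.
def bundle_price(unit_price: float, quantity: int) -> float:
    """Apply a tiered volume discount and return the extended price."""
    if quantity >= 100:
        discount = 0.20  # 20% off at 100+ units
    elif quantity >= 50:
        discount = 0.10  # 10% off at 50-99 units
    else:
        discount = 0.0
    return round(unit_price * quantity * (1 - discount), 2)
```

The point of CPQ tooling is to lift exactly this logic out of code and into admin-maintainable rules, so sales ops can change tiers without a deployment.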
Greater Kolkata Area
Not disclosed
On-site
Full Time
Job Title: Data Scientist
Experience: 6 to 10 years
Location: Noida, Bangalore, Pune
Employment Type: Full-time

Job Summary
We are seeking a highly skilled and experienced Data Scientist with a strong background in Natural Language Processing (NLP), Generative AI, and Large Language Models (LLMs). The ideal candidate is proficient in Python and has hands-on experience with both Google Cloud Platform (GCP) and Amazon Web Services (AWS). You will play a key role in designing, developing, and deploying AI-driven solutions to solve complex business problems.

Key Responsibilities
- Design and implement NLP and Generative AI models for use cases such as chatbots, text summarization, question answering, and information extraction.
- Fine-tune and deploy Large Language Models (LLMs) using frameworks such as Hugging Face Transformers or LangChain.
- Conduct experiments, evaluate model performance, and implement improvements for production-scale solutions.
- Collaborate with cross-functional teams including product managers, data engineers, and ML engineers.
- Deploy and manage ML models on cloud platforms (GCP and AWS), using services such as Vertex AI, SageMaker, Lambda, and Cloud Functions.
- Build and maintain ML pipelines for training, validation, and deployment using CI/CD practices.
- Communicate complex technical findings clearly and concisely to both technical and non-technical stakeholders.

Required Skills
- Strong proficiency in Python and common data science/ML libraries (NumPy, Pandas, Scikit-learn, TensorFlow, PyTorch).
- Proven experience with Natural Language Processing (NLP) techniques (NER, sentiment analysis, embeddings, topic modeling, etc.).
- Hands-on experience with Generative AI and LLMs (e.g., GPT, BERT, T5, LLaMA, Claude, Gemini).
- Experience with LLMOps, prompt engineering, and fine-tuning pre-trained language models.
- Experience with GCP (BigQuery, Vertex AI, Cloud Functions, etc.) and/or AWS (SageMaker, S3, Lambda, etc.).
- Familiarity with containerization (Docker), orchestration (Kubernetes), and model deployment best practices.
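The embeddings mentioned among the required NLP skills rest on a simple idea: represent texts as vectors and compare them with cosine similarity. A from-scratch sketch using term-frequency vectors and only the standard library (real systems would use learned embeddings from models like those named above):

```python
# Bag-of-words cosine similarity: the simplest instance of the
# vector-similarity idea that learned embeddings generalize.
import math
from collections import Counter

def cosine_similarity(text_a: str, text_b: str) -> float:
    # Represent each text as a term-frequency vector over its tokens.
    va, vb = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    shared = set(va) & set(vb)
    dot = sum(va[t] * vb[t] for t in shared)
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    # Identical texts score 1.0; texts with no shared tokens score 0.0.
    return dot / norm if norm else 0.0
```

The limitation that motivates learned embeddings is visible here: "car" and "automobile" share no tokens, so this scores them 0.0, whereas dense embeddings place them close together.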