Sagclay

Sagclay is a forward-thinking company specializing in sustainable clay products primarily for construction and art applications. The company focuses on innovation and eco-friendliness.

4 Job openings at Sagclay
Salesforce Developer

Hyderabad, Chennai, Bengaluru

5 - 10 years

INR 10.0 - 20.0 Lacs P.A.

Work from Office

Full Time

Salesforce Development & Customization: Design, develop, and implement custom solutions using Apex, Visualforce, and Lightning components. Customize Salesforce applications and Vlocity industry-specific features for clients. Write and maintain clean, efficient, and reusable code.
Vlocity Implementation: Implement Vlocity's data model, templates, and automation processes. Customize Vlocity OmniStudio components, including FlexCards, OmniScripts, and DataRaptors.
Integration & APIs: Integrate Salesforce with external systems using REST/SOAP APIs and middleware platforms.
Collaboration & Communication: Work closely with Salesforce Administrators, Business Analysts, and stakeholders to define requirements and deliver solutions. Conduct code reviews and ensure development best practices are followed.
Data Management: Ensure data integrity, manage data migrations, and build data models in Salesforce and Vlocity environments.
Testing & Deployment: Create and execute test plans, perform unit testing, and ensure solutions meet requirements. Deploy Salesforce and Vlocity configurations and customizations through the development lifecycle (e.g., from sandboxes to production).
Maintenance & Troubleshooting: Maintain and troubleshoot existing Salesforce and Vlocity applications. Perform root cause analysis and resolve issues in a timely manner.
Required Skills & Qualifications: Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field (or equivalent experience). Strong analytical and troubleshooting abilities, with a passion for delivering high-quality solutions.
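To give candidates a concrete picture of the integration work described above, here is a minimal sketch of an external Java client querying the Salesforce REST API with SOQL. The environment variable names, instance URL, and API version are illustrative assumptions; a real integration would obtain the access token through an OAuth flow and might route through middleware rather than calling the org directly.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SalesforceQueryExample {
    public static void main(String[] args) throws Exception {
        // Hypothetical configuration: in practice the instance URL and access token
        // come from an OAuth flow (e.g. JWT bearer), not from hard-coded values.
        String instanceUrl = System.getenv("SF_INSTANCE_URL"); // e.g. https://example.my.salesforce.com
        String accessToken = System.getenv("SF_ACCESS_TOKEN");

        // SOQL query against the standard Account object via the REST query endpoint.
        String soql = "SELECT Id, Name FROM Account LIMIT 5".replace(" ", "+");
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(instanceUrl + "/services/data/v59.0/query?q=" + soql))
                .header("Authorization", "Bearer " + accessToken)
                .header("Accept", "application/json")
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}
```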

Java Developer

Chennai

5 - 10 years

INR 10.0 - 20.0 Lacs P.A.

Work from Office

Full Time

Key Responsibilities: Design and develop microservices-based applications using Spring Boot. Collaborate with cross-functional teams to define and implement new features and enhancements. Ensure high performance and responsiveness of applications by writing clean, maintainable code. Implement and maintain RESTful APIs and integrate them with front-end services. Participate in architectural discussions and contribute to best practices for microservices design. Conduct code reviews and mentor junior developers to improve team performance. Monitor and troubleshoot application performance issues in production environments. Stay updated with the latest industry trends and technologies to drive continuous improvement.
Required Skills & Qualifications: Experience in Java development, with a strong focus on Spring Boot. Extensive experience in designing and implementing microservices architecture. Proficiency in RESTful web services and API design. Familiarity with containerization and orchestration technologies (e.g., Docker, Kubernetes). Experience with cloud platforms (e.g., AWS, Azure) is preferred. Strong understanding of relational databases (e.g., MySQL, Oracle) and NoSQL databases (e.g., MongoDB).
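As an illustration of the Spring Boot microservice and RESTful API work listed above, below is a minimal, self-contained sketch of a REST controller. The Product resource and in-memory store are hypothetical placeholders; a real service would use the team's actual domain model and a persistence layer such as a relational or NoSQL database.

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.*;

import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicLong;

// Minimal Spring Boot application exposing one RESTful resource.
@SpringBootApplication
public class CatalogServiceApplication {
    public static void main(String[] args) {
        SpringApplication.run(CatalogServiceApplication.class, args);
    }
}

// Hypothetical resource; stands in for whatever domain object the service owns.
record Product(Long id, String name, double price) {}

@RestController
@RequestMapping("/api/products")
class ProductController {
    // In-memory store used only to keep the sketch self-contained.
    private final Map<Long, Product> store = new ConcurrentHashMap<>();
    private final AtomicLong ids = new AtomicLong();

    @GetMapping
    public List<Product> list() {
        return List.copyOf(store.values());
    }

    @PostMapping
    public Product create(@RequestBody Product request) {
        long id = ids.incrementAndGet();
        Product saved = new Product(id, request.name(), request.price());
        store.put(id, saved);
        return saved;
    }
}
```

Packaged as a standard Spring Boot project with spring-boot-starter-web on the classpath, this exposes GET and POST endpoints at /api/products.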

Accounts Payable Professional

Chennai

1 - 6 years

INR 2.0 - 7.0 Lacs P.A.

Hybrid

Full Time

Key Responsibilities:
Invoice Processing: Review and process vendor invoices, ensuring accuracy and completeness of information. Verify invoices against purchase orders, contracts, and other supporting documents. Ensure timely payment of invoices, taking into account payment terms and discounts.
Vendor Management: Maintain accurate and up-to-date vendor information, including contact details and payment terms. Develop and maintain strong relationships with vendors, resolving any issues or discrepancies in a timely and professional manner.
Payment Processing: Prepare and process payment batches, ensuring accuracy and completeness of payment information. Ensure compliance with company policies and procedures, as well as relevant laws and regulations.
Reconciliations and Audits: Perform regular reconciliations of vendor statements and accounts payable ledgers. Assist with internal and external audits, providing documentation and information as required.
Reporting and Analysis: Prepare and analyze accounts payable reports, including aging reports and payment schedules. Identify and investigate discrepancies, making recommendations for process improvements.
Compliance and Risk Management: Ensure compliance with company policies, procedures, and relevant laws and regulations. Identify and mitigate potential risks, implementing controls and process improvements as needed.

Data Engineer

Chennai

5 - 10 years

INR 20.0 - 35.0 Lacs P.A.

Work from Office

Full Time

Development: Design, build, and maintain robust, scalable, and high-performance data pipelines to ingest, process, and store large volumes of structured and unstructured data. Utilize Apache Spark within Databricks to process big data efficiently, leveraging distributed computing to process large datasets in parallel. Integrate data from a variety of internal and external sources, including databases, APIs, cloud storage, and real-time streaming data.
Data Integration & Storage: Implement and maintain data lakes and warehouses, using technologies such as Databricks, Azure Synapse, Redshift, and BigQuery to store and retrieve data. Design and implement data models, schemas, and architecture for efficient querying and storage.
Data Transformation & Optimization: Leverage Databricks and Apache Spark to perform data transformations at scale, ensuring data is cleaned, transformed, and optimized for analytics. Write and optimize Spark SQL, PySpark, and Scala code to process large datasets in real-time and batch jobs. Work on ETL processes to extract, transform, and load data from various sources into cloud-based data environments.
Big Data Tools & Technologies: Utilize cloud-based big data platforms (e.g., AWS, Azure, Google Cloud) in conjunction with Databricks for distributed data processing and storage. Implement and maintain data pipelines using Apache Kafka, Apache Flink, and other data streaming technologies for real-time data processing.
Collaboration & Stakeholder Engagement: Work with data scientists, data analysts, and business stakeholders to define data requirements and deliver solutions that align with business objectives. Collaborate with cloud engineers, data architects, and other teams to ensure smooth integration and data flow between systems.
Monitoring & Automation: Build and implement monitoring solutions for data pipelines, ensuring consistent performance, identifying issues, and optimizing workflows. Automate data ingestion, transformation, and validation processes to reduce manual intervention and increase efficiency. Document data pipeline processes, architectures, and data models to ensure clarity and maintainability. Adhere to best practices in data engineering, software development, version control, and code review.
Required Skills & Qualifications:
Education: Bachelor's degree in Computer Science, Engineering, Data Science, or a related field (or equivalent experience).
Technical Skills: Strong hands-on experience with Apache Spark, specifically within Databricks (PySpark, Scala, Spark SQL). Experience working with cloud-based platforms such as AWS, Azure, or Google Cloud, particularly in the context of big data processing and storage. Proficiency in SQL and experience with cloud data warehouses (e.g., Redshift, BigQuery, Snowflake). Strong programming skills in Python, Scala, or Java.
Big Data & Cloud Technologies: Experience with distributed computing concepts and scalable data processing architectures. Familiarity with data lake architectures and frameworks (e.g., AWS S3, Azure Data Lake).
Data Engineering Concepts: Strong understanding of ETL processes, data modeling, and database design. Experience with batch and real-time data processing techniques. Familiarity with data quality, data governance, and privacy regulations.
Problem Solving & Analytical Skills: Strong troubleshooting skills for resolving issues in data pipelines and performance optimization. Ability to work with large, complex datasets and perform data wrangling and cleaning.
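As a sketch of the Spark-based batch transformation work described above, the snippet below uses Spark's Java API (Java being one of the languages the posting lists). The storage paths, field names, and aggregation are hypothetical, and on Databricks the SparkSession is provided by the runtime rather than built with a local master.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.*;

public class OrdersPipeline {
    public static void main(String[] args) {
        // Local session for illustration only; on Databricks the session already exists.
        SparkSession spark = SparkSession.builder()
                .appName("orders-pipeline")
                .master("local[*]")
                .getOrCreate();

        // Hypothetical source: raw order events landed as JSON in cloud storage.
        Dataset<Row> raw = spark.read().json("s3a://example-bucket/raw/orders/");

        // Clean and transform: drop rows without an id, normalize types, aggregate per day.
        Dataset<Row> daily = raw
                .filter(col("order_id").isNotNull())
                .withColumn("amount", col("amount").cast("double"))
                .withColumn("order_date", to_date(col("order_ts")))
                .groupBy(col("order_date"), col("customer_id"))
                .agg(sum("amount").alias("daily_spend"),
                     count("order_id").alias("order_count"));

        // Persist the curated result as partitioned Parquet for downstream analytics.
        daily.write().mode("overwrite").partitionBy("order_date")
                .parquet("s3a://example-bucket/curated/daily_orders/");

        spark.stop();
    }
}
```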

Sagclay | Manufacturing | Claytown | 50-100 Employees | 4 Jobs
