4 Job openings at Saradysol Tech Ventures Private Limited
Data Warehouse Analyst/Data Engineer

India

5 years

Not disclosed

Remote

Full Time

About the job

Job Title: Data Engineer (Power BI & MS BI Stack - SSRS, SSAS, SSMS, SSIS)
Experience Level: 5+ Years (Immediate Joiners Preferred)
Location: Bengaluru
Employment Type: Full-Time, Remote

Job Description:
We are seeking a highly skilled and experienced Data Warehouse Analyst / Data Engineer to support the end-to-end lifecycle of enterprise data platforms, including data warehouses, data lakes, and data marts, ensuring high performance, data integrity, and security. The ideal candidate will have deep expertise in data modeling, ETL processes, metadata management, and business intelligence systems, along with the ability to deliver robust data presentation layers and actionable insights. Experience with Databricks is a strong plus.

Key Responsibilities:
- Design, develop, and maintain enterprise-wide data warehouse platforms, ensuring optimal performance, security, and scalability.
- Develop and optimize ETL processes using MS SQL Server Integration Services (SSIS), T-SQL, and other tools.
- Build and maintain data models, metadata catalogs, and ER diagrams to support analytical needs.
- Proactively monitor system performance and data integrity; conduct preventive and corrective maintenance.
- Work with cross-functional teams to interpret data and create meaningful business intelligence reports using tools such as Power BI.
- Leverage Python, PySpark, and JSON for advanced data processing, transformation, and integration (see the sketch after this posting).
- Create and manage data repositories and presentation layers aggregating information across diverse source systems.
- Contribute to the development and optimization of cloud-based data platforms.
- Implement and promote best practices in data architecture, data security, and governance.
- Work with Databricks to support scalable data processing pipelines and advanced analytics workflows (preferred experience).

Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in data warehouse development, ETL, and BI systems.
- Strong background in data modeling, data architecture, and metadata management.
- Proficient in SQL, ETL tools, and database performance tuning.
- Hands-on experience with the MS BI Stack: SSIS, SSRS, SSAS, and SSMS.
- Proficiency in analytics tools such as Power BI.
- Experience with cloud data platforms (Azure, AWS, or GCP) is preferred.
- Experience with Databricks is a strong plus.
- Proficiency in Python, PySpark, and working with structured/unstructured data.

Skills and Competencies:
- Excellent problem-solving, debugging, and optimization skills.
- Strong collaboration and communication skills to engage with business and technical teams.
- Ability to synthesize complex data into meaningful insights and visualizations.
- Familiarity with data governance, compliance, and security standards.

Why Join Us?
- Opportunity to work with cutting-edge technology and tools.
- A collaborative and inclusive work culture.
- Opportunities for professional growth and certification sponsorship.
- Flexible working hours and hybrid work options.

How to Apply:
If you’re ready to take the next step in your career, send your updated resume to careers@saradysol.com with the subject line “Data Analytics Engineer/Data Analyst Role”. 💼 Alternatively, use LinkedIn’s Easy Apply feature to connect with us effortlessly! Be part of a team that leverages technology to achieve excellence. Let’s shape the future together! 🚀
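For candidates wondering what the Python/PySpark and JSON work referenced above might look like in practice, here is a minimal, purely illustrative sketch added by the editor. The file path, column names, and the target table name (sales_orders_curated) are hypothetical and not taken from the posting.

```python
# Illustrative sketch only: a small PySpark job that cleans semi-structured
# JSON and loads it into a curated table for reporting. Paths and names are
# hypothetical; saveAsTable assumes a metastore-backed environment.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Ingest raw JSON exported from a source system.
raw = spark.read.json("/landing/orders/2024-06-01/*.json")

# Basic typing, deduplication, and filtering before the presentation layer.
curated = (
    raw.withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .dropDuplicates(["order_id"])
       .filter(F.col("order_id").isNotNull())
)

# Write to a curated table that downstream Power BI reports can query.
curated.write.mode("overwrite").saveAsTable("sales_orders_curated")
```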

Senior DevOps Engineer / AWS Solutions Architect

India

10 years

Not disclosed

Remote

Full Time

Job Title: Senior DevOps Engineer / AWS Solutions Architect
Location: Remote
Job Type: Full-Time / Contract
Experience: 10+ years in AWS Cloud Services, DevOps, Cloud Strategy, Security, and Networking
Education: Bachelor’s degree in Computer Science, Engineering, or a related field preferred (or equivalent professional experience)
Certifications: AWS Certified Solutions Architect – Professional (or equivalent DevOps certification)

Job Overview
We are seeking an experienced Senior DevOps Engineer or AWS Solutions Architect to lead the development, optimization, and implementation of cloud-based solutions on AWS. The ideal candidate will possess over a decade of expertise in cloud architecture, security, networking, and DevOps, with a focus on driving organizational cloud strategies. This role is critical in guiding our teams toward adopting scalable, secure, and efficient solutions, and will serve as the key advisor on AWS best practices.

Key Responsibilities:

Cloud Architecture & Strategy
- Design, develop, and implement sophisticated AWS cloud solutions aligned with business objectives.
- Establish and drive the cloud roadmap in line with industry best practices and organizational strategy.

DevOps Automation & Optimization
- Architect and refine CI/CD pipelines, automation workflows, and Infrastructure as Code (IaC) solutions using tools such as Terraform, AWS CloudFormation, and Ansible.
- Implement continuous integration, deployment, and monitoring frameworks to ensure high availability and robust security.

Security & Networking
- Ensure cloud solutions comply with the highest security standards, covering IAM policies, VPC configurations, data encryption, and regulatory compliance (see the sketch after this posting).
- Design and manage network architectures optimized for scalability, performance, and security within AWS environments.

Collaboration & Mentorship
- Collaborate with cross-functional teams to guide cloud adoption strategies and integration.
- Serve as a subject matter expert on AWS best practices, providing mentorship to team members on cloud and DevOps methodologies.

Innovation & Continuous Improvement
- Stay abreast of industry trends, emerging technologies, and AWS service updates to drive innovation and optimization.
- Advocate for best practices in security, cloud architecture, and operational efficiency.

Required Qualifications
Experience:
- 10+ years of hands-on experience in AWS cloud architecture, DevOps, security, and networking.
- Proven track record in designing scalable, secure, and high-performance cloud solutions.
Technical Expertise:
- AWS Services: In-depth knowledge of core AWS services, including EC2, S3, Lambda, VPC, RDS, CloudWatch, and IAM.
- DevOps Tools: Strong proficiency in CI/CD tools (e.g., Jenkins, GitLab CI/CD), containerization technologies (Docker, Kubernetes), and IaC tools such as Terraform or AWS CloudFormation.
- Security & Networking: Solid understanding of cloud security principles, networking concepts, and compliance requirements.
Certifications:
- AWS Certified Solutions Architect – Professional or equivalent DevOps certification.

Preferred Qualifications
- Experience with multi-cloud environments and cross-platform integrations.
- Strong leadership, coaching, and mentorship skills with the ability to influence teams effectively.
- Knowledge of serverless architectures, microservices, and modern cloud-native design patterns.

What We Offer
- Competitive Compensation: Attractive salary and benefits package.
- Growth Opportunities: Continuous learning and professional development.
- Collaborative Environment: A chance to make a significant impact on technical delivery and product success.

How to Apply
Are you passionate about AWS, DevOps, and driving cloud innovation? If so, we’d love to hear from you! Please send your resume along with a cover letter outlining your relevant experience to: 📧 careers@saradysol.com. We look forward to exploring the possibility of having you join our team!
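As a hedged illustration of the security-review work mentioned under Security & Networking, the editor-added sketch below uses the AWS SDK for Python (boto3) to flag security groups that allow ingress from anywhere. It is not part of the role's actual tooling; it assumes AWS credentials and a default region are already configured in the environment.

```python
# Illustrative sketch only: list security groups with ingress open to 0.0.0.0/0.
# Assumes AWS credentials/region are configured (env vars or an AWS profile).
import boto3

ec2 = boto3.client("ec2")
paginator = ec2.get_paginator("describe_security_groups")

for page in paginator.paginate():
    for sg in page["SecurityGroups"]:
        for perm in sg.get("IpPermissions", []):
            open_to_world = any(
                r.get("CidrIp") == "0.0.0.0/0" for r in perm.get("IpRanges", [])
            )
            if open_to_world:
                # FromPort/ToPort are absent for "all traffic" (-1) rules.
                print(
                    f"{sg['GroupId']} ({sg['GroupName']}): "
                    f"{perm.get('IpProtocol')} "
                    f"{perm.get('FromPort', 'all')}-{perm.get('ToPort', 'all')} "
                    "open to 0.0.0.0/0"
                )
```

Flagging world-open ingress is only one small slice of the IAM, VPC, and compliance work the posting describes, but it shows the flavour of automated checks a role like this tends to own.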

Data Warehouse Analyst/Data Engineer

India

7 years

Not disclosed

Remote

Full Time

About the job

Job Title: Data Engineer (Power BI & MS BI Stack - SSRS, SSAS, SSMS, SSIS)
Experience Level: 7+ Years (Immediate Joiners Preferred)
Location: Bengaluru
Employment Type: Full-Time, Remote

Job Description:
We are seeking a highly skilled and experienced Data Warehouse Analyst / Data Engineer to support the end-to-end lifecycle of enterprise data platforms, including data warehouses, data lakes, and data marts, ensuring high performance, data integrity, and security. The ideal candidate will have deep expertise in data modeling, ETL processes, metadata management, and business intelligence systems, along with the ability to deliver robust data presentation layers and actionable insights. Experience with Databricks is a strong plus.

Key Responsibilities:
- Design, develop, and maintain enterprise-wide data warehouse platforms, ensuring optimal performance, security, and scalability.
- Develop and optimize ETL processes using MS SQL Server Integration Services (SSIS), T-SQL, and other tools (see the sketch after this posting).
- Build and maintain data models, metadata catalogs, and ER diagrams to support analytical needs.
- Proactively monitor system performance and data integrity; conduct preventive and corrective maintenance.
- Work with cross-functional teams to interpret data and create meaningful business intelligence reports using tools such as Power BI.
- Leverage Python, PySpark, and JSON for advanced data processing, transformation, and integration.
- Create and manage data repositories and presentation layers aggregating information across diverse source systems.
- Contribute to the development and optimization of cloud-based data platforms.
- Implement and promote best practices in data architecture, data security, and governance.
- Work with Databricks to support scalable data processing pipelines and advanced analytics workflows (preferred experience).

Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in data warehouse development, ETL, and BI systems.
- Strong background in data modeling, data architecture, and metadata management.
- Proficient in SQL, ETL tools, and database performance tuning.
- Hands-on experience with the MS BI Stack: SSIS, SSRS, SSAS, and SSMS.
- Proficiency in analytics tools such as Power BI.
- Experience with cloud data platforms (Azure, AWS, or GCP) is preferred.
- Experience with Databricks is a strong plus.
- Proficiency in Python, PySpark, and working with structured/unstructured data.

Skills and Competencies:
- Excellent problem-solving, debugging, and optimization skills.
- Strong collaboration and communication skills to engage with business and technical teams.
- Ability to synthesize complex data into meaningful insights and visualizations.
- Familiarity with data governance, compliance, and security standards.

Why Join Us?
- Opportunity to work with cutting-edge technology and tools.
- A collaborative and inclusive work culture.
- Opportunities for professional growth and certification sponsorship.
- Flexible working hours and hybrid work options.

How to Apply:
If you’re ready to take the next step in your career, send your updated resume to careers@saradysol.com with the subject line “Data Analytics Engineer/Data Analyst Role”. 💼 Alternatively, use LinkedIn’s Easy Apply feature to connect with us effortlessly! Be part of a team that leverages technology to achieve excellence. Let’s shape the future together! 🚀
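Since this role also leans on the MS BI Stack and T-SQL, here is a second editor-added, purely illustrative sketch showing a warehouse upsert driven from Python via pyodbc. The server, database, and table names (dw-sql01, EnterpriseDW, dbo.DimCustomer, stg.Customer) are hypothetical; in a production setting this logic would more likely live in an SSIS package or a stored procedure.

```python
# Illustrative sketch only: upsert staged rows into a warehouse dimension
# with a T-SQL MERGE. Connection details and table names are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=dw-sql01;DATABASE=EnterpriseDW;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# Merge the latest staged customer rows into the reporting dimension table.
cursor.execute("""
    MERGE dbo.DimCustomer AS tgt
    USING stg.Customer AS src
        ON tgt.CustomerKey = src.CustomerKey
    WHEN MATCHED THEN
        UPDATE SET tgt.CustomerName = src.CustomerName,
                   tgt.Region       = src.Region
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (CustomerKey, CustomerName, Region)
        VALUES (src.CustomerKey, src.CustomerName, src.Region);
""")
conn.commit()
conn.close()
```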

Lead Data Engineer (Databricks Expert – with MS BI Stack)

India

10 years

Not disclosed

Remote

Full Time

Job description

Job Title: Lead Data Engineer (Databricks Expert – with MS BI Stack Preferred)
Experience Level: 10+ Years (with 5+ Years in Databricks) – Immediate Joiners Preferred
Location: Bengaluru (Remote)
Employment Type: Full-Time – Remote

Job Description:
We are seeking a Lead Data Engineer with deep hands-on expertise in Databricks to lead the development of scalable data processing pipelines, real-time analytics workflows, and enterprise-level data lakehouses. The ideal candidate will have end-to-end experience in building and optimizing complex data systems on the Databricks platform, and can independently lead technical initiatives, architect solutions, and mentor data teams. While Databricks expertise is mandatory, experience with the Microsoft BI Stack (SSIS, SSRS, SSAS, SSMS) will be considered a strong advantage.

Key Responsibilities:
- Lead the design and implementation of scalable, high-performance data pipelines using Databricks (Delta Lake, PySpark, SQL, MLflow); see the sketch after this posting.
- Define and drive data architecture, modeling, and governance strategies.
- Build and optimize ETL/ELT workflows and automate data transformation processes.
- Collaborate with analysts and data scientists to support advanced analytics and ML model integration.
- Ensure cost-effective, reliable, and high-performing data systems in a cloud-native environment.
- Translate business requirements into technical solutions with reusable, modular designs.
- Set and enforce best practices in code quality, CI/CD, testing, and observability for Databricks pipelines.
- Work with the MS BI Stack (SSIS, SSRS, SSAS) to support enterprise reporting systems.

Required Qualifications:
- 10+ years of overall experience in data engineering, with 5+ years of strong hands-on Databricks experience.
- Proven expertise in PySpark, Databricks SQL, and Delta Lake.
- Deep understanding of data lakehouses, distributed systems, and data warehousing.
- Strong experience with cloud platforms, preferably Azure.
- Proficient in Python and processing large structured/unstructured datasets.
- Track record of leading end-to-end Databricks projects, from ingestion to analytics.
- Strong experience with CI/CD, Git workflows, job orchestration, and monitoring.
- Exceptional problem-solving and performance optimization capabilities.
- Experience with the Microsoft BI Stack (SSIS, SSRS, SSAS, SSMS).
- Familiarity with Power BI or similar data visualization tools.
- Awareness of data security, compliance, and governance frameworks.
- Exposure to Agile/Scrum practices and cross-functional team collaboration.

Why Join Us?
- Opportunity to lead high-impact data initiatives using cutting-edge platforms like Databricks.
- Innovative and fast-paced culture with a focus on learning and growth.
- Access to certifications, learning resources, and mentoring opportunities.
- Remote work flexibility with supportive and transparent leadership.

How to Apply:
If you're a Databricks expert ready to take the lead in driving data engineering excellence, send your resume to 📩 careers@saradysol.com with the subject line “Lead Data Engineer – Databricks (Remote)”. You can also apply via LinkedIn’s Easy Apply feature. Let’s build the future of data together! 🚀
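To ground the Databricks responsibilities above, here is a minimal, editor-added illustrative sketch of the Delta Lake merge (upsert) pattern in PySpark. The storage path and table names (/mnt/bronze/orders/latest/ and silver.orders) are hypothetical, and on Databricks the SparkSession is normally provided for you.

```python
# Illustrative sketch only: merge a newly landed batch into a curated Delta
# table keyed on order_id. Paths/table names are hypothetical, and the target
# Delta table is assumed to already exist.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Read the latest batch of raw records landed in cloud storage.
updates = spark.read.format("json").load("/mnt/bronze/orders/latest/")

# Upsert the batch into the curated table.
target = DeltaTable.forName(spark, "silver.orders")
(
    target.alias("t")
    .merge(updates.alias("s"), "t.order_id = s.order_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```

In practice a lead engineer would wrap this pattern in tested, parameterized jobs with orchestration and monitoring, which is the CI/CD and observability work the posting calls out.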


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Job Titles Overview