About the job

Job Title: Data Engineer (Power BI & MS BI Stack - SSRS, SSAS, SSMS, SSIS)
Experience Level: 5+ Years (Immediate Joiners are Preferred)
Location: Bengaluru
Employment Type: Full-Time (Remote)

Job Description:
We are seeking a highly skilled and experienced Data Warehouse Analyst / Data Engineer to support the end-to-end lifecycle of enterprise data platforms, including data warehouses, data lakes, and data marts, ensuring high performance, data integrity, and security. The ideal candidate will have deep expertise in data modeling, ETL processes, metadata management, and business intelligence systems, along with the ability to deliver robust data presentation layers and actionable insights. Experience with Databricks is a strong plus.

Key Responsibilities:
• Design, develop, and maintain enterprise-wide data warehouse platforms, ensuring optimal performance, security, and scalability.
• Develop and optimize ETL processes using MS SQL Server Integration Services (SSIS), T-SQL, and other tools.
• Build and maintain data models, metadata catalogs, and ER diagrams to support analytical needs.
• Proactively monitor system performance and data integrity; conduct preventive and corrective maintenance.
• Work with cross-functional teams to interpret data and create meaningful business intelligence reports using tools such as Power BI.
• Leverage Python, PySpark, and JSON for advanced data processing, transformation, and integration (see the illustrative sketch after this posting).
• Create and manage data repositories and presentation layers aggregating information across diverse source systems.
• Contribute to the development and optimization of cloud-based data platforms.
• Implement and promote best practices in data architecture, data security, and governance.
• Work with Databricks to support scalable data processing pipelines and advanced analytics workflows (preferred experience).

Qualifications:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• 5+ years of experience in data warehouse development, ETL, and BI systems.
• Strong background in data modeling, data architecture, and metadata management.
• Proficient in SQL, ETL tools, and database performance tuning.
• Hands-on experience with the MS BI Stack: SSIS, SSRS, SSAS, and SSMS.
• Proficiency in analytics tools such as Power BI.
• Experience with cloud data platforms (Azure, AWS, or GCP) is preferred.
• Experience with Databricks is a strong plus.
• Proficiency in Python, PySpark, and working with structured/unstructured data.

Skills and Competencies:
• Excellent problem-solving, debugging, and optimization skills.
• Strong collaboration and communication skills to engage with business and technical teams.
• Ability to synthesize complex data into meaningful insights and visualizations.
• Familiarity with data governance, compliance, and security standards.

Why Join Us?
• Opportunity to work with cutting-edge technology and tools.
• A collaborative and inclusive work culture.
• Opportunities for professional growth and certification sponsorship.
• Flexible working hours and hybrid work options.

How to Apply:
If you're ready to take the next step in your career, send your updated resume to careers@saradysol.com with the subject line "Data Analytics Engineer/Data Analyst Role". 💼 Alternatively, use LinkedIn's Easy Apply feature to connect with us effortlessly!

Be part of a team that leverages technology to achieve excellence. Let's shape the future together! 🚀
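The Python/PySpark/JSON responsibility above is the most code-centric part of this role. Below is a minimal, illustrative sketch of that kind of task; the source path, column names, and output location are hypothetical and would differ in any real pipeline.

Example (Python, illustrative only):

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical sketch: flatten raw JSON order events and aggregate them into a
# daily summary suitable for a warehouse or presentation layer.
spark = SparkSession.builder.appName("order_events_etl").getOrCreate()

raw = spark.read.json("/data/raw/order_events/")  # hypothetical source path

daily_summary = (
    raw
    .withColumn("order_date", F.to_date("event_timestamp"))
    .groupBy("order_date", "region")
    .agg(
        F.countDistinct("order_id").alias("orders"),
        F.sum("amount").alias("revenue"),
    )
)

# Write a partitioned presentation layer (Parquet here; a warehouse table in practice).
daily_summary.write.mode("overwrite").partitionBy("order_date").parquet("/data/marts/daily_orders/")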
Job Title: Senior DevOps Engineer / AWS Solutions Architect
Location: Remote
Job Type: Full-Time / Contract
Experience: 10+ years in AWS Cloud Services, DevOps, Cloud Strategy, Security, and Networking
Education: Bachelor's degree in Computer Science, Engineering, or a related field preferred (or equivalent professional experience)
Certifications: AWS Certified Solutions Architect – Professional (or equivalent DevOps certification)

Job Overview
We are seeking an experienced Senior DevOps Engineer or AWS Solutions Architect to lead the development, optimization, and implementation of cloud-based solutions on AWS. The ideal candidate will possess over a decade of expertise in cloud architecture, security, networking, and DevOps, with a focus on driving organizational cloud strategies. This role is critical in guiding our teams toward adopting scalable, secure, and efficient solutions, and will serve as the key advisor on AWS best practices.

Key Responsibilities:

Cloud Architecture & Strategy
• Design, develop, and implement sophisticated AWS cloud solutions aligned with business objectives.
• Establish and drive the cloud roadmap in line with industry best practices and organizational strategy.

DevOps Automation & Optimization
• Architect and refine CI/CD pipelines, automation workflows, and Infrastructure as Code (IaC) solutions using tools such as Terraform, AWS CloudFormation, and Ansible.
• Implement continuous integration, deployment, and monitoring frameworks to ensure high availability and robust security.

Security & Networking
• Ensure cloud solutions comply with the highest security standards, covering IAM policies, VPC configurations, data encryption, and regulatory compliance (see the illustrative sketch at the end of this posting).
• Design and manage network architectures optimized for scalability, performance, and security within AWS environments.

Collaboration & Mentorship
• Collaborate with cross-functional teams to guide cloud adoption strategies and integration.
• Serve as a subject matter expert on AWS best practices, providing mentorship to team members on cloud and DevOps methodologies.

Innovation & Continuous Improvement
• Stay abreast of industry trends, emerging technologies, and AWS service updates to drive innovation and optimization.
• Advocate for best practices in security, cloud architecture, and operational efficiency.

Required Qualifications
Experience:
• 10+ years of hands-on experience in AWS cloud architecture, DevOps, security, and networking.
• Proven track record in designing scalable, secure, and high-performance cloud solutions.

Technical Expertise:
• AWS Services: In-depth knowledge of core AWS services, including EC2, S3, Lambda, VPC, RDS, CloudWatch, and IAM.
• DevOps Tools: Strong proficiency in CI/CD tools (e.g., Jenkins, GitLab CI/CD), containerization technologies (Docker, Kubernetes), and IaC tools like Terraform or AWS CloudFormation.
• Security & Networking: Solid understanding of cloud security principles, networking concepts, and compliance requirements.
• Certifications: AWS Certified Solutions Architect – Professional or equivalent DevOps certification.

Preferred Qualifications
• Experience with multi-cloud environments and cross-platform integrations.
• Strong leadership, coaching, and mentorship skills with the ability to influence teams effectively.
• Knowledge of serverless architectures, microservices, and modern cloud-native design patterns.

What We Offer
• Competitive Compensation: Attractive salary and benefits package.
• Growth Opportunities: Continuous learning and professional development.
• Collaborative Environment: A chance to make a significant impact on technical delivery and product success.

How to Apply
Are you passionate about AWS, DevOps, and driving cloud innovation? If so, we'd love to hear from you! Please send your resume along with a cover letter outlining your relevant experience to: 📧 careers@saradysol.com

We look forward to exploring the possibility of having you join our team!
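To make the Security & Networking responsibilities above more concrete, here is a small, hedged sketch of one kind of guardrail check such a role might automate: auditing S3 buckets for missing default encryption with boto3. It is an assumption about tooling for illustration, not a description of our environment.

Example (Python, illustrative only):

import boto3
from botocore.exceptions import ClientError

# Hypothetical audit: flag S3 buckets that have no default server-side encryption.
s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        s3.get_bucket_encryption(Bucket=name)
        status = "default encryption enabled"
    except ClientError as err:
        if err.response["Error"]["Code"] == "ServerSideEncryptionConfigurationNotFoundError":
            status = "NO DEFAULT ENCRYPTION"
        else:
            raise
    print(f"{name}: {status}")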
About the job

Job Title: Data Engineer (Power BI & MS BI Stack - SSRS, SSAS, SSMS, SSIS)
Experience Level: 7+ Years (Immediate Joiners are Preferred)
Location: Bengaluru
Employment Type: Full-Time (Remote)

Job Description:
We are seeking a highly skilled and experienced Data Warehouse Analyst / Data Engineer to support the end-to-end lifecycle of enterprise data platforms, including data warehouses, data lakes, and data marts, ensuring high performance, data integrity, and security. The ideal candidate will have deep expertise in data modeling, ETL processes, metadata management, and business intelligence systems, along with the ability to deliver robust data presentation layers and actionable insights. Experience with Databricks is a strong plus.

Key Responsibilities:
• Design, develop, and maintain enterprise-wide data warehouse platforms, ensuring optimal performance, security, and scalability.
• Develop and optimize ETL processes using MS SQL Server Integration Services (SSIS), T-SQL, and other tools (see the illustrative sketch after this posting).
• Build and maintain data models, metadata catalogs, and ER diagrams to support analytical needs.
• Proactively monitor system performance and data integrity; conduct preventive and corrective maintenance.
• Work with cross-functional teams to interpret data and create meaningful business intelligence reports using tools such as Power BI.
• Leverage Python, PySpark, and JSON for advanced data processing, transformation, and integration.
• Create and manage data repositories and presentation layers aggregating information across diverse source systems.
• Contribute to the development and optimization of cloud-based data platforms.
• Implement and promote best practices in data architecture, data security, and governance.
• Work with Databricks to support scalable data processing pipelines and advanced analytics workflows (preferred experience).

Qualifications:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• 5+ years of experience in data warehouse development, ETL, and BI systems.
• Strong background in data modeling, data architecture, and metadata management.
• Proficient in SQL, ETL tools, and database performance tuning.
• Hands-on experience with the MS BI Stack: SSIS, SSRS, SSAS, and SSMS.
• Proficiency in analytics tools such as Power BI.
• Experience with cloud data platforms (Azure, AWS, or GCP) is preferred.
• Experience with Databricks is a strong plus.
• Proficiency in Python, PySpark, and working with structured/unstructured data.

Skills and Competencies:
• Excellent problem-solving, debugging, and optimization skills.
• Strong collaboration and communication skills to engage with business and technical teams.
• Ability to synthesize complex data into meaningful insights and visualizations.
• Familiarity with data governance, compliance, and security standards.

Why Join Us?
• Opportunity to work with cutting-edge technology and tools.
• A collaborative and inclusive work culture.
• Opportunities for professional growth and certification sponsorship.
• Flexible working hours and hybrid work options.

How to Apply:
If you're ready to take the next step in your career, send your updated resume to careers@saradysol.com with the subject line "Data Analytics Engineer/Data Analyst Role". 💼 Alternatively, use LinkedIn's Easy Apply feature to connect with us effortlessly!

Be part of a team that leverages technology to achieve excellence. Let's shape the future together! 🚀
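As a companion to the SSIS/T-SQL responsibility above, here is a minimal sketch of an incremental dimension load expressed as a T-SQL MERGE driven from Python via pyodbc. The connection string, schema, and table names are hypothetical; in practice this step would typically live inside an SSIS package.

Example (Python, illustrative only):

import pyodbc

# Hypothetical connection; server, database, and tables are placeholders.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=dw-server;DATABASE=EnterpriseDW;Trusted_Connection=yes;"
)

merge_sql = """
MERGE dw.DimCustomer AS target
USING stg.Customer AS source
    ON target.CustomerKey = source.CustomerKey
WHEN MATCHED THEN
    UPDATE SET target.Email = source.Email, target.UpdatedAt = SYSUTCDATETIME()
WHEN NOT MATCHED BY TARGET THEN
    INSERT (CustomerKey, Email, UpdatedAt)
    VALUES (source.CustomerKey, source.Email, SYSUTCDATETIME());
"""

with conn:
    conn.cursor().execute(merge_sql)  # pyodbc commits when the 'with' block exits cleanly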
Job description

Job Title: Lead Data Engineer (Databricks Expert – with MS BI Stack Preferred)
Experience Level: 10+ Years (with 5+ Years in Databricks) – Immediate Joiners Preferred
Location: Bengaluru (Remote)
Employment Type: Full-Time – Remote

Job Description:
We are seeking a Lead Data Engineer with deep hands-on expertise in Databricks to lead the development of scalable data processing pipelines, real-time analytics workflows, and enterprise-level data lakehouses. The ideal candidate will have end-to-end experience in building and optimizing complex data systems on the Databricks platform, and can independently lead technical initiatives, architect solutions, and mentor data teams. While Databricks expertise is mandatory, experience with the Microsoft BI Stack (SSIS, SSRS, SSAS, SSMS) will be considered a strong advantage.

Key Responsibilities:
• Lead the design and implementation of scalable, high-performance data pipelines using Databricks (Delta Lake, PySpark, SQL, MLflow); see the illustrative sketch after this posting.
• Define and drive data architecture, modeling, and governance strategies.
• Build and optimize ETL/ELT workflows and automate data transformation processes.
• Collaborate with analysts and data scientists to support advanced analytics and ML model integration.
• Ensure cost-effective, reliable, and high-performing data systems in a cloud-native environment.
• Translate business requirements into technical solutions with reusable, modular designs.
• Set and enforce best practices in code quality, CI/CD, testing, and observability for Databricks pipelines.
• Work with the MS BI Stack (SSIS, SSRS, SSAS) to support enterprise reporting systems.

Required Qualifications:
• 10+ years of overall experience in data engineering, with 5+ years of strong hands-on Databricks experience.
• Proven expertise in PySpark, Databricks SQL, and Delta Lake.
• Deep understanding of data lakehouses, distributed systems, and data warehousing.
• Strong experience with cloud platforms, preferably Azure.
• Proficient in Python and processing large structured/unstructured datasets.
• Track record of leading end-to-end Databricks projects, from ingestion to analytics.
• Strong experience with CI/CD, Git workflows, job orchestration, and monitoring.
• Exceptional problem-solving and performance optimization capabilities.
• Experience with the Microsoft BI Stack (SSIS, SSRS, SSAS, SSMS).
• Familiarity with Power BI or similar data visualization tools.
• Awareness of data security, compliance, and governance frameworks.
• Exposure to Agile/Scrum practices and cross-functional team collaboration.

Why Join Us?
• Opportunity to lead high-impact data initiatives using cutting-edge platforms like Databricks.
• Innovative and fast-paced culture with a focus on learning and growth.
• Access to certifications, learning resources, and mentoring opportunities.
• Remote work flexibility with supportive and transparent leadership.

How to Apply:
If you're a Databricks expert ready to take the lead in driving data engineering excellence, send your resume to 📩 careers@saradysol.com with the subject line "Lead Data Engineer – Databricks (Remote)". You can also apply via LinkedIn's Easy Apply feature.

Let's build the future of data together! 🚀
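For a sense of the Databricks pipeline work referenced above, the following is a minimal sketch, assuming a hypothetical lakehouse.silver_orders Delta table and a Parquet change feed, of an idempotent upsert using the Delta Lake merge API. It illustrates the pattern only; real jobs would add schema enforcement, monitoring, and orchestration.

Example (Python, illustrative only):

from pyspark.sql import SparkSession
from delta.tables import DeltaTable

# Hypothetical upsert of change records into a Delta Lake "silver" table.
spark = SparkSession.builder.appName("orders_upsert").getOrCreate()

changes = spark.read.format("parquet").load("/mnt/landing/orders_changes/")  # placeholder path

target = DeltaTable.forName(spark, "lakehouse.silver_orders")  # placeholder table
(
    target.alias("t")
    .merge(changes.alias("s"), "t.order_id = s.order_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)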
About the job

Job Title: Freelance Technical Interviewer (Hourly Pay)
Company: Saradysol Tech Ventures Private Limited
Location: Remote
Type: Hourly Engagement

Job Summary
We are seeking a highly talented and versatile Technical Interviewer who can conduct professional technical interviews across a wide range of technology roles. The ideal candidate will have excellent communication skills, strong technical depth across multiple domains, and a professional on-camera presence. This role is hourly-based, offering flexibility while helping us evaluate and select top technical talent.

Responsibilities
• Conduct technical interviews for candidates across multiple technology domains (Java, .NET, Python, DevOps, Cloud, Data Engineering, QA Automation, etc.).
• Evaluate candidates' technical expertise, problem-solving ability, and communication skills.
• Provide detailed feedback and structured evaluation reports after each interview.
• Represent the company professionally with excellent communication and a confident on-camera presence.
• Collaborate with the hiring team to refine interview processes and ensure a high-quality experience for candidates.

Requirements
• 10+ years of IT experience with strong knowledge across multiple technologies (Programming, Cloud, DevOps, Data, QA, Architecture, etc.).
• Prior experience conducting technical interviews is highly desirable.
• Excellent verbal and written communication skills.
• Strong analytical and problem-solving abilities.
• Professional appearance and presentation in video interviews.
• Ability to evaluate both technical depth and soft skills.

Compensation
Hourly pay at competitive rates, based on expertise and experience.

Why Join Us?
• Flexible remote opportunity.
• Be part of a growing tech consultancy.
• Play a critical role in shaping our technical hiring process.

How to Apply:
Send your updated resume to careers@saradysol.com with the subject line "Freelance Technical Interviewer – Hourly Engagement" or connect with us via LinkedIn's Easy Apply option. Let's shape the future together! 🚀
About the job

Job Title: Jira Administrator (Part-Time)
Location: Remote (preferred, with ability to adjust to US time zones for syncs)
Commitment: 20 hours/week (Contract)
Experience Level: 10+ Years

Job Summary:
We are seeking an experienced Jira Administrator to provide part-time consulting support. The ideal candidate will have a strong background in configuring, customizing, and maintaining Jira for enterprise teams, as well as integrating Jira with Bitbucket for seamless DevOps visibility. This is a consulting engagement, not a full-time role, and offers flexible hours.

Responsibilities
• Administer and configure Jira projects, workflows, fields, screens, and permissions.
• Build custom dashboards and advanced reports using JQL and gadgets for leadership, pods, and cross-team visibility (see the illustrative sketch after this posting).
• Set up and manage pods/team structures in Jira, ensuring each has the right boards, workflows, and permissions.
• Integrate Jira with Bitbucket for commit tracking, pull request linking, and DevOps pipeline visibility.
• Automate repetitive tasks and workflows using Jira Automation or scripting tools.
• Provide guidance on Jira best practices, governance, and user training where needed.
• Troubleshoot Jira-related issues and support continuous improvement.

Qualifications
• 10+ years of hands-on experience as a Jira Administrator.
• Proven expertise in building dashboards and reports tailored to executive and team needs.
• Strong understanding of Jira project configuration: workflows, schemes, issue types, permissions.
• Experience integrating Jira with Bitbucket (and optionally with Confluence).
• Familiarity with automation (Jira Automation, ScriptRunner, or similar).
• Excellent problem-solving and communication skills.
• Prior consulting experience is a plus.

How to Apply:
Send your updated resume to careers@saradysol.com with the subject line "Part-Time Jira Administrator – Contractor Role" or connect with us via LinkedIn's Easy Apply option. Let's build intelligent data solutions together! 🚀
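To illustrate the JQL and reporting responsibility above, here is a minimal sketch that pulls recently updated issues through the Jira Cloud REST API. The site URL, credentials, project key, and JQL are placeholders, not details of our Jira instance; a real dashboard would typically be built with Jira gadgets or a BI tool on top of such queries.

Example (Python, illustrative only):

import requests
from requests.auth import HTTPBasicAuth

# Placeholder site and credentials; use an API token in practice.
JIRA_URL = "https://example.atlassian.net"
auth = HTTPBasicAuth("svc-jira-admin@example.com", "<api-token>")

# Hypothetical JQL: open work in project PLAT touched in the last 7 days.
jql = "project = PLAT AND status != Done AND updated >= -7d ORDER BY priority DESC"

resp = requests.get(
    f"{JIRA_URL}/rest/api/2/search",
    params={"jql": jql, "maxResults": 50, "fields": "summary,assignee,status"},
    auth=auth,
    timeout=30,
)
resp.raise_for_status()

for issue in resp.json()["issues"]:
    fields = issue["fields"]
    assignee = (fields.get("assignee") or {}).get("displayName", "Unassigned")
    print(f'{issue["key"]}: {fields["status"]["name"]} - {assignee} - {fields["summary"]}')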