We are seeking highly skilled and experienced Flutter Developers to join our team. The ideal candidates will have strong expertise in developing cross-platform mobile applications using Flutter, with a deep understanding of the Android and iOS ecosystems. You will be responsible for delivering high-quality, scalable, and maintainable mobile applications that meet business objectives.

Key Responsibilities
- Design and develop advanced cross-platform mobile applications using Flutter for Android and iOS.
- Build and customize UI components using Flutter widgets.
- Integrate with RESTful APIs and third-party packages.
- Ensure performance, quality, and responsiveness of applications.
- Collaborate with UI/UX designers, product managers, and backend developers to deliver intuitive and functional user experiences.
- Implement state management solutions such as Provider, Riverpod, or Bloc.
- Ensure responsive design and smooth functionality across different screen sizes and device types.
- Implement localization and internationalization to support multiple languages.
- Set up and manage CI/CD pipelines for mobile app development.
- Configure and implement push notification services.
- Conduct code reviews, write unit and integration tests, and optimize performance.

Required Skills & Qualifications
- 3+ years of experience in mobile app development, with at least 2 years in Flutter.
- Strong knowledge of the Flutter SDK.
- Solid understanding of native mobile app development for Android and iOS.
- Experience with widget customization and UI rendering.
- Proficiency in integrating and consuming RESTful APIs.
- Hands-on experience with state management libraries such as Provider, Bloc, or Riverpod.
- Familiarity with package management using pub.dev.
- Experience setting up CI/CD pipelines using tools such as Azure DevOps, GitHub Actions, Jenkins, or similar.
- Knowledge of responsive design principles and adaptive layouts.
- Experience implementing localization and internationalization.
- Proficiency in handling push notifications across Android and iOS platforms.
- Strong problem-solving and debugging skills.

Preferred Qualifications
- Experience publishing mobile apps to the App Store and Google Play.
- Familiarity with Agile methodologies and tools such as JIRA.
- Understanding of app security, permissions, and performance optimization.
- Experience with version control systems (Git).
Technical Project Manager

Syren is looking for multiple Technical Project Managers for its rapidly growing engineering teams. This is a great opportunity to help design, architect, and build well-engineered applications for our Fortune 50 clients and more. We are looking for strong, passionate engineering leaders to join our team as we scale the business over the next few years: highly motivated people with the ability to work in a fast-paced environment, cloud services experience, strong collaboration skills, and enthusiasm to work on a variety of engineering efforts at Syren. Join Syren's fast-growing set of excellent engineering teams.

Job Summary
The Technical Project Manager will oversee the end-to-end execution of data projects, ensuring alignment with business objectives, timely delivery, and high-quality outcomes. This role requires a blend of technical expertise in big data technologies, project management proficiency, and strong leadership to coordinate cross-functional teams, including data engineers, UI engineers, and business stakeholders.

Key Responsibilities
Project Planning and Execution:
- Define project scope, objectives, and deliverables in collaboration with stakeholders.
- Develop detailed project plans, including timelines, milestones, and resource allocation.
- Manage project execution, ensuring adherence to scope, budget, and schedule.
- Identify and mitigate risks, resolving issues that may impact project success.
Technical Leadership:
- Collaborate with data engineers to design scalable and efficient data pipelines.
- Ensure data solutions meet performance, security, and compliance requirements.
Stakeholder Management:
- Act as the primary point of contact for stakeholders, including business leaders, IT teams, and external vendors.
- Translate business requirements into technical specifications and communicate progress effectively.
- Facilitate regular status meetings, providing updates on project health, risks, and outcomes.
Team Coordination:
- Lead cross-functional teams, including data engineers, analysts, and UI engineers.
- Foster collaboration and ensure clear communication across all team members.
- Manage resource allocation and resolve conflicts to maintain team productivity.
Quality Assurance and Delivery:
- Oversee testing and validation of solutions to ensure accuracy and reliability.
- Ensure deliverables meet quality standards and align with business expectations.
- Drive post-project evaluations to capture lessons learned and improve future processes.

Qualifications
Education:
- Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field. A Master's degree or MBA is a plus.
Experience:
- 5+ years of experience in project management, with at least 3 years managing big data or analytics projects.
- Proven track record of delivering complex, large-scale data projects on time and within budget.
- Hands-on experience with big data technologies (e.g., Spark, Databricks).
- Experience working with cloud platforms (e.g., AWS, Azure, Google Cloud) for big data solutions.
- Familiarity with Agile and Scrum methodologies; PMP, Agile, or similar certifications are highly desirable.
Technical Skills:
- Strong understanding of big data ecosystems, including data ingestion, storage, processing, and analytics.
- Knowledge of programming languages (e.g., Python, Scala, or Java) and SQL for data querying.
- Familiarity with ETL/ELT processes, data lakes, and data warehousing concepts.
- Experience with DevOps tools (e.g., Jenkins, Docker, Kubernetes) for CI/CD pipelines is a plus.
Soft Skills:
- Excellent leadership and team management skills, with the ability to motivate and guide diverse teams.
- Strong communication and interpersonal skills to engage with technical and non-technical stakeholders.
- Exceptional problem-solving and decision-making abilities.
- Ability to manage multiple priorities in a fast-paced environment.
Certifications (Preferred):
- PMP (Project Management Professional), CAPM, or Agile/Scrum certifications.
- Certifications in cloud platforms (e.g., AWS Certified Big Data, Google Cloud Professional Data Engineer).

About Syren
Syren Cloud is a start-up with a mission to help clients transform their data and engineering operations for the better. We provide technology services to Fortune 50 clients, including the likes of Microsoft, GitHub, Johnson & Johnson, and UiPath, among many others. Over the past year, we have grown to become a 150-strong company and intend to reach over 400 by the end of 2022. Syren believes in a work environment that enables people to do their best at delivering exceptional services to our customers. If you are looking to make an impact by helping our customers deliver great experiences to their customers, Syren is right for you.

Working at Syren
Syren provides an above-industry salary, learning opportunities, a great work environment, and the chance to be part of a fast-growing space. Syren is an equal opportunity employer. We welcome and encourage diversity in the workplace regardless of race, gender, religion, age, sexual orientation, gender identity, disability, or veteran status. At Syren, we strive to bring together diverse teams with world-class experience and different talents.
Company Overview:
SyrenCloud Inc. is a leading Data Engineering company that specializes in solving complex challenges in the Supply Chain Management industry. With a growing team of 350+ professionals and a solid revenue of $25M+, our mission is to empower organizations with cutting-edge software engineering solutions that optimize operations, harness supply chain intelligence, and drive sustainable growth. We prioritize both professional growth and employee well-being, maintaining a positive work culture while offering opportunities for continuous learning and advancement.

Role Overview:
We are seeking a highly motivated and experienced Scrum Master to join our agile delivery team. The ideal candidate will play a pivotal role in enabling Agile best practices, removing impediments, and supporting cross-functional teams to deliver high-quality software solutions, especially in data engineering projects on the Azure cloud.

Key Responsibilities:
Facilitation and Coaching
- Guide and coach the team on Agile/Scrum principles and practices
- Facilitate core Scrum ceremonies: daily stand-ups, sprint planning, reviews, and retrospectives
- Foster a healthy Agile mindset and support team ownership
Impediment Removal & Team Support
- Identify and eliminate blockers affecting sprint progress
- Collaborate with the Product Owner to refine and maintain a well-defined backlog
- Enable smooth sprint execution and help prioritize tasks
Collaboration & Communication
- Promote open, transparent communication within the team and across departments
- Encourage stakeholder engagement and cross-team alignment
- Shield the team from unnecessary external interruptions
Continuous Improvement
- Drive team retrospectives and implement actionable improvements
- Monitor Agile metrics to measure progress and optimize team performance
- Advocate for a culture of experimentation and growth
Education & Agile Advocacy
- Champion Agile and Scrum across the organization
- Support organizational Agile transformation initiatives
- Stay updated with the latest industry practices and Agile frameworks

Required Qualifications:
- Proven experience as a Scrum Master in Agile software development environments
- Experience working on data engineering projects, particularly within Azure Cloud ecosystems
- Strong understanding of Agile principles, frameworks, and ceremonies
- Excellent facilitation, conflict resolution, and servant leadership skills
- Ability to communicate effectively with both technical and non-technical stakeholders

Preferred Certifications:
- Certified Scrum Master (CSM) or Professional Scrum Master (PSM)
- Additional Agile certifications such as SAFe, PMI-ACP, or ICP-ACC are a plus
We are seeking a hands-on and detail-oriented Data Support Engineer to support, monitor, and maintain our data platforms and ETL pipelines. This role includes managing production data operations, troubleshooting technical issues, and supporting infrastructure-level tasks such as scaling applications, modifying cluster configurations, and optimizing resources for performance and cost.

Responsibilities:
- Monitor, manage, and maintain data pipelines to ensure reliable and timely data processing.
- Investigate and resolve production issues and data anomalies in a timely manner, ensuring minimal business disruption.
- Triage and resolve support tickets related to data functionality, job failures, performance issues, and availability.
- Troubleshoot technical problems and respond to customer or internal team queries within defined SLA timelines.
- Perform root cause analysis for incidents and recommend/implement long-term solutions and preventive measures.
- Scale infrastructure components up or down, such as Databricks clusters and Azure Data Factory integration runtimes, based on pipeline needs.
- Modify and manage configuration settings for performance tuning, cost optimization, and capacity planning.
- Implement and execute data imports/exports for internal users or customers as needed.
- Maintain and update support documentation, standard operating procedures (SOPs), and knowledge base articles.
- Detect and resolve potential issues proactively before they impact the platform or downstream users.
- Collaborate with engineering teams on infrastructure and platform improvements to enhance data reliability and scalability.

Requirements & Skills:
- At least 3 years of experience in ETL development, production monitoring, or technical support roles.
- Practical knowledge of Databricks, Azure Data Factory (ADF), Python, and SQL.
- Experience working with cloud infrastructure related to data platforms, especially in Azure.
- Familiarity with scaling and configuration management of cloud-based compute resources (e.g., Databricks clusters, ADF IR).
- Proficiency in scripting languages (e.g., Python, Bash) for automation and data handling.
- Understanding of database systems, data modeling, and data validation techniques.
- Experience with data integration and ETL workflows in production environments.
- Knowledge of data backup, recovery, and incident management processes.
- Strong analytical and problem-solving skills for diagnosing complex technical issues.
- High attention to detail and a commitment to data quality and operational excellence.
- Excellent verbal and written communication skills, with a collaborative approach to teamwork.

Preferred Qualifications:
- Familiarity with monitoring and alerting tools (e.g., Azure Monitor, Log Analytics).
- Understanding of CI/CD practices, version control (Git), and DevOps collaboration.
- Exposure to data governance, compliance, and data security best practices.
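To illustrate the cluster-scaling duty above, here is a minimal, hedged sketch of how a resize request for a Databricks cluster might be assembled for the public Clusters REST API. The host and cluster ID are placeholders, and the endpoint path should be verified against your workspace's API version; this builds the request rather than sending it, so it can be paired with any HTTP client.

```python
import json

# Placeholder workspace host; a real deployment would read this from config.
DATABRICKS_HOST = "https://example.azuredatabricks.net"

def build_resize_request(cluster_id: str, num_workers: int) -> tuple[str, str]:
    """Return the URL and JSON body for a Databricks cluster resize call."""
    if num_workers < 1:
        raise ValueError("num_workers must be at least 1")
    url = f"{DATABRICKS_HOST}/api/2.0/clusters/resize"
    body = json.dumps({"cluster_id": cluster_id, "num_workers": num_workers})
    return url, body
```

In practice the pair would be passed to an HTTP client with a bearer token, e.g. `requests.post(url, data=body, headers={"Authorization": f"Bearer {token}"})`.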
Job Title: Data Scientist
Location: Hyderabad (Hybrid, Remote)
Job Type: Full-time
Experience Level: 3-6 Years

Responsibilities:
Data Analysis and Modelling:
- Utilize statistical and machine learning techniques to analyse large datasets and derive actionable insights.
- Develop predictive models and algorithms to address business challenges and improve decision-making processes.
Feature Engineering and Data Preprocessing:
- Work with raw data to perform data cleaning, transformation, and feature engineering to prepare datasets for analysis.
- Collaborate with cross-functional teams to gather and understand data requirements.
Model Deployment and Monitoring:
- Implement and deploy machine learning models into production environments.
- Monitor model performance and make necessary adjustments to ensure continued accuracy and relevance.
Collaboration and Communication:
- Collaborate with data engineers, analysts, and business stakeholders to understand their requirements and provide data-driven solutions.
- Communicate complex findings and insights in a clear and understandable manner to both technical and non-technical stakeholders.
Continuous Learning and Research:
- Stay abreast of industry trends and advancements in data science and machine learning.
- Continuously enhance skills through training and self-directed learning.

Requirements:
- Education and Experience: Bachelor's or Master's degree in Computer Science, Statistics, or a related field. Proven experience as a Data Scientist.
- Technical Skills: Proficiency in programming languages such as Python or R. Experience with machine learning frameworks (e.g., TensorFlow, PyTorch, scikit-learn). Strong knowledge of statistical analysis and data visualization tools.
- Problem-Solving: Ability to translate business problems into analytical frameworks and develop data-driven solutions.
- Communication Skills: Strong interpersonal and communication skills with the ability to explain complex concepts to non-technical stakeholders.
- Team Collaboration: Demonstrated ability to work effectively in a collaborative team environment.
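The model-monitoring duty above usually starts with evaluation metrics. As an illustrative sketch only (not a prescribed tool for this role), here is a dependency-free computation of precision, recall, and F1 for binary predictions:

```python
def classification_metrics(y_true, y_pred):
    """Precision, recall, and F1 for binary labels (1 = positive class)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    # Harmonic mean of precision and recall; 0.0 when both are zero.
    f1 = (2 * precision * recall / (precision + recall)) if (precision + recall) else 0.0
    return {"precision": precision, "recall": recall, "f1": f1}
```

In day-to-day work these would come from scikit-learn's `precision_recall_fscore_support`, but spelling them out keeps the definitions explicit.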
Detailed Job Description:
We are seeking a highly skilled Data Engineer with strong experience in Azure Data Services, SAP integration, and modern data governance tools like Unity Catalog. The ideal candidate will have a deep understanding of data extraction from enterprise systems, data movement across platforms, and performance-optimized data pipeline development.

- Implement and manage Unity Catalog within the data ecosystem for data governance, security, and access control.
- Integrate with SAP systems, including connecting to key master data tables and leveraging SAP APIs for data extraction.
- Develop and maintain data ingestion pipelines using APIs for real-time and batch data extraction.
- Work with Azure Blob Storage and Azure Synapse Analytics for staging and transformation of large datasets.
- Design and implement data movement strategies from Synapse to Microsoft Fabric, leveraging native and custom pipelines.
- Handle and transform multiple data formats, including JSON, XML, and CSV.
- Build and maintain Azure Data Factory (ADF) pipelines, including setup of Linked Services, Datasets, and Integration Runtime configurations.
- Optimize pipeline performance through tuning, parallelization, and resource management.
- Implement robust monitoring and logging mechanisms for data pipelines to ensure reliability, traceability, and operational visibility.

Skills Required:
- Extensive experience in data migration, including SAP to Data Lake, SQL databases, and API-based integrations.
- Proficiency in handling diverse data formats: JSON, CSV, Parquet, XML.
- Strong hands-on expertise with Azure services: ADLS for scalable storage, ADF for ETL/ELT pipelines, and Synapse for analytics and reporting.
- Strong hands-on expertise with Databricks and Unity Catalog.
- Strong knowledge of Synapse-to-Fabric data lakes and pipelines.
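The multi-format handling described above typically means normalizing each payload into a common record shape before staging. A minimal stdlib-only sketch, with illustrative field names rather than a real schema:

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

def parse_json(payload: str) -> list[dict]:
    """Accept either a JSON object or a JSON array of records."""
    data = json.loads(payload)
    return data if isinstance(data, list) else [data]

def parse_csv(payload: str) -> list[dict]:
    """Read CSV text with a header row into a list of dicts."""
    return list(csv.DictReader(io.StringIO(payload)))

def parse_xml(payload: str, record_tag: str = "record") -> list[dict]:
    """Flatten each <record> element's children into a dict."""
    root = ET.fromstring(payload)
    return [{child.tag: child.text for child in rec} for rec in root.iter(record_tag)]
```

A real pipeline would add schema validation and type casting on top, but the uniform list-of-dicts output is what lets downstream stages stay format-agnostic.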
This role involves coordinating cross-functional teams, managing timelines and resources, and ensuring project goals align with the organization's objectives. The ideal candidate is an excellent communicator, highly organized, and skilled at stakeholder management.

Key Responsibilities
- Project Planning: Define project scope, objectives, and deliverables in collaboration with stakeholders, ensuring alignment with organizational goals.
- Resource Management: Allocate resources and coordinate team members to ensure efficient project execution.
- Timeline Management: Develop and maintain project schedules, track milestones, and ensure timely completion of deliverables.
- Stakeholder Communication: Act as the primary point of contact for stakeholders, providing regular updates on project progress, risks, and outcomes.
- Risk Management: Identify potential risks and develop mitigation strategies to keep projects on track.
- Team Coordination: Facilitate collaboration among cross-functional teams, ensuring clarity of roles and responsibilities.
- Documentation: Maintain comprehensive project documentation, including plans, reports, and status updates.
- Quality Assurance: Ensure deliverables meet quality standards and stakeholder expectations.
- Change Management: Manage scope changes and communicate impacts to stakeholders effectively.
- Post-Project Evaluation: Conduct project reviews to assess outcomes, gather feedback, and identify lessons learned for future projects.

Qualifications
Education: Bachelor's degree in Business Administration, Management, Communications, or a related field. A Master's degree or PMP certification is a plus.
Experience: 2-5 years of project management experience in non-technical fields such as marketing, operations, event planning, or business administration.
Skills:
- Strong organizational and time-management skills.
- Excellent verbal and written communication skills.
- Proficiency in project management tools (e.g., Trello, Asana, Microsoft Project, or similar).
- Ability to manage multiple projects simultaneously in a fast-paced environment.
- Strong problem-solving and decision-making abilities.
- Experience with budget management and resource allocation.
- Familiarity with risk management and change management processes.
Personal Attributes: Proactive, detail-oriented, and adaptable. Strong interpersonal skills with the ability to build relationships and influence stakeholders. Ability to work independently and as part of a team.

Preferred Qualifications
- Certification in project management (e.g., PMP, CAPM, PRINCE2, or Agile).
- Experience in [specific industry, e.g., marketing, consulting, event management, etc.].
- Familiarity with data analysis or reporting tools (e.g., Excel, Google Analytics) for tracking project metrics.
Company Overview:
Syren Cloud Inc. is a leading Data Engineering company that specializes in solving complex challenges in the Supply Chain Management industry. We have a team of over 350 employees and a robust revenue of $25M+. Our mission is to empower organizations with cutting-edge software engineering solutions that optimize operations, harness supply chain intelligence, and drive sustainable growth. We value both growth and employee well-being, striving to maintain a positive work environment while providing opportunities for professional development.

Role Summary:
As a Business Intelligence Engineer, you'll create and deliver data analytics solutions using tools like Power BI. You'll collaborate with business partners to define requirements, design data models, and build dashboards that provide actionable insights. Your work will help automate the digital supply chain for a global audience.

Key Responsibilities
- Develop dashboards and reports using Power BI.
- Design data models to transform raw data into insights.
- Work with the MS SQL Server BI stack (SSRS, T-SQL, Power Query, MDX, DAX).
- Collaborate with end users to gather and translate requirements.
- Enhance existing BI systems and ensure data security.

Qualifications
- 4+ years of Power BI experience.
- Proficiency in Power BI, SQL Server, T-SQL, Power Query, MDX, and DAX.
- Strong analytical and communication skills.
- Knowledge of database management systems and OLAP/OLTP.
- Comfortable working under deadlines in agile environments.
As a Full Stack Developer, you will be responsible for developing and maintaining both front-end and back-end components of web applications. You will utilize the .NET framework and related technologies for server-side development while leveraging React.js to build interactive and responsive user interfaces on the client side. Your role will involve building and maintaining RESTful APIs to facilitate communication between front-end and back-end systems, as well as implementing authentication, authorization, and data validation mechanisms within APIs.

In terms of Database Management, you will design, implement, and manage databases using technologies such as SQL Server or Azure SQL Database. Your responsibilities will include ensuring efficient data storage, retrieval, and manipulation to support application functionality. You will also be involved in Data Pipeline Management, where you will design, implement, and manage data pipelines using technologies such as PySpark, Python, and SQL. Building and maintaining pipelines in Databricks will be part of your tasks.

Cloud Services Integration will be a key aspect of your role, requiring you to utilize Azure services for hosting, scaling, and managing web applications. You will implement cloud-based solutions for storage, caching, and data processing, as well as configure and manage Azure resources such as virtual machines, databases, and application services. In terms of DevOps and Deployment, you will implement CI/CD pipelines for automated build, test, and deployment processes using Jenkins. It will be essential to ensure robust monitoring, logging, and error handling mechanisms are in place.

Documentation and Collaboration are important aspects of this role, where you will document technical designs, implementation details, and operational procedures. Collaborating with product managers, designers, and other stakeholders to understand requirements and deliver high-quality solutions will be part of your responsibilities. Continuous Learning is encouraged in this role, requiring you to stay updated with the latest technologies, tools, and best practices in web development and cloud computing. You will continuously improve your skills and knowledge through self-learning, training, and participation in technical communities.

Requirements for this role include a Bachelor's degree or equivalent experience, along with 5+ years of software engineering experience in reliable and resilient microservice development and deployment. Strong knowledge of RESTful APIs, React.js, Azure, Python, PySpark, Databricks, TypeScript, Node.js, relational databases like SQL Server, and NoSQL data stores such as Redis and ADLS is essential. Experience with data engineering, Jenkins, Artifactory, and automation testing frameworks is desirable. Prior experience with Agile, CI/CD, Docker, Kubernetes, Kafka, Terraform, or similar technologies is also beneficial. A passion for learning and disseminating new knowledge is highly valued in this role.
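The "data validation mechanisms within APIs" mentioned above can be as simple as checking a request body against an expected shape before it reaches business logic. A framework-independent, illustrative sketch (the field names are hypothetical, not part of any real schema in this role):

```python
# Illustrative required fields for a request body; a real API would derive
# these from its schema or use a validation library such as pydantic.
REQUIRED_FIELDS = {"email": str, "age": int}

def validate_payload(payload: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the payload is valid."""
    errors = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in payload:
            errors.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected_type):
            errors.append(f"{field} must be {expected_type.__name__}")
    return errors
```

An endpoint would typically return HTTP 400 with the error list when it is non-empty, which keeps validation failures explicit and testable.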
Job Title: MLOps Engineer
Location: Hyderabad (Hybrid, Remote)
Job Type: Full-time
Experience Level: 3-4 Years

Detailed Job Description:
We are seeking an MLOps Engineer to operationalize and manage the lifecycle of machine learning models, especially those built using Azure and Databricks. You will be key in deploying, monitoring, and optimizing models across multiple business use cases.

Responsibilities:
- Research, design, implement, and deploy machine learning algorithms for enterprise-scale applications.
- Deploy and optimize the runtime of deep learning models on Azure Cloud.
- Expose model endpoints using an API gateway and monitor their performance continuously.
- Engage with clients to translate business needs into technical requirements and machine learning solutions.
- Contribute to the design and implementation of features for new and existing enterprise AI solution offerings.
- Provide ongoing support and monitoring for solutions running in production.
- Continuously research and stay abreast of the latest advancements in machine learning and AI to apply cutting-edge techniques in our solutions.

Qualifications:
- Education and Experience: Bachelor's or Master's degree in Computer Science, Statistics, or a related field. Proven experience in an MLOps role.
- Technical Skills: Strong proficiency in Python, DSA, and SQL. Familiarity with the ML model lifecycle, including key considerations and relevant tools (MLflow, Databricks ML, Kubeflow, etc.) for supporting and governing models at scale. Experience developing and deploying web APIs for model endpoints and API integration across multiple services. Strong expertise in connecting to databases such as Postgres, SQL Server, and Azure Databricks catalogs. Prior exposure to cloud computing services like Azure (especially its ML and AI toolkits) is advantageous. Understanding of model evaluation metrics and of data drift and model drift detection. Hands-on experience handling large datasets using Azure Databricks and Azure Cloud is a must.
- Problem-Solving: Ability to translate business problems into analytical frameworks and develop data-driven solutions.
- Communication Skills: Strong interpersonal and communication skills with the ability to explain complex concepts to non-technical stakeholders.
- Team Collaboration: Demonstrated ability to work effectively in a collaborative team environment.
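The data-drift detection mentioned above is often done with the Population Stability Index (PSI), which compares a feature's binned distribution in production against its training baseline. A minimal sketch over pre-binned counts; the common thresholds (~0.1 for moderate shift, ~0.25 for major shift) are rules of thumb, not fixed standards:

```python
import math

def psi(expected_counts: list[int], actual_counts: list[int], eps: float = 1e-6) -> float:
    """Population Stability Index between baseline and current binned counts."""
    exp_total = sum(expected_counts)
    act_total = sum(actual_counts)
    score = 0.0
    for e, a in zip(expected_counts, actual_counts):
        # Clamp proportions away from zero so the log term stays defined.
        e_pct = max(e / exp_total, eps)
        a_pct = max(a / act_total, eps)
        score += (a_pct - e_pct) * math.log(a_pct / e_pct)
    return score
```

Identical distributions give a PSI of 0; a jump past the chosen threshold is a typical trigger for retraining or investigation.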
Data Analyst
Must have worked on Supply Chain / Production / Factories related data

About the Role
We are looking for a Data Analyst to join our team! In this position, you will be in charge of developing and updating BI reports as well as communicating actionable insights to improve business decision-making. You should be able to manage data, evaluate its results, and display them strategically using visualization tools like Power BI, DAX queries, charts, and relationships. This developer position requires an extensive understanding of data modeling, databases, data warehousing, data integration, and the technical elements of business intelligence technologies, along with a strong technical understanding of business trends. Strong communication, organizational, and analytical abilities are required.

Key Responsibilities
- Work very closely with Supply Chain team leaders and develop automated reporting for all business requirements.
- Formulate automated reports and dashboards using Power BI and other reporting tools.
- Understand business requirements to set functional specifications for reporting applications.
- Be familiar with Power Query, Power BI, DAX, and other BI tools.
- Have very good working knowledge of MS Excel and PowerPoint.
- Be excellent with the MS SQL Server BI stack and MS Azure, with the ability to write complex SQL queries.
- Exhibit a foundational understanding of database concepts such as relational database architecture, multidimensional database design, and more.
- Design data models that transform raw data into insightful knowledge by understanding business requirements in the context of BI.
- Develop technical specifications from business needs and set deadlines for work completion.
- Make charts and data documentation that include descriptions of the techniques, parameters, models, and relationships.
- Establish row-level security on data and understand Power BI's application security layer models.
- Examine, comprehend, and study business needs as they relate to business intelligence.
- Create dynamic and eye-catching dashboards and reports using Power BI.
- Connect to multiple data sources and integrate, alter, and curate data for business intelligence.

Preferred Skills / Nice to Haves
- Extremely good communication skills to effectively explain requirements between internal teams and client teams.
- Exceptional analytical thinking skills for converting data into illuminating reports and dashboards.
- BS in Computer Science or Information Systems, along with work experience in a related field.
- Knowledge of data warehousing, data gateway, and data preparation projects.
- Ability to articulate, represent, and analyze solutions with the team while documenting, creating, and modeling them.
- Working knowledge of tools and technologies including Microsoft SQL Server, SQL databases and the SQL language, Azure Analysis Services, Power Query, MDX, DAX, and Power BI (Desktop and Service).
- Detailed knowledge and understanding of database management systems, OLAP, and the ETL (Extract, Transform, Load) framework.
- Comprehensive understanding of data modeling, administration, and visualization.
- Capacity to perform in an atmosphere where agility and continual improvement are prioritized.
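To ground the supply-chain reporting above, here is one illustrative sketch of the kind of KPI that ends up behind a Power BI card visual: an order fill rate. The record fields are hypothetical, and in practice this would be a DAX measure or SQL aggregate rather than Python:

```python
def fill_rate(orders: list[dict]) -> float:
    """Fraction of ordered units that were shipped, across all orders."""
    ordered = sum(o["qty_ordered"] for o in orders)
    # Cap shipped at ordered so over-shipments don't inflate the rate.
    shipped = sum(min(o["qty_shipped"], o["qty_ordered"]) for o in orders)
    return shipped / ordered if ordered else 0.0
```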
Company Overview: SyrenCloud Inc. is a leading Data Engineering company that specializes in solving complex challenges in the Supply Chain Management industry. We have a team of over 350 employees and robust revenue of $25M+. Our mission is to empower organizations with cutting-edge software engineering solutions that optimize operations, harness supply chain intelligence, and drive sustainable growth. We value both growth and employee well-being, striving to maintain a positive work environment while providing opportunities for professional development.

Job Summary
This role involves coordinating cross-functional teams, managing timelines and resources, and ensuring project goals align with the organization's objectives. The ideal candidate is an excellent communicator, highly organized, and skilled at stakeholder management.

Key Responsibilities
- Project Planning: Define project scope, objectives, and deliverables in collaboration with stakeholders, ensuring alignment with organizational goals.
- Resource Management: Allocate resources and coordinate team members to ensure efficient project execution.
- Timeline Management: Develop and maintain project schedules, track milestones, and ensure timely completion of deliverables.
- Stakeholder Communication: Act as the primary point of contact for stakeholders, providing regular updates on project progress, risks, and outcomes.
- Risk Management: Identify potential risks and develop mitigation strategies to keep projects on track.
- Team Coordination: Facilitate collaboration among cross-functional teams, ensuring clarity of roles and responsibilities.
- Documentation: Maintain comprehensive project documentation, including plans, reports, and status updates.
- Quality Assurance: Ensure deliverables meet quality standards and stakeholder expectations.
- Change Management: Manage scope changes and communicate impacts to stakeholders effectively.
- Post-Project Evaluation: Conduct project reviews to assess outcomes, gather feedback, and identify lessons learned for future projects.

Qualifications
- Education: Bachelor's degree in Business Administration, Management, Communications, or a related field. A Master's degree or PMP certification is a plus.
- Experience: 2-5 years of project management experience in non-technical fields such as marketing, operations, event planning, or business administration.
- Skills: Strong organizational and time-management skills. Excellent verbal and written communication skills. Proficiency in project management tools (e.g., Trello, Asana, Microsoft Project, or similar). Ability to manage multiple projects simultaneously in a fast-paced environment. Strong problem-solving and decision-making abilities. Experience with budget management and resource allocation. Familiarity with risk management and change management processes.
- Personal Attributes: Proactive, detail-oriented, and adaptable. Strong interpersonal skills with the ability to build relationships and influence stakeholders. Ability to work independently and as part of a team.

Preferred Qualifications
- Certification in project management (e.g., PMP, CAPM, PRINCE2, or Agile).
- Experience in [specific industry, e.g., marketing, consulting, event management, etc.].
- Familiarity with data analysis or reporting tools (e.g., Excel, Google Analytics) for tracking project metrics.
Company Overview
At Syren Cloud, we are at the forefront of cloud innovation, providing comprehensive cloud solutions that drive efficiency and growth. We are on the lookout for an experienced .NET Full Stack Developer with a passion for technology and a knack for problem-solving to join our team.

Role Summary
As a .NET Full Stack Developer at Syren Cloud, you will be instrumental in developing high-performance web applications. You will collaborate with cross-functional teams to engineer contemporary solutions that integrate seamlessly with our cloud services.

Responsibilities
- Architect, design, and build complex, highly scalable, high-performance enterprise applications
- Design, develop, test, release, and maintain software components (both frontend and backend)
- Functionally decompose complex problems into simple solutions
- Apply expert full-stack knowledge to feature creation and enhancement, performance, scalability, security, and engineering best practices
- Understand system interdependencies and limitations, and mitigate risks
- Collaborate with cross-functional teams to release features
- Act as SME for both frontend and backend systems
- Accelerate development velocity for all engineers and deliver continuous improvements to the team's processes
- Ensure software quality by implementing best practices within the team
- Lead, mentor, and guide a team of engineers to deliver software and meet clients' expectations

Requirements
- A bachelor's degree in computer science or equivalent
- 5+ years of hands-on experience programming in C# and .NET Core
- 2+ years of hands-on experience building SPA applications using React
- Must have experience developing RESTful services using .NET Core Web API
- Must have experience with an ORM such as Entity Framework or Dapper
- Deep understanding of ASP.NET Core, design patterns, and OOP concepts
- Good knowledge of and working experience building microservices applications on any cloud platform
- Expertise with JavaScript, HTML5, CSS3, and writing cross-browser code
- Strong working experience with MSSQL and T-SQL
- Ability to independently deliver complex development projects
- Excellent written and oral communication skills
- Experience in unit testing using Moq, Fakes, and TDD; API documentation with Swagger; and testing React using Jasmine
- Experience with source control systems: GitHub, Bitbucket, GitLab, etc.
- Good to have: experience with NoSQL databases such as MongoDB or Cosmos DB
- Good to have: experience with MSMQ, RabbitMQ, and Azure Service Bus
- Experience with Azure is a plus
Job Title: Databricks Data Engineer
Experience Level: 3 to 6 Years
Location: Ahmedabad
Employment Type: Full-Time
Certifications Required: Databricks Certified Data Engineer Associate, Databricks Certified Data Engineer Professional
Cloud Certifications (Preferred): Azure, AWS, GCP

Job Summary:
We are seeking a highly skilled and certified Databricks Data Engineer to join our dynamic data engineering team. The ideal candidate will have hands-on experience implementing Lakehouse architectures, upgrading to Unity Catalog, and building robust data ingestion pipelines. This role demands proficiency in Python, PySpark, SQL, and Scala, along with a strong understanding of big data technologies, streaming workflows, and multi-cloud environments.

Key Responsibilities:
- Lakehouse Implementation: Design and implement scalable Lakehouse architecture using Databricks. Optimize data storage and retrieval strategies for performance and cost-efficiency.
- Unity Catalog Upgrade: Lead and execute Unity Catalog upgrades across Databricks workspaces. Ensure secure and compliant data governance and access control.
- Data Ingestion & Migration: Develop and maintain data ingestion pipelines from various sources (structured, semi-structured, and unstructured). Perform large-scale data migrations across cloud platforms and environments.
- Pipeline Development: Build and manage ETL/ELT pipelines using PySpark and SQL. Ensure data quality, reliability, and performance across workflows.
- Big Data Streaming & Workflows: Implement real-time data streaming solutions using Spark Structured Streaming or similar technologies. Design workflow orchestration using tools like Airflow, Databricks Workflows, or equivalent.
- Multi-Cloud Expertise (Preferred): Work across Azure, AWS, and GCP environments. Understand cloud-native services and integration patterns for data engineering.
- Collaboration & Documentation: Collaborate with data scientists, analysts, and business stakeholders to understand data requirements. Document technical designs, data flows, and operational procedures.

Required Skills & Qualifications:
- 3-6 years of hands-on experience in data engineering roles.
- Strong expertise in the Databricks platform and ecosystem.
- Proficiency in Python, PySpark, SQL, and Scala.
- Experience with Lakehouse architecture and Unity Catalog.
- Proven track record of building scalable data ingestion pipelines and performing data migrations.
- Familiarity with big data technologies such as Delta Lake, Apache Spark, Kafka, etc.
- Understanding of data lake concepts and best practices.
- Experience with streaming data and workflow orchestration.
- Certified as Databricks Data Engineer Associate and Professional (mandatory).
- Cloud certifications in Azure, AWS, or GCP are a strong plus.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration abilities.

Nice to Have:
- Experience with CI/CD pipelines and DevOps practices in data engineering.
- Exposure to data cataloging tools and metadata management.
- Knowledge of data security, privacy, and compliance standards.

Why Join Us
- Work on cutting-edge data engineering projects in a fast-paced environment.
- Collaborate with a team of passionate professionals.
- Opportunity to grow and expand your skills across multiple cloud platforms.
- Competitive compensation and benefits.