We are seeking a highly skilled and experienced C++ Developer to join a dynamic team, focusing on developing high-performance applications on Linux platforms. You will be involved in all phases of the software development lifecycle, from design and implementation to testing and deployment, with a strong emphasis on multithreading and system-level programming. This role requires a proactive individual with excellent problem-solving abilities and a commitment to code quality.

Roles & Responsibilities:
- Design, develop, and maintain robust and efficient C++ applications on Linux.
- Implement and optimize multithreaded applications to maximize performance and scalability.
- Write clean, well-documented, and testable code.
- Participate in the full software development lifecycle (SDLC), including requirements gathering, design, coding, testing, and deployment.
- Debug and troubleshoot complex software issues, identifying and implementing effective solutions.
- Optimize application performance through techniques such as memory management and loop optimization.
- Collaborate with cross-functional teams, including software engineers, QA, and project managers.
- Contribute to architectural design and discussions.
- Stay up to date with the latest C++ standards and best practices.
- Utilize and integrate with source code management tools.
- Employ Agile/Scrum methodologies in the development process.
- Work with Docker and containerization technologies for application deployment.

Skills Required:
- Strong hands-on experience in C++ programming, with a deep understanding of memory management, file I/O, and stream concepts.
- Solid expertise in multithreading, including creating and managing threads and synchronization mechanisms such as mutexes and condition variables (see the sketch after this posting), plus kernel-level understanding.
- Proficiency in developing and troubleshooting applications on Linux, with a deep understanding of command-line tools, POSIX standards, processes, and networking.
- Strong understanding of software architecture principles and experience in building applications within a C++ environment.
- Familiarity with source code management tools (e.g., Git, ClearCase) and integrating them with IDEs.
- Experience with Agile and Scrum methodologies.
- Knowledge of developing web applications on the C++ platform is a plus.
- Proven experience in debugging, troubleshooting, and performance optimization techniques.
- Understanding of Docker and containerization technologies.
- Excellent written and verbal communication skills.
- Strong interpersonal skills, a positive and proactive attitude, and the ability to make sound judgments.

QUALIFICATION: Bachelor's degree in Computer Science or a related field, or equivalent practical experience.
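As a minimal illustration of the mutex and condition-variable pattern named above, here is a producer-consumer sketch written in Python's threading module for brevity; the same structure maps onto std::mutex and std::condition_variable in C++. All names and sizes are illustrative.

```python
import threading
from collections import deque

class BoundedQueue:
    """Producer-consumer queue guarded by one mutex and two condition variables."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = deque()
        self.lock = threading.Lock()                  # the mutex
        self.not_empty = threading.Condition(self.lock)
        self.not_full = threading.Condition(self.lock)

    def put(self, item):
        with self.not_full:                           # acquires the shared lock
            while len(self.items) >= self.capacity:
                self.not_full.wait()                  # releases the lock while blocked
            self.items.append(item)
            self.not_empty.notify()

    def get(self):
        with self.not_empty:
            while not self.items:
                self.not_empty.wait()
            item = self.items.popleft()
            self.not_full.notify()
            return item

q = BoundedQueue(capacity=2)
results = []

def producer():
    for i in range(5):
        q.put(i)

def consumer():
    for _ in range(5):
        results.append(q.get())

threads = [threading.Thread(target=producer), threading.Thread(target=consumer)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)  # [0, 1, 2, 3, 4]
```

The while-loop around each wait() guards against spurious wakeups, the same discipline a C++ condition_variable::wait predicate enforces.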
Key Responsibilities:

Quality Assurance & Testing
- Design, develop, and execute manual and automated test cases for Salesforce applications (Sales Cloud, Service Cloud, etc.).
- Perform functional, regression, smoke, and integration testing across Salesforce modules.
- Validate Salesforce components such as workflows, triggers, process builders, approval processes, and third-party integrations.
- Track and manage defects using JIRA or equivalent test management tools.
- Maintain test documentation, including test plans, test cases, traceability matrices, and coverage reports.

Provar Automation
- Build, execute, and maintain test automation scripts using Provar for Salesforce web UI and API testing.
- Integrate Provar with CI/CD tools such as Jenkins, GitHub Actions, or Azure DevOps for continuous testing.
- Drive test automation coverage and identify opportunities to enhance QA effectiveness through automation.

Salesforce Admin Support
- Support Salesforce configuration tasks, including custom objects, fields, profiles, layouts, and validation rules.
- Collaborate with Salesforce Admins and Developers to support testing and deployment activities.
- Assist in sandbox management, data loading using tools like Data Loader or Workbench, and creation of test data for QA environments.

Required Skills & Qualifications:
- 5-6+ years of hands-on experience in Salesforce QA (manual and automated testing).
- Mandatory: Strong experience with the Provar automation tool.
- In-depth knowledge of the Salesforce platform, including Sales Cloud, Service Cloud, and CPQ (preferred).
- Experience in Salesforce admin/configuration.
- Familiarity with testing REST/SOAP APIs via Postman or Provar (see the sketch after this posting).
- Proficiency in defect management tools like JIRA.
- Experience working in Agile/Scrum environments.
- Salesforce Administrator Certification (ADM-201) is a plus.

Preferred Traits:
- Strong attention to detail and commitment to quality and process improvement.
- Ability to work independently in a fast-paced, remote team environment.
- Effective communication and collaboration skills to interact with cross-functional stakeholders.
- Proactive approach to identifying gaps in testing coverage and proposing robust solutions.
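For the REST API testing requirement above, here is a minimal sketch of the kind of functional check a Provar or Postman step would automate, written in Python with requests. The instance URL, token, record id, and field expectations are hypothetical placeholders; the /services/data/vNN.0/sobjects path follows the standard Salesforce REST shape.

```python
import requests

# Hypothetical values -- a real run needs a valid instance URL and OAuth token.
INSTANCE = "https://example.my.salesforce.com"
TOKEN = "00D-REDACTED"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def test_account_record_has_required_fields():
    """Functional API check: fetch an Account and assert the fields QA cares about."""
    record_id = "001000000000001"  # placeholder record id
    resp = requests.get(
        f"{INSTANCE}/services/data/v58.0/sobjects/Account/{record_id}",
        headers=HEADERS,
        timeout=10,
    )
    assert resp.status_code == 200
    body = resp.json()
    # Assertions mirroring what a validation rule or Provar test step would verify;
    # the allowed Industry values are illustrative only.
    assert body.get("Name"), "Account.Name must not be blank"
    assert body.get("Industry") in {"Technology", "Finance", "Healthcare", None}
```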
Required Skills & Experience:
- Proven hands-on experience as an AWS Engineer or in a similar role.
- Deep expertise in AWS services: Lambda, API Gateway, S3, DynamoDB, Step Functions, SQS, AppSync, CloudWatch Logs, X-Ray, EventBridge, Amazon Pinpoint, Cognito, KMS.
- Proficiency in Infrastructure as Code (IaC) with AWS CDK; experience with CodePipeline is a significant plus.
- Extensive experience with serverless architecture and event-driven design (see the sketch after this posting).
- Strong understanding of cloud monitoring and observability tools: CloudWatch Logs, X-Ray, custom metrics.
- Proven ability to implement and enforce security and compliance measures, including IAM role boundaries, PHI/PII tagging, Cognito, KMS, HIPAA standards, isolation patterns, and access control.
- Demonstrated experience with cost optimization techniques (S3 lifecycle policies, serverless tiers, service selection).
- Expertise in designing and implementing scalability and resilience patterns (auto-scaling, DLQs, retry/backoff, circuit breakers).
- Familiarity with CI/CD pipeline concepts.
- Excellent documentation and workflow design skills.
- Exceptional cross-functional collaboration abilities.
- Commitment to implementing AWS best practices.
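To give a flavor of the serverless, event-driven work described above, here is a minimal AWS Lambda handler sketch: it drains an SQS batch into DynamoDB with boto3, relying on SQS redrive plus a dead-letter queue for the retry/DLQ pattern the posting names. The table name and item schema are assumptions for illustration, not a prescribed design.

```python
import json
import os
import boto3

# Hypothetical table name; in a CDK stack this would be injected via environment.
TABLE_NAME = os.environ.get("EVENTS_TABLE", "events")
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(TABLE_NAME)

def handler(event, context):
    """Event-driven Lambda: consume SQS records and persist them to DynamoDB.

    Letting an exception propagate makes SQS redeliver the message; after
    maxReceiveCount is exceeded, the message lands in the dead-letter queue.
    """
    records = event.get("Records", [])
    for record in records:
        payload = json.loads(record["body"])
        # Note: real payloads with floats would need Decimal conversion,
        # since DynamoDB does not accept Python floats.
        table.put_item(Item={
            "pk": payload["id"],   # partition key (assumed schema)
            "detail": payload,
        })
    return {"statusCode": 200, "processed": len(records)}
```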
Primary Responsibilities:

SLT Implementation & Management:
- Lead the setup, configuration, and management of SLT for real-time or batch data replication between SAP and non-SAP systems.
- Ensure seamless ETL (Extract, Transform, Load) workflows with minimal disruption to source or target systems.

Non-SAP to SAP Migration:
- Oversee data migration from non-SAP platforms to SAP using SLT, ensuring data integrity, consistency, and quality.
- Plan and execute cutovers and validations during migration activities.

Central Finance Support:
- Configure, monitor, and optimize SLT-based replication for Central Finance (CFIN) projects.
- Support financial data synchronization across systems by aligning SLT settings with the Central Finance architecture.

SAP Basis and Data Integration:
- Collaborate with the SAP Basis team to maintain system health, configuration, and SLT performance tuning.
- Integrate and manage Datahub/Magnitude solutions to streamline cross-system data orchestration.

Performance & Troubleshooting:
- Monitor SLT replication processes and resolve issues related to performance, job failures, and transformation errors.
- Implement performance improvements and ensure high availability of replication services.

Documentation & Best Practices:
- Maintain comprehensive technical documentation for SLT configurations, troubleshooting guides, and migration procedures.
- Follow and enforce SAP best practices for secure and efficient system integrations.

Required Qualifications & Skills:
- 7+ years of total SAP experience, including:
  - 3+ years in SAP Landscape Transformation (SLT).
  - 3+ years in SAP Basis administration and performance support.
- Expertise in SLT configuration, data replication setup, and real-time monitoring.
- Experience in non-SAP to SAP data migrations and Central Finance integrations.
- Strong knowledge of SAP CFIN architecture and data mapping scenarios.
- Hands-on experience with Datahub or Magnitude tools for large-scale data movement.
- Proficiency in system health checks, transport management, and SLT-related troubleshooting.
- Excellent communication, documentation, and cross-functional collaboration skills.

Preferred Qualifications:
- SAP certification in Basis, SLT, or Central Finance is an added advantage.
- Experience working in hybrid cloud/on-premise SAP environments.
- Exposure to SAP S/4HANA migration projects.
Responsibilities:
- Design, build, and maintain scalable and efficient data pipelines and ETL processes.
- Develop and optimize ClickHouse databases for high-performance analytics.
- Create RESTful APIs using FastAPI to expose data services (see the sketch after this posting).
- Work with Kubernetes for container orchestration and deployment of data services.
- Write complex SQL queries to extract, transform, and analyze data from PostgreSQL and ClickHouse.
- Collaborate with data scientists, analysts, and backend teams to support data needs and ensure data quality.
- Monitor, troubleshoot, and improve the performance of data infrastructure.

Skills Required:
- Strong experience in ClickHouse: data modeling, query optimization, performance tuning.
- Expertise in SQL, including complex joins, window functions, and optimization.
- Proficient in Python, especially for data processing (Pandas, NumPy) and scripting.
- Experience with FastAPI for creating lightweight APIs and microservices.
- Hands-on experience with PostgreSQL: schema design, indexing, and performance.
- Solid knowledge of Kubernetes: managing containers, deployments, and scaling.
- Understanding of software engineering best practices (CI/CD, version control, testing).
- Experience with cloud platforms like AWS, GCP, or Azure.
- Knowledge of data warehousing and distributed data systems.
- Familiarity with Docker, Helm, and monitoring tools like Prometheus/Grafana.
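Here is a minimal sketch of the FastAPI-over-ClickHouse pattern from the responsibilities above, using the clickhouse-driver client. The events table, its columns, and the endpoint shape are hypothetical.

```python
from fastapi import FastAPI, HTTPException
from clickhouse_driver import Client  # pip install clickhouse-driver

app = FastAPI()
ch = Client(host="localhost")  # assumes a reachable ClickHouse instance

@app.get("/daily-events/{event_type}")
def daily_events(event_type: str, days: int = 7):
    """Expose a daily aggregate over a hypothetical `events` table as JSON."""
    if not 1 <= days <= 90:
        raise HTTPException(status_code=400, detail="days must be between 1 and 90")
    rows = ch.execute(
        """
        SELECT toDate(ts) AS day, count() AS n
        FROM events
        WHERE event_type = %(etype)s
          AND ts >= now() - INTERVAL %(days)s DAY
        GROUP BY day
        ORDER BY day
        """,
        {"etype": event_type, "days": days},
    )
    return [{"day": str(day), "count": n} for day, n in rows]
```

Run with `uvicorn app:app`; parameter binding via %(name)s keeps the endpoint safe from SQL injection, which matters once such services are exposed beyond the data team.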
We are seeking a proactive Power BI Developer with strong expertise in SQL and Figma. You will be responsible for designing and developing comprehensive Power BI reports and dashboards, collaborating with stakeholders to understand reporting needs, and utilizing SQL for data extraction and manipulation. This role requires excellent analytical skills and a commitment to quality to derive actionable insights and enhance user experience.

Roles & Responsibilities:
- Develop and design comprehensive Power BI reports and dashboards.
- Collaborate with stakeholders to understand reporting needs and translate them into functional requirements.
- Create visually appealing interfaces using Figma for an enhanced user experience.
- Utilize SQL for data extraction and manipulation to support reporting requirements.
- Implement DAX measures to ensure accurate data calculations.
- Conduct data analysis to derive actionable insights and facilitate decision-making.
- Perform user acceptance testing (UAT) to validate report performance and functionality.
- Provide training and support for end users on dashboards and reporting tools.
- Monitor and enhance the performance of existing reports on an ongoing basis.
- Work closely with cross-functional teams to align project objectives with business goals.
- Maintain comprehensive documentation for all reporting activities and processes.
- Stay updated on industry trends and best practices related to data visualization and analytics.
- Ensure compliance with data governance and security standards.

Skills Required:
- Strong proficiency in SQL and database management.
- Extensive knowledge of data visualization best practices.
- Expertise in DAX for creating advanced calculations.
- Proven experience in designing user interfaces with Figma.
- Excellent analytical and problem-solving skills.
- Ability to communicate complex data insights to non-technical stakeholders.
- Strong attention to detail and commitment to quality.
- Experience with business analytics and reporting tools.
- Familiarity with data governance and compliance regulations.
- Ability to work independently and as part of a team.
- Strong time management skills and ability to prioritize tasks.
- Ability to adapt to fast-paced working environments.
- Strong interpersonal skills and stakeholder engagement capability.
- Relevant certifications in Power BI or data analytics are a plus.

QUALIFICATION: Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent practical experience.
Key Responsibilities
- Administer, troubleshoot, and maintain Microsoft Exchange environments (2010, 2013, 2016, 2019).
- Manage Office 365 environments, including mailbox migrations, policy configurations, and compliance enforcement.
- Implement and maintain Cisco IronPort email security solutions.
- Oversee hybrid mail flow configurations, connectors, and coexistence between on-premises Exchange and Exchange Online.
- Resolve technical issues related to OWA, ECP, Autodiscover, Database Availability Groups (DAG), SSL certificates, and IIS.
- Manage user and group accounts, including distribution lists, public folders, and shared mailboxes.
- Support Azure AD Connect, ADFS, and Single Sign-On (SSO) integrations.
- Develop and execute PowerShell scripts to automate Exchange and Office 365 tasks.
- Maintain email retention policies, eDiscovery workflows, and compliance configurations.
- Provide Level 2 and Level 3 support for escalated messaging incidents.

Technical Skills Required
- Expertise in Microsoft Exchange Server (2010/2013/2016/2019).
- Office 365 and Exchange Online administration.
- Experience with Cisco IronPort Email Security.
- Proficiency with Azure Active Directory and Azure AD Connect.
- Skilled in hybrid mail flow configuration and troubleshooting.
- PowerShell scripting for Exchange and Office 365 automation.
- Strong understanding of SMTP, MAPI, POP3, IMAP, and Outlook Anywhere protocols.
- SSL certificate and IIS management and troubleshooting.
- Familiarity with Windows Server 2012 and 2016 environments.

Nice to Have
- Microsoft 365 certifications such as MS-101, MS-200, or 70-347.
- Experience with email gateways and Menlo Security.
- Knowledge of retention labels and policies in Office 365.
- Experience with Quest migration tools.

Behavioral Traits
- Excellent customer communication and incident management skills.
- Ability to work independently and handle critical incident resolution.
- Strong attention to detail and a proactive approach to troubleshooting.
Key Responsibilities:
- Handle product sales and client interactions.
- Assist in payment recovery and follow-ups.
- Understand customer requirements and recommend appropriate product solutions.
- Meet monthly sales targets and contribute to business growth.
- Build and maintain strong customer relationships.

Requirements:
- ITI or Diploma in Mechanical Engineering.
- 1-4 years of experience in Sales or Recovery.
- Strong communication and negotiation skills.
- Basic understanding of refrigeration and mechanical systems is an advantage.
- Self-motivated, organized, and able to work independently.
We are seeking a highly skilled Power BI Consultant to collaborate with stakeholders and translate reporting requirements into interactive visualizations. You will be responsible for designing and developing Power BI reports and dashboards, creating wireframes and prototypes using Figma, and conducting data analysis to provide actionable insights. This role requires strong proficiency in SQL and a solid understanding of data modeling and visualization best practices.

Roles & Responsibilities:
- Collaborate with stakeholders to understand reporting requirements and translate them into interactive visualizations.
- Design and develop Power BI reports and dashboards that provide actionable insights to the business.
- Create detailed wireframes and prototypes using Figma to effectively communicate design ideas.
- Implement best practices for data visualization and ensure reports are intuitive and user-friendly.
- Develop and maintain data models for Power BI to support analytical processes.
- Conduct data analysis to identify trends and patterns that drive business decisions.
- Provide training and support to end users regarding dashboard functionality.
- Work with cross-functional teams to gather requirements and feedback for continuous improvement.
- Test and validate data accuracy and integrity across all reports and dashboards (see the reconciliation sketch after this posting).
- Implement data governance best practices to ensure compliance and security.
- Assist in project management activities to ensure the timely delivery of projects.
- Create documentation for report development processes and user guides.
- Support ad-hoc reporting requests as needed by stakeholders.
- Contribute to a positive team environment by mentoring junior staff and sharing knowledge.

Skills Required:
- Strong knowledge of SQL and data querying languages.
- Proficiency in Figma for UI/UX design.
- Strong understanding of wireframing principles and design thinking.
- Hands-on experience with data analysis and data modeling.
- Excellent problem-solving abilities with a keen eye for detail.
- Strong communication skills and the ability to engage with stakeholders.
- Experience working within an Agile project environment.
- Familiarity with DAX and Power Query.
- Experience with data governance practices.
- Ability to provide effective user training and support.
- Solid understanding of business intelligence tools and methodologies.
- Self-motivated and able to work independently.

QUALIFICATION: Bachelor's degree in Computer Science, Data Science, or a related field.
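One way to approach the "test and validate data accuracy" responsibility above is a small pandas reconciliation comparing source-system totals against figures extracted from a report. The data frames and tolerance here are illustrative placeholders.

```python
import pandas as pd

# Hypothetical extracts: source-system totals vs. figures shown on a dashboard.
source = pd.DataFrame({"region": ["East", "West"], "revenue": [120_000.0, 95_500.0]})
report = pd.DataFrame({"region": ["East", "West"], "revenue": [120_000.0, 95_450.0]})

# Validate report accuracy against the source of truth, allowing a small
# tolerance for rounding introduced by the visualization layer.
merged = source.merge(report, on="region", suffixes=("_src", "_rpt"))
merged["diff"] = (merged["revenue_src"] - merged["revenue_rpt"]).abs()
mismatches = merged[merged["diff"] > 1.0]

if mismatches.empty:
    print("Report reconciles with source.")
else:
    print("Mismatched regions:\n", mismatches[["region", "diff"]])
```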
Role Overview: A seasoned Flexera Tooling Implementation Specialist responsible for the end-to-end implementation, configuration, and management of the Flexera Software Asset Management (SAM) tool.

Key Responsibilities:
- Configure and install the Flexera SAM tool.
- Ensure high availability and optimal performance of the Flexera environment.
- Upload, manage, and maintain accurate data within the platform.
- Perform regular system maintenance, including backups, patches, and upgrades.
- Administer user roles and access.
- Continuously monitor system health and proactively resolve technical issues.
- Manage infrastructure for the SAM solution.
- Escalate and manage vendor support tickets.
- Maintain all data feeds and system integrations.
- Perform scheduled and ad-hoc data uploads.
- Upgrade the Flexera platform and its modules.

Required Qualifications:
- 10+ years of experience in IT tools implementation, with a focus on SAM tools.
- Proven hands-on experience with on-premise and cloud Flexera SAM.
- Deep understanding of Flexera's modules, data connectors, and architecture.
- Technical expertise in Flexera features, limitations, and best practices.
- Skilled in designing frameworks to improve tool maturity.
- Ability to create tailored implementation plans.
- Strong analytical and problem-solving skills.
- Experience working with cross-functional teams.

Nice to Have:
- Flexera certifications or formal training.
- Experience integrating Flexera SAM with ITSM or CMDB platforms (e.g., ServiceNow).
- Familiarity with license compliance, software metering, and audit support.
Responsibilities:
- Review and Analyze: Conduct thorough reviews and in-depth analysis of existing SAS data extract programs to assess their accuracy, efficiency, and completeness.
- Data Remediation: Proactively identify, retrieve, and recode broken or non-performing data segments within SAS programs to restore functionality and data integrity.
- Collaboration & Validation: Work closely with data analysts, business teams, and Quality Assurance (QA) to validate program outputs and ensure all business rules are precisely implemented and met.
- Documentation: Maintain meticulous documentation of all changes made to SAS programs, including logic, methodologies, and recommendations for future reference and audit purposes.
- Optimization: Provide expert suggestions and implement improvements for optimizing existing SAS processes to enhance performance and efficiency.

Required Skills & Qualifications:
- Primary Skill: Extensive hands-on experience as a SAS Developer.
- Additional Skills: Proficiency in C# development. Strong command of SQL for data manipulation and querying. Proven ability to retrieve and recode broken data effectively.
- Communication: Excellent written and verbal communication skills are essential for collaborating with cross-functional teams and documenting complex processes.
This is a highly specialized, hands-on role focused on building and deploying generative AI solutions. It requires someone who can not only work with AI models but also integrate them seamlessly and securely into existing enterprise software.

Generative AI & LLM Integration: The core responsibility is integrating Gemini models into corporate platforms like Slack and Confluence. This involves hands-on development, prompt engineering, and the deployment of large language models (LLMs) in a production environment.

AI Orchestration & MLOps: A key part of the job is building the infrastructure that makes the AI work. This includes managing orchestration logic, setting up embedding pipelines, and ensuring all components, from the prompt to the data retrieval, work together smoothly.

Vector Databases & Data Engineering: You must be proficient with vector databases (like Pinecone or Weaviate) and understand the process of creating embeddings from structured and unstructured data. This is crucial for enabling the AI to retrieve relevant information from a company's internal documentation (see the retrieval sketch after this posting).

API & System Integration: The role requires strong technical skills to connect various platforms. You'll need to set up API authentication and role-based access controls to ensure the AI assistants can securely access data from systems like Looker and Confluence.

Agile Development: You will be working in a sprint-based Agile environment, so familiarity with concepts like daily standups, sprint demos, and user acceptance testing is essential for managing projects and meeting deadlines.
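To make the embedding-and-retrieval workflow concrete, here is a self-contained sketch of the pattern a Pinecone or Weaviate deployment implements at scale. The embed function is a hash-based stand-in so the example runs offline; a production pipeline would call a real embedding model (for instance, a Gemini embedding endpoint), so retrieval quality is not the point here, only the mechanics.

```python
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Stand-in embedder: deterministic pseudo-random unit vector per text."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.standard_normal(dim)
    return v / np.linalg.norm(v)

class InMemoryVectorStore:
    """Toy stand-in for Pinecone/Weaviate: upsert plus cosine top-k query."""
    def __init__(self):
        self.ids, self.vecs = [], []

    def upsert(self, doc_id: str, text: str):
        self.ids.append(doc_id)
        self.vecs.append(embed(text))

    def query(self, text: str, k: int = 3):
        # Dot product equals cosine similarity because vectors are unit-norm.
        scores = np.array(self.vecs) @ embed(text)
        top = np.argsort(scores)[::-1][:k]
        return [(self.ids[i], float(scores[i])) for i in top]

store = InMemoryVectorStore()
store.upsert("confluence:runbook-42", "How to rotate the API signing keys")
store.upsert("confluence:faq-7", "Holiday calendar and PTO policy")
print(store.query("key rotation procedure", k=1))
```

The orchestration layer the posting describes sits around exactly this loop: chunk internal documents, embed them, upsert, then at query time embed the user prompt, retrieve top-k chunks, and feed them to the LLM.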
We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team, with a strong emphasis on leveraging Dataiku's capabilities. You will be instrumental in designing, developing, and optimizing robust data pipelines, ensuring seamless integration of diverse data sources, and maintaining high data quality. This role requires a unique blend of expertise in data engineering, advanced data modeling, and a forward-thinking approach to integrating cutting-edge AI technologies, particularly for Generative AI applications.

Roles & Responsibilities:
- Drive data engineering initiatives with a strong emphasis on leveraging Dataiku for data preparation, analysis, visualization, and the deployment of data solutions (see the recipe sketch after this posting).
- Design, develop, and optimize robust and scalable data pipelines to support business intelligence and advanced analytics projects.
- Apply expertise in data modeling techniques to design efficient and scalable database structures, ensuring data integrity and optimal performance.
- Implement and manage ETL/ELT processes and tools to ensure efficient and reliable data flow, maintaining high data quality and accessibility.
- Explore and implement solutions leveraging LLM Mesh for Generative AI applications.
- Utilize programming languages such as Python and SQL for data manipulation, analysis, and automation.
- Deploy and manage scalable data solutions on cloud platforms such as AWS or Azure.
- Ensure high data quality, consistency, and accessibility across all data assets, and implement data governance best practices.
- Collaborate closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver impactful solutions.
- Continuously monitor and optimize the performance of data pipelines and data systems.

Skills Required:
- Demonstrable expertise in Dataiku for data preparation, analysis, visualization, and building end-to-end data pipelines.
- Strong understanding and practical experience in various data modeling techniques (e.g., dimensional modeling, Kimball, Inmon).
- Extensive experience with ETL/ELT processes and tools (e.g., Dataiku's built-in capabilities, Apache Airflow, Talend, SSIS).
- Familiarity with LLM Mesh or similar frameworks for Generative AI applications.
- Strong proficiency in Python and SQL for data manipulation, scripting, and complex querying.
- Knowledge and hands-on experience with at least one major cloud platform (AWS or Azure) for deploying and managing scalable data solutions.
- Basic understanding of Generative AI concepts and their potential applications in data engineering.
- Experience with other big data technologies (Spark, Hadoop, Snowflake) is a plus.
- Familiarity with data governance, data security best practices, and MLOps principles and tools is a plus.
- Excellent analytical, problem-solving, and communication skills.

QUALIFICATION: Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related quantitative field.
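As a minimal sketch of the Dataiku work described above, here is a Python recipe using the dataiku package's Dataset API (available inside DSS, where such recipes run). Dataset and column names are hypothetical.

```python
import dataiku
import pandas as pd

# Read the input dataset of this recipe; "orders_raw" is a placeholder name.
orders = dataiku.Dataset("orders_raw").get_dataframe()

# Typical prepare step: type coercion, deduplication, and a derived column.
orders["order_ts"] = pd.to_datetime(orders["order_ts"], errors="coerce")
orders = orders.dropna(subset=["order_ts"]).drop_duplicates(subset=["order_id"])
orders["order_month"] = orders["order_ts"].dt.to_period("M").astype(str)

# Write the recipe output; write_with_schema also (re)infers the output schema.
out = dataiku.Dataset("orders_prepared")
out.write_with_schema(orders)
```

In a real flow the same logic could often live in a visual Prepare recipe instead; a Python recipe is the escape hatch when the transformation outgrows the built-in processors.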
We are seeking a skilled and proactive Qlik Sense Developer with experience in designing and developing interactive dashboards and data visualizations. You will be responsible for the full lifecycle of Qlik Sense application development, from gathering requirements and collaborating with stakeholders to creating and maintaining reports and data models. This role requires a strong understanding of Qlik scripting, data modeling, and data visualization best practices to deliver scalable BI solutions.

Roles & Responsibilities:
- Design, develop, and maintain Qlik Sense dashboards, reports, and data models.
- Create interactive visualizations to effectively communicate data insights.
- Develop and implement data extraction, transformation, and loading (ETL) processes using Qlik scripting.
- Optimize data models and dashboards for performance, scalability, and usability.
- Work closely with business stakeholders to gather and document business requirements.
- Translate business requirements into technical specifications and data models.
- Collaborate with cross-functional teams, including data engineers, database administrators, and business analysts.
- Integrate Qlik Sense with various data sources, including relational databases, flat files, cloud platforms, and APIs.
- Ensure data accuracy, integrity, consistency, and security.
- Provide ongoing support, troubleshooting, and maintenance for existing Qlik Sense solutions.
- Create and maintain technical documentation, including data flow diagrams and user guides.
- Adhere to Qlik Sense development best practices and coding standards.
- Stay up to date with the latest Qlik Sense features and industry trends.

Skills Required:
- Strong knowledge of Qlik scripting, data modeling techniques (star schema, snowflake schema), and data visualization best practices.
- Proficiency in SQL and relational database concepts.
- Experience connecting Qlik Sense to various data sources (e.g., SQL databases, Excel, CSV, APIs).
- Strong analytical and problem-solving skills with the ability to translate complex business requirements into technical solutions.
- Excellent communication and interpersonal skills.
- Experience with Qlik NPrinting for report distribution and automation is highly desirable.
- Knowledge of QlikView is a plus.
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and cloud-based data warehousing solutions is a plus.
- Familiarity with Agile development methodologies.
- Basic understanding of data warehousing concepts, ETL processes, and dimensional modeling.
- Qlik Sense certifications are a plus.

QUALIFICATION: Bachelor's degree in Computer Science, Information Systems, or a related field.
Job Summary
As a Power BI Developer, you will leverage your expertise in data visualization, DAX/SQL, and UI/UX design principles using Figma to build intuitive and insightful dashboards. You will work closely with business stakeholders to gather requirements, prototype user interfaces, design scalable data models, and deliver actionable reporting solutions. Your role will be critical in transforming complex data into user-friendly, performance-optimized, and secure visualizations within the Power BI ecosystem.

Key Responsibilities
- Dashboard Design and Development: Design, develop, and deploy interactive dashboards and visual reports using Power BI Desktop and Power BI Service.
- UI/UX Prototyping with Figma: Collaborate with users to translate reporting needs into wireframes, mockups, and prototypes using Figma. Convert Figma designs into production-ready Power BI dashboards that adhere to modern design and UX standards.
- DAX Development: Develop and optimize complex DAX calculations, including measures, calculated columns, and time intelligence functions.
- SQL Querying and Optimization: Write and optimize complex SQL queries using joins, window functions, CTEs, and stored procedures across various databases (see the sketch after this posting).
- Data Modeling: Design efficient and scalable data models using dimensional modeling concepts (star and snowflake schemas) and best practices.
- Security Implementation: Implement row-level security (RLS) and other access controls to ensure data protection within Power BI solutions.
- Performance Tuning: Optimize Power BI reports and data models for speed, responsiveness, and usability.
- Data Source Integration: Connect Power BI to diverse data sources, including SQL Server, Azure Synapse, APIs, and cloud databases.
- Stakeholder Communication: Present reports and insights to business users clearly and effectively, bridging technical and business understanding.
- Requirements Gathering: Lead and participate in requirement-gathering sessions through interviews, workshops, and documentation analysis.
- Agile Collaboration: Contribute to Agile/Scrum teams by participating in sprint planning, daily standups, retrospectives, and timely task delivery.
- Documentation: Maintain comprehensive technical documentation for dashboards, data models, and processes.
- Continuous Improvement: Stay updated with new Power BI and Figma features, and implement improvements to enhance dashboard functionality and aesthetics.

Required Skills
- Minimum of 7 years of experience in Business Intelligence and Power BI development.
- Strong hands-on experience with Power BI Desktop and Service.
- Proficiency in DAX with advanced calculation and time intelligence functions.
- Proficient in Figma for creating wireframes, mockups, and design-to-dashboard translation.
- Advanced SQL skills with deep experience in joins, CTEs, window functions, and stored procedures.
- Solid understanding of data warehousing principles and dimensional modeling (star/snowflake schemas).
- Experience integrating Power BI with cloud-based platforms (e.g., Azure Synapse, SQL Server on Azure) is preferred.
- Skilled in business requirements elicitation and solution design.
- Excellent written and verbal communication skills.
- Strong problem-solving capabilities and analytical thinking.
- Proven experience working in Agile/Scrum environments.
- Ability to work collaboratively in cross-functional teams.

Education
Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field.
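To illustrate the CTE and window-function SQL skills listed above, here is a small runnable example. It uses Python's built-in sqlite3 so the sketch is self-contained; a production target would be SQL Server or Azure Synapse, and the month-over-month delta it computes mirrors what a DAX time-intelligence measure would produce.

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # SQLite >= 3.25 supports window functions
conn.executescript("""
CREATE TABLE sales (region TEXT, month TEXT, amount REAL);
INSERT INTO sales VALUES
 ('East','2024-01',100),('East','2024-02',120),('East','2024-03',90),
 ('West','2024-01',80),('West','2024-02',95),('West','2024-03',110);
""")

# CTE + LAG window function: month-over-month delta per region.
query = """
WITH monthly AS (
  SELECT region, month, SUM(amount) AS total
  FROM sales
  GROUP BY region, month
)
SELECT region, month, total,
       total - LAG(total) OVER (PARTITION BY region ORDER BY month) AS mom_delta
FROM monthly
ORDER BY region, month;
"""
for row in conn.execute(query):
    print(row)
```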
Responsibilities:
- Develop and maintain data pipelines using GCP.
- Write and optimize queries in BigQuery (see the sketch after this posting).
- Utilize Python for data processing tasks.
- Manage and maintain SQL Server databases.

Must-Have Skills:
- Experience with Google Cloud Platform (GCP).
- Proficiency in BigQuery query writing.
- Strong Python programming skills.
- Expertise in SQL Server.

Good to Have:
- Knowledge of MLOps practices.
- Experience with Vertex AI.
- Background in data science.
- Familiarity with any data visualization tool.
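Here is a minimal sketch of querying BigQuery from Python with the official google-cloud-bigquery client, as the responsibilities above describe. Project, dataset, and table names are placeholders, and credentials are assumed to come from the environment (GOOGLE_APPLICATION_CREDENTIALS or workload identity).

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

# Placeholder project id; authentication is picked up from the environment.
client = bigquery.Client(project="my-analytics-project")

query = """
SELECT event_date, COUNT(*) AS events
FROM `my-analytics-project.app_logs.events`
WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
GROUP BY event_date
ORDER BY event_date
"""

# .result() blocks until the query job completes; to_dataframe() needs pandas.
df = client.query(query).result().to_dataframe()
print(df.head())
```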
Key Responsibilities
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Build and maintain efficient data pipelines using DBT and GCP services like BigQuery, Cloud Storage, and Dataflow (see the sketch after this posting).
- Reverse engineer legacy systems (e.g., Sailfish, DDMS) using tools like DBeaver to analyze and recreate data models.
- Write and optimize SQL queries and stored procedures for data transformation, validation, and performance.
- Collaborate with business analysts and stakeholders to translate requirements into logical and physical data models.
- Optimize Snowflake data warehouse performance, enforce security, and ensure data quality.
- Document metadata, data lineage, and business rules associated with data models and pipelines.
- Participate in code reviews and help define data modeling and governance best practices.

Must-Have Skills
- Experience with Snowflake architecture, schema design, and performance tuning.
- Hands-on expertise with DBT for data transformation and modeling.
- Advanced SQL skills, including query optimization, window functions, and complex joins.
- Proficiency with Erwin Data Modeler (logical and physical modeling).
- Experience working with GCP tools like BigQuery, Cloud Composer, and Cloud Storage.
- Ability to reverse engineer data systems using tools such as DBeaver (e.g., for Sailfish, DDMS).

Good To Have
- Familiarity with CI/CD tools and DevOps practices for data environments.
- Understanding of data governance, data security, and compliance principles.
- Exposure to Agile methodologies and collaboration in distributed teams.
- Knowledge of Python for data-related scripting and automation.

Soft Skills
- Strong analytical and problem-solving abilities.
- Effective communication and stakeholder management.
- Self-motivated and capable of working independently in a remote environment.
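For concreteness, here is a sketch of running a dedupe-style transformation against Snowflake from Python using snowflake-connector-python; this is the shape of logic a dbt model would normally wrap as a SQL file. All connection parameters and table names are placeholders.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Placeholder credentials; real values belong in a secrets store, not code.
conn = snowflake.connector.connect(
    account="xy12345.us-central1.gcp",
    user="SVC_MODELER",
    password="REDACTED",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="MARTS",
)

# Keep only the latest record per business key -- a common staging-model step.
sql = """
SELECT *
FROM raw.orders
QUALIFY ROW_NUMBER() OVER (
    PARTITION BY order_id ORDER BY loaded_at DESC
) = 1
"""
cur = conn.cursor()
try:
    for row in cur.execute(sql).fetchmany(5):
        print(row)
finally:
    cur.close()
    conn.close()
```

Snowflake's QUALIFY clause filters on the window function directly, avoiding the wrapper subquery that the same dedupe would need in BigQuery standard SQL.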
We are looking for a skilled professional with expertise in developing and implementing e-commerce solutions using Adobe Commerce Cloud. The ideal candidate will be adept at building scalable and efficient applications, collaborating with cross-functional teams, and maintaining high-quality code.

Roles and Responsibilities
- Design, develop, and deploy scalable and efficient e-commerce applications using Adobe Commerce Cloud.
- Collaborate with cross-functional teams to identify business requirements and develop technical solutions.
- Develop and maintain high-quality code that meets industry standards and best practices.
- Troubleshoot and resolve technical issues related to e-commerce applications.
- Implement security measures to protect sensitive data and ensure compliance with industry regulations.
- Stay updated with the latest trends and technologies in e-commerce development.

Skills Required
- Proficiency in programming languages such as Java, PHP, or Python.
- Experience with e-commerce platforms like Magento, Shopify, or WooCommerce.
- Strong understanding of database management systems like MySQL or MongoDB.
- Familiarity with front-end development frameworks like React or Angular.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a team environment and communicate effectively with stakeholders.