
Cymetrix

7 Job openings at Cymetrix
GCP Data Modeller Chennai, Tamil Nadu, India 5 years Not disclosed On-site Full Time

Bangalore / Chennai

● Hands-on data modelling for OLTP and OLAP systems.
● In-depth knowledge of conceptual, logical, and physical data modelling.
● Strong understanding of indexing, partitioning, and data sharding, with practical experience applying them.
● Strong understanding of the variables impacting database performance for near-real-time reporting and application interaction.
● Working experience with at least one data modelling tool, preferably DBSchema or Erwin.
● Good understanding of GCP databases like AlloyDB, CloudSQL, and BigQuery.
● Functional knowledge of the mutual fund industry is a plus.

Role & Responsibilities
● Work with business users and other stakeholders to understand business processes.
● Design and implement dimensional and fact tables.
● Identify and implement data transformation/cleansing requirements.
● Develop highly scalable, reliable, high-performance data processing pipelines to extract, transform, and load data from various systems into the Enterprise Data Warehouse.
● Develop conceptual, logical, and physical data models with associated metadata, including data lineage and technical data definitions.
● Design, develop, and maintain ETL workflows and mappings using the appropriate data load technique.
● Provide research, high-level design, and estimates for data transformation and data integration from source applications to end-user BI solutions.
● Provide production support for ETL processes to ensure timely completion and availability of data in the data warehouse for reporting use.
● Analyze and resolve problems and provide technical assistance as necessary.
● Partner with the BI team to evaluate, design, and develop BI reports and dashboards according to functional specifications while maintaining data integrity and data quality.
● Work collaboratively with key stakeholders to translate business information needs into well-defined data requirements for BI solutions.
● Leverage transactional information and data from ERP, CRM, and HRIS applications to model, extract, and transform into reporting and analytics.
● Define and document the use of BI through user experience/use cases and prototypes; test and deploy BI solutions.
● Develop and support data governance processes; analyze data to identify and articulate trends, patterns, outliers, and quality issues; continuously validate reports and dashboards and suggest improvements.
● Train business end-users, IT analysts, and developers.

Required Skills
● Bachelor's degree in Computer Science or a similar field, or equivalent work experience.
● 5+ years of experience on Data Warehousing, Data Engineering, or Data Integration projects.
● Expertise in data warehousing concepts, strategies, and tools.
● Strong SQL background.
● Strong knowledge of relational databases like SQL Server, PostgreSQL, and MySQL.
● Strong experience with GCP: BigQuery, Cloud SQL, Composer (Airflow), Dataflow, Dataproc, Cloud Functions, and GCS.
● Good to have: knowledge of SQL Server Reporting Services (SSRS) and SQL Server Integration Services (SSIS).
● Knowledge of AWS and Azure Cloud is a plus.
● Experience with Informatica PowerExchange for Mainframe, Salesforce, and other new-age data sources.
● Experience in integration using APIs, XML, JSON, etc.

Skills: Data modeling, OLAP, OLTP, BigQuery, Google Cloud Platform (GCP)
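For illustration, a minimal sketch of the partitioned, clustered BigQuery fact-table design this role centres on. All dataset, table, and column names are hypothetical; it assumes the google-cloud-bigquery package is installed and application-default credentials are configured.

```python
# Hypothetical fact table for the dimensional-modelling work described above.
from google.cloud import bigquery

client = bigquery.Client()

ddl = """
CREATE TABLE IF NOT EXISTS analytics.fact_fund_transactions (
  transaction_id STRING NOT NULL,
  fund_key INT64,       -- surrogate key into dim_fund
  investor_key INT64,   -- surrogate key into dim_investor
  amount NUMERIC,
  txn_ts TIMESTAMP
)
PARTITION BY DATE(txn_ts)          -- partition pruning keeps near-real-time queries cheap
CLUSTER BY fund_key, investor_key  -- co-locates rows on common join/filter keys
"""

client.query(ddl).result()  # blocks until the DDL job completes
print("fact table ready")
```

Partitioning by event date plus clustering on the high-cardinality join keys is one common way to meet the near-real-time reporting requirement the posting mentions.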

SIP Developer India 0 years Not disclosed On-site Full Time

Job Description:
We are seeking a highly skilled Telephony Integration Developer with deep expertise in SIP (Session Initiation Protocol) and SIPREC (SIP Recording) to join our growing team. You will be responsible for designing, developing, and integrating telephony systems with a strong emphasis on VoIP communication, call recording, and SIP signaling.

Responsibilities:
● Design and implement telephony integrations using SIP and SIPREC.
● Develop APIs and backend services to handle call control, call recording, and session management.
● Work with PBX systems, SIP servers, and media servers for SIP call flows and media capture.
● Integrate third-party VoIP systems with internal applications and platforms.
● Analyze and troubleshoot SIP signaling and RTP media flows.
● Collaborate with cross-functional teams including DevOps, Product, and QA to deliver scalable solutions.
● Create technical documentation, diagrams, and support material.
● Ensure systems are secure, resilient, and scalable.

Must-Have Skills:
● Strong experience with the SIP protocol (INVITE, ACK, BYE, REGISTER, REFER, OPTIONS, etc.).
● Practical experience with SIPREC for recording VoIP calls.
● Solid development skills in JavaScript (Node.js).
● Experience working with SIP servers (e.g., FreeSWITCH, Asterisk, Kamailio, OpenSIPS).
● Hands-on knowledge of WebRTC, RTP streams, and VoIP media handling.
● Experience building and consuming RESTful APIs.
● Familiarity with call flows and SIP trace analysis (using Wireshark, sngrep, or similar).
● Strong understanding of networking basics (UDP, TCP, NAT traversal, STUN/TURN).
● Ability to troubleshoot and debug complex telephony and media issues.

Good-to-Have Skills:
● Experience with media servers (e.g., Janus, Kurento, Mediasoup).
● Knowledge of call recording system architecture and compliance standards (PCI-DSS, GDPR).
● Experience with cloud telephony platforms (Twilio, Genesys Cloud, Amazon Chime SDK, etc.).
● Familiarity with Session Border Controllers (SBCs).
● Prior experience with SIP trunking and carrier integrations.
● Exposure to Protocol Buffers or gRPC for real-time messaging.
● Understanding of security practices in VoIP (TLS, SRTP, SIP over WebSockets).
● Knowledge of Docker and Kubernetes for deploying SIP services at scale.
● Sound knowledge of telecom protocols such as SIP, ICE, STUN, TURN, SRTP, DTLS, H.323, Diameter, and RADIUS.
● Strong analytical skills for diagnosing and fixing issues across an SBC product portfolio.
● Thorough knowledge of Linux/RTOS internals and product architecture is preferred.
● Strong knowledge of TCP/UDP/IP and networking concepts is a must.
● Knowledge of IP telephony, SIP, and call routing techniques (ARS, AAR) in a trunk configuration environment.
● Prior experience working with FreeSWITCH, Kamailio, RTP Proxy, etc.
● Strong understanding of audio streaming/WebSockets and their application in real-time communication systems.
● In-depth knowledge of audio codecs and their impact on voice quality and bandwidth utilization.
● Experience with gRPC and Protobuf for building efficient and scalable communication interfaces.
● Extensive experience in large-scale product development for Enterprise, WebRTC, VoIP, and VoLTE products.

Base Language/Framework:
● Primary Language: JavaScript (Node.js backend)
● Frameworks/Tools: Express.js, Socket.io (for signaling if needed), Wireshark (for debugging), sngrep.
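For context, a minimal sketch of the SIP message handling this role revolves around: parsing a request's start line and headers. The raw INVITE below is a hypothetical example; production services for this role would be written in Node.js per the stack above, and real parsing must also handle folded headers, compact header forms, and message bodies.

```python
# Parse the start line and headers of a (hypothetical) SIP INVITE.
RAW = (
    "INVITE sip:bob@example.com SIP/2.0\r\n"
    "Via: SIP/2.0/UDP client.example.com;branch=z9hG4bK776asdhds\r\n"
    "From: <sip:alice@example.com>;tag=1928301774\r\n"
    "To: <sip:bob@example.com>\r\n"
    "Call-ID: a84b4c76e66710\r\n"
    "CSeq: 314159 INVITE\r\n"
    "\r\n"
)

def parse_sip(raw: str) -> dict:
    head, _, body = raw.partition("\r\n\r\n")     # headers end at the blank line
    start, *header_lines = head.split("\r\n")
    method, uri, version = start.split(" ", 2)    # e.g. INVITE / ACK / BYE / REGISTER
    headers = dict(line.split(": ", 1) for line in header_lines)
    return {"method": method, "uri": uri, "version": version, "headers": headers}

msg = parse_sip(RAW)
print(msg["method"], msg["headers"]["Call-ID"])   # -> INVITE a84b4c76e66710
```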

Senior .NET Developer Mumbai Metropolitan Region 5 years INR 8.0 - 16.0 Lacs P.A. On-site Full Time

Key Responsibilities
● Design, develop, and maintain scalable web applications using .NET Core, .NET Framework, C#, and related technologies.
● Participate in all phases of the SDLC, including requirements gathering, architecture design, coding, testing, deployment, and support.
● Build and integrate RESTful APIs, and work with SQL Server, Entity Framework, and modern front-end technologies such as Angular, React, and JavaScript.
● Conduct thorough code reviews, write unit tests, and ensure adherence to coding standards and best practices.
● Lead or support .NET Framework to .NET Core migration initiatives, ensuring minimal disruption and optimal performance.
● Implement and manage CI/CD pipelines using tools like Azure DevOps, Jenkins, or GitLab CI/CD.
● Containerize applications using Docker and deploy/manage them on orchestration platforms like Kubernetes or GKE.
● Lead and execute database migration projects, particularly transitioning from SQL Server to PostgreSQL.
● Manage and optimize Cloud SQL for PostgreSQL, including configuration, tuning, and ongoing maintenance.
● Leverage Google Cloud Platform (GCP) services such as GKE, Cloud SQL, Cloud Run, and Dataflow to build and maintain cloud-native solutions.
● Handle schema conversion and data transformation tasks as part of migration and modernization efforts.

Required Skills & Experience
● 5+ years of hands-on experience with C#, .NET Core, and .NET Framework.
● Proven experience in application modernization and cloud-native development.
● Strong knowledge of containerization (Docker) and orchestration tools like Kubernetes/GKE.
● Expertise in implementing and managing CI/CD pipelines.
● Solid understanding of relational databases and experience with SQL Server to PostgreSQL migrations.
● Familiarity with cloud infrastructure, especially GCP services relevant to application hosting and data processing.
● Excellent problem-solving and communication skills.

Skills: C#, .NET, .NET Compact Framework, SQL, Microsoft Windows Azure, CI/CD, Google Cloud Platform (GCP), React.js, Data-flow analysis
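As a small illustration of the schema-conversion step in the SQL Server to PostgreSQL migrations this role leads, here is a hedged sketch of a T-SQL-to-PostgreSQL type mapping. It covers frequent cases only; a real migration also handles identity columns, defaults, collations, and indexes, and dedicated tools automate much of this.

```python
# Map common SQL Server column types to PostgreSQL equivalents (illustrative subset).
TSQL_TO_PG = {
    "NVARCHAR": "VARCHAR",          # PostgreSQL strings are Unicode-capable by default
    "DATETIME2": "TIMESTAMP",
    "DATETIME": "TIMESTAMP",
    "BIT": "BOOLEAN",
    "UNIQUEIDENTIFIER": "UUID",
    "MONEY": "NUMERIC(19,4)",
    "TINYINT": "SMALLINT",          # PostgreSQL has no 1-byte integer type
}

def convert_column(name: str, tsql_type: str) -> str:
    # Keep any "(length)" suffix; note DATETIME2 precision above 6 would still
    # need clamping, since PostgreSQL TIMESTAMP precision maxes out at 6.
    base, sep, rest = tsql_type.upper().partition("(")
    pg = TSQL_TO_PG.get(base, base)   # pass through types that already match
    return f"{name} {pg}{sep + rest if sep else ''}"

print(convert_column("created_at", "DATETIME2"))   # created_at TIMESTAMP
print(convert_column("email", "NVARCHAR(320)"))    # email VARCHAR(320)
```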

Azure Data Engineer Pune, Maharashtra, India 6 years Not disclosed On-site Full Time

Data Engineer (Azure)
Hybrid work mode - Pune/Bangalore/Noida
Minimum 6 years of experience. Max CTC 16 LPA.

(Azure) EDW experience:
● Experience loading Star schema data warehouses using framework architectures, including experience loading type 2 dimensions.
● Ingesting data from various sources (structured and semi-structured), with hands-on experience ingesting via APIs into lakehouse architectures.

Key Skills: Azure Databricks, Azure Data Factory, Azure Data Lake Gen 2 Storage, SQL (expert), Python (intermediate), Azure Cloud Services knowledge, data analysis (SQL), data warehousing, documentation – BRD, FRD, user story creation.
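The "type 2 dimensions" requirement above refers to slowly changing dimensions that keep history by versioning rows. A minimal sketch of one common two-step load on Azure Databricks with a Delta Lake MERGE follows; the table and column names (dim_customer, staging_customer, is_current, etc.) are hypothetical, and `spark` is the ambient Databricks session.

```python
# Step 1: close out current rows whose tracked attribute changed.
spark.sql("""
MERGE INTO dim_customer AS tgt
USING staging_customer AS src
ON tgt.customer_id = src.customer_id AND tgt.is_current = true
WHEN MATCHED AND tgt.address <> src.address THEN
  UPDATE SET tgt.is_current = false, tgt.end_date = current_date()
""")

# Step 2: insert new versions (and brand-new customers) as current rows.
spark.sql("""
INSERT INTO dim_customer
SELECT src.customer_id, src.name, src.address,
       current_date() AS start_date, CAST(NULL AS DATE) AS end_date, true AS is_current
FROM staging_customer AS src
LEFT JOIN dim_customer AS tgt
  ON tgt.customer_id = src.customer_id AND tgt.is_current = true
WHERE tgt.customer_id IS NULL
""")
```

Because step 1 flips the changed rows to is_current = false, step 2's anti-join picks them up along with genuinely new customers, so every entity always has exactly one current row.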

Azure Data Engineer Pune, Maharashtra, India 0 years INR 10.0 - 18.0 Lacs P.A. On-site Full Time

Hybrid work mode

(Azure) EDW experience:
● Experience loading Star schema data warehouses using framework architectures, including experience loading type 2 dimensions.
● Ingesting data from various sources (structured and semi-structured), with hands-on experience ingesting via APIs into lakehouse architectures.

Key Skills: Azure Databricks, Azure Data Factory, Azure Data Lake Gen 2 Storage, SQL (expert), Python (intermediate), Azure Cloud Services knowledge, data analysis (SQL), data warehousing, documentation – BRD, FRD, user story creation.

Skills: Windows Azure, SQL Azure, SQL, Data Warehouse (DWH), Data Analytics, Python, Star schema, Data warehousing
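The API-to-lakehouse ingestion called out above typically means landing raw API payloads in a lake zone before transformation. A hedged sketch follows; the endpoint, storage account, and paths are hypothetical, and it assumes the requests and azure-storage-file-datalake packages plus a valid credential (prefer a managed identity over an account key in practice).

```python
# Pull JSON from a (hypothetical) REST API and land it raw in ADLS Gen2.
import json

import requests
from azure.storage.filedatalake import DataLakeServiceClient

resp = requests.get("https://api.example.com/v1/orders", timeout=30)
resp.raise_for_status()

service = DataLakeServiceClient(
    account_url="https://examplelake.dfs.core.windows.net",
    credential="<account-key>",   # placeholder credential
)
fs = service.get_file_system_client("raw")             # bronze/raw zone container
file = fs.get_file_client("orders/2024-01-15.json")    # date-stamped landing path
file.upload_data(json.dumps(resp.json()), overwrite=True)
```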

Backend Developer Mumbai Metropolitan Region 2 years INR 1.0 - 18.0 Lacs P.A. On-site Full Time

Mumbai (Malad), work from office, 6 days working; 1st & 3rd Saturday off.

AWS Expertise: Minimum 2 years of experience working with AWS services like RDS, S3, EC2, and Lambda.

Roles and Responsibilities
● Backend Development: Develop scalable and high-performance APIs and backend systems using Node.js. Write clean, modular, and reusable code following best practices. Debug, test, and optimize backend services for performance and scalability.
● Database Management: Design and maintain relational databases using MySQL, PostgreSQL, or AWS RDS. Optimize database queries and ensure data integrity. Implement data backup and recovery plans.
● AWS Cloud Services: Deploy, manage, and monitor applications using AWS infrastructure. Work with AWS services including RDS, S3, EC2, Lambda, API Gateway, and CloudWatch. Implement security best practices for AWS environments (IAM policies, encryption, etc.).
● Integration and Microservices: Integrate third-party APIs and services. Develop and manage microservices architecture for modular application development.
● Version Control and Collaboration: Use Git for code versioning and maintain repositories. Collaborate with front-end developers and project managers for end-to-end project delivery.
● Troubleshooting and Debugging: Analyze and resolve technical issues and bugs. Provide maintenance and support for existing backend systems.
● DevOps and CI/CD: Set up and maintain CI/CD pipelines. Automate deployment processes and ensure zero-downtime releases.
● Agile Development: Participate in Agile/Scrum ceremonies such as daily stand-ups, sprint planning, and retrospectives. Deliver tasks within defined timelines while maintaining high quality.

Required Skills
● Strong proficiency in Node.js and JavaScript/TypeScript.
● Expertise in working with relational databases like MySQL/PostgreSQL and AWS RDS.
● Proficiency with AWS services including Lambda, S3, EC2, and API Gateway.
● Experience with RESTful API design and GraphQL (optional).
● Knowledge of containerization using Docker is a plus.
● Strong problem-solving and debugging skills.
● Familiarity with tools like Git, Jenkins, and Jira.

Skills: Amazon Web Services (AWS), AWS RDS, Amazon S3, Amazon EC2, AWS Lambda, NodeJS (Node.js), Fullstack Developer, Jenkins, Python, SQL
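As one concrete example of the AWS security practices listed above, access to S3 objects is usually granted through short-lived presigned URLs rather than public buckets. A minimal Python/boto3 sketch follows (the role itself is Node.js; the AWS SDK for JavaScript exposes the same operation). Bucket and key names are hypothetical; assumes boto3 is installed and AWS credentials are configured.

```python
# Issue a time-limited download link for a private S3 object.
import boto3

s3 = boto3.client("s3")

url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "example-app-uploads", "Key": "reports/2024-01.pdf"},
    ExpiresIn=900,  # link expires after 15 minutes
)
print(url)
```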

Machine Learning Engineer Bengaluru, Karnataka, India 3 years Not disclosed On-site Full Time

Machine Learning Engineer
Bangalore - 3 days work from office
Max 18.50 LPA

Responsibilities
● Design, develop, and implement machine learning models and algorithms to solve complex business problems.
● Collaborate with data scientists to transition models from research and development to production-ready systems.
● Build and maintain scalable data pipelines for ML model training and inference using Databricks.
● Implement and manage the ML model lifecycle using MLflow for experiment tracking, model versioning, and model registry.
● Deploy and manage ML models in production environments on Azure, leveraging services like Azure Machine Learning, Azure Kubernetes Service (AKS), or Azure Functions.
● Support MLOps workloads by automating model training, evaluation, deployment, and monitoring processes.
● Ensure the reliability, performance, and scalability of ML systems in production.
● Monitor model performance, detect drift, and implement retraining strategies.
● Collaborate with DevOps and Data Engineering teams to integrate ML solutions into existing infrastructure and CI/CD pipelines.
● Document model architecture, data flows, and operational procedures.

Qualifications
● Education: Bachelor's or Master's degree in Computer Science, Engineering, Statistics, or a related quantitative field.
● Experience: Minimum 3+ years of professional experience as an ML Engineer or in a similar role.

Skills:
● Strong proficiency in Python programming for data manipulation, machine learning, and scripting.
● Hands-on experience with machine learning frameworks such as Scikit-learn, TensorFlow, PyTorch, or Keras.
● Demonstrated experience with MLflow for experiment tracking, model management, and model deployment.
● Proven experience working with Microsoft Azure cloud services, specifically Azure Machine Learning, Azure Databricks, and related compute/storage services.
● Solid experience with Databricks for data processing, ETL, and ML model development.
● Understanding of MLOps principles and practices, including CI/CD for ML, model versioning, monitoring, and retraining.
● Experience with containerization technologies (Docker) and orchestration (Kubernetes, especially AKS) for deploying ML models.
● Familiarity with data warehousing concepts and SQL.
● Ability to work with large datasets and distributed computing frameworks.
● Strong problem-solving skills and attention to detail.
● Excellent communication and collaboration skills.

Nice-to-Have Skills:
● Experience with other cloud platforms (AWS, GCP).
● Knowledge of big data technologies like Apache Spark.
● Experience with Azure DevOps for CI/CD pipelines.
● Familiarity with real-time inference patterns and streaming data.
● Understanding of responsible AI principles (fairness, explainability, privacy).

Certifications:
● Microsoft Certified: Azure AI Engineer Associate
● Databricks Certified Machine Learning Associate (or higher)
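For illustration, a minimal sketch of the MLflow experiment-tracking loop the posting describes. The experiment name, model, and metric are placeholders; it assumes mlflow and scikit-learn are installed and an MLflow tracking server (or local store) is available.

```python
# Track one training run: parameters, a metric, and the fitted model artifact.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlflow.set_experiment("churn-model")  # hypothetical experiment name
with mlflow.start_run():
    model = LogisticRegression(C=0.5).fit(X_train, y_train)
    mlflow.log_param("C", 0.5)
    mlflow.log_metric("test_accuracy", model.score(X_test, y_test))
    mlflow.sklearn.log_model(model, "model")  # artifact that can later be registered
```

Logging the model as a run artifact is what later enables the model-registry versioning and automated retraining the posting asks for.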