5 Job openings at Cymetrix
GCP Data Modeller

Chennai, Tamil Nadu, India

5 years

Not disclosed

On-site

Full Time

Location: Bangalore / Chennai

Profile Summary
● Hands-on data modelling for OLTP and OLAP systems.
● In-depth knowledge of conceptual, logical, and physical data modelling.
● Strong understanding of indexing, partitioning, and data sharding, with practical experience applying them.
● Strong understanding of the variables impacting database performance for near-real-time reporting and application interaction.
● Working experience with at least one data modelling tool, preferably DBSchema or Erwin.
● Good understanding of GCP databases such as AlloyDB, Cloud SQL, and BigQuery.
● Functional knowledge of the mutual fund industry is a plus.

Role & Responsibilities
● Work with business users and other stakeholders to understand business processes.
● Design and implement dimensional and fact tables.
● Identify and implement data transformation/cleansing requirements.
● Develop highly scalable, reliable, high-performance data processing pipelines to extract, transform, and load data from various systems into the Enterprise Data Warehouse.
● Develop conceptual, logical, and physical data models with associated metadata, including data lineage and technical data definitions.
● Design, develop, and maintain ETL workflows and mappings using the appropriate data-load technique.
● Provide research, high-level design, and estimates for data transformation and data integration from source applications to end-user BI solutions.
● Provide production support of ETL processes to ensure timely completion and availability of data in the data warehouse for reporting use.
● Analyze and resolve problems and provide technical assistance as necessary.
● Partner with the BI team to evaluate, design, and develop BI reports and dashboards according to functional specifications while maintaining data integrity and data quality.
● Work collaboratively with key stakeholders to translate business information needs into well-defined data requirements for BI solutions.
● Leverage transactional information and data from ERP, CRM, and HRIS applications to model, extract, and transform it for reporting and analytics.
● Define and document the use of BI through user experience/use cases and prototypes; test and deploy BI solutions.
● Develop and support data governance processes; analyze data to identify and articulate trends, patterns, outliers, and quality issues; continuously validate reports and dashboards and suggest improvements.
● Train business end users, IT analysts, and developers.

Required Skills
● Bachelor's degree in Computer Science or a similar field, or equivalent work experience.
● 5+ years of experience on data warehousing, data engineering, or data integration projects.
● Expertise in data warehousing concepts, strategies, and tools.
● Strong SQL background.
● Strong knowledge of relational databases such as SQL Server, PostgreSQL, and MySQL.
● Strong experience with GCP: Google BigQuery, Cloud SQL, Composer (Airflow), Dataflow, Dataproc, Cloud Functions, and GCS.
● Knowledge of SQL Server Reporting Services (SSRS) and SQL Server Integration Services (SSIS) is good to have.
● Knowledge of AWS and Azure cloud is a plus.
● Experience with Informatica PowerExchange for Mainframe, Salesforce, and other new-age data sources.
● Experience in integration using APIs, XML, JSON, etc.

Skills: Data modeling, OLAP, OLTP, BigQuery, and Google Cloud Platform (GCP)
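
For readers unfamiliar with the modelling terms in this listing, here is a minimal, purely illustrative sketch of a date-partitioned fact table and a type 2 dimension in BigQuery Standard SQL, wrapped in Python. The dwh dataset and every table and column name are hypothetical, and in practice the DDL would be submitted through the BigQuery client library or the bq CLI rather than printed.

"""Illustrative sketch only: a date-partitioned fact table and a type 2
dimension in BigQuery Standard SQL. Dataset, table, and column names are
hypothetical examples, not part of the job posting."""

FACT_DDL = """
CREATE TABLE IF NOT EXISTS dwh.fact_fund_transaction (
  transaction_id   STRING    NOT NULL,
  fund_key         INT64     NOT NULL,   -- FK to dim_fund
  investor_key     INT64     NOT NULL,   -- FK to dim_investor
  txn_ts           TIMESTAMP NOT NULL,
  units            NUMERIC,
  amount_inr       NUMERIC
)
PARTITION BY DATE(txn_ts)            -- partition pruning for near-real-time reporting
CLUSTER BY fund_key, investor_key    -- clustering in lieu of traditional indexes
"""

DIM_DDL = """
CREATE TABLE IF NOT EXISTS dwh.dim_fund (
  fund_key        INT64  NOT NULL,     -- surrogate key
  fund_code       STRING NOT NULL,     -- natural key
  fund_name       STRING,
  category        STRING,
  effective_from  DATE   NOT NULL,     -- type 2 validity window
  effective_to    DATE,
  is_current      BOOL   NOT NULL
)
"""

if __name__ == "__main__":
    # In practice these statements would be run via google-cloud-bigquery or bq;
    # here we only print them to keep the sketch self-contained.
    print(FACT_DDL)
    print(DIM_DDL)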

SIP Developer

India

0 years

Not disclosed

On-site

Full Time

Job Description:
We are seeking a highly skilled Telephony Integration Developer with deep expertise in SIP (Session Initiation Protocol) and SIPREC (SIP Recording) to join our growing team. You will be responsible for designing, developing, and integrating telephony systems with a strong emphasis on VoIP communication, call recording, and SIP signaling.

Responsibilities:
● Design and implement telephony integrations using SIP and SIPREC.
● Develop APIs and backend services to handle call control, call recording, and session management.
● Work with PBX systems, SIP servers, and media servers for SIP call flows and media capture.
● Integrate third-party VoIP systems with internal applications and platforms.
● Analyze and troubleshoot SIP signaling and RTP media flows.
● Collaborate with cross-functional teams, including DevOps, Product, and QA, to deliver scalable solutions.
● Create technical documentation, diagrams, and support material.
● Ensure systems are secure, resilient, and scalable.

Must-Have Skills:
● Strong experience with the SIP protocol (INVITE, ACK, BYE, REGISTER, REFER, OPTIONS, etc.).
● Practical experience with SIPREC for recording VoIP calls.
● Solid development skills in JavaScript (Node.js).
● Experience working with SIP servers (e.g., FreeSWITCH, Asterisk, Kamailio, OpenSIPS).
● Hands-on knowledge of WebRTC, RTP streams, and VoIP media handling.
● Experience building and consuming RESTful APIs.
● Familiarity with call flows and SIP trace analysis (using Wireshark, sngrep, or similar).
● Strong understanding of networking basics (UDP, TCP, NAT traversal, STUN/TURN).
● Ability to troubleshoot and debug complex telephony and media issues.

Good-to-Have Skills:
● Experience with media servers (e.g., Janus, Kurento, mediasoup).
● Knowledge of call recording system architecture and compliance standards (PCI-DSS, GDPR).
● Experience with cloud telephony platforms (Twilio, Genesys Cloud, Amazon Chime SDK, etc.).
● Familiarity with Session Border Controllers (SBCs).
● Prior experience with SIP trunking and carrier integrations.
● Exposure to Protocol Buffers or gRPC for real-time messaging.
● Understanding of security practices in VoIP (TLS, SRTP, SIP over WebSockets).
● Knowledge of Docker and Kubernetes for deploying SIP services at scale.
● Sound knowledge of telecom protocols such as SIP, ICE, STUN, TURN, SRTP, DTLS, H.323, Diameter, and RADIUS.
● Thoroughly analytical, able to analyze and fix issues across an SBC portfolio of products.
● Thorough knowledge of Linux/RTOS internals and product architecture is preferred.
● Strong knowledge of TCP/UDP/IP and networking concepts is a must.
● Knowledge of IP telephony, SIP, and call routing techniques (ARS, AAR) in a trunk configuration environment.
● Prior experience working with FreeSWITCH, Kamailio, RTP Proxy, etc.
● Strong understanding of audio streaming/WebSockets and their application in real-time communication systems.
● In-depth knowledge of audio codecs and their impact on voice quality and bandwidth utilization.
● Experience with gRPC and Protobuf for building efficient and scalable communication interfaces.
● Extensive experience in large-scale product development of enterprise, WebRTC, VoIP, and VoLTE based products.

Base Language/Framework:
● Primary language: JavaScript (Node.js backend).
● Frameworks/tools: Express.js, Socket.io (for signaling if needed), Wireshark (for debugging), sngrep.
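
As background for the SIP requirements in this listing, the sketch below sends a single SIP OPTIONS keepalive over UDP and prints the status line of the reply. It is illustrative only: the posting's primary stack is Node.js, but this example uses Python's standard library for brevity, and the server host, user, tag, and branch values are hypothetical. A production system would rely on a full SIP stack such as the SIP servers named above.

"""Illustrative sketch only: a SIP OPTIONS keepalive over UDP using the
standard library. Host, user, and header values are hypothetical."""
import socket
import uuid

SIP_SERVER = ("sip.example.com", 5060)          # hypothetical SIP server
LOCAL_USER = "monitor@client.example.com"       # hypothetical local identity

def build_options(cseq: int = 1) -> bytes:
    """Assemble a minimal RFC 3261 OPTIONS request."""
    call_id = uuid.uuid4().hex
    branch = "z9hG4bK" + uuid.uuid4().hex[:16]  # branch must start with the magic cookie
    msg = (
        f"OPTIONS sip:{SIP_SERVER[0]} SIP/2.0\r\n"
        f"Via: SIP/2.0/UDP client.example.com:5060;branch={branch}\r\n"
        f"Max-Forwards: 70\r\n"
        f"From: <sip:{LOCAL_USER}>;tag={uuid.uuid4().hex[:8]}\r\n"
        f"To: <sip:{SIP_SERVER[0]}>\r\n"
        f"Call-ID: {call_id}\r\n"
        f"CSeq: {cseq} OPTIONS\r\n"
        f"Content-Length: 0\r\n\r\n"
    )
    return msg.encode()

if __name__ == "__main__":
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(5)
        sock.sendto(build_options(), SIP_SERVER)
        try:
            data, _ = sock.recvfrom(4096)
            # The status line (e.g. "SIP/2.0 200 OK") is enough for a keepalive check.
            print(data.decode(errors="replace").splitlines()[0])
        except socket.timeout:
            print("No SIP response (timeout)")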

Senior .NET Developer

Mumbai Metropolitan Region

5 years

INR 8.0 - 16.0 Lacs P.A.

On-site

Full Time

Key Responsibilities
● Design, develop, and maintain scalable web applications using .NET Core, .NET Framework, C#, and related technologies.
● Participate in all phases of the SDLC, including requirements gathering, architecture design, coding, testing, deployment, and support.
● Build and integrate RESTful APIs, and work with SQL Server, Entity Framework, and modern front-end technologies such as Angular, React, and JavaScript.
● Conduct thorough code reviews, write unit tests, and ensure adherence to coding standards and best practices.
● Lead or support .NET Framework to .NET Core migration initiatives, ensuring minimal disruption and optimal performance.
● Implement and manage CI/CD pipelines using tools like Azure DevOps, Jenkins, or GitLab CI/CD.
● Containerize applications using Docker and deploy/manage them on orchestration platforms like Kubernetes or GKE.
● Lead and execute database migration projects, particularly transitioning from SQL Server to PostgreSQL.
● Manage and optimize Cloud SQL for PostgreSQL, including configuration, tuning, and ongoing maintenance.
● Leverage Google Cloud Platform (GCP) services such as GKE, Cloud SQL, Cloud Run, and Dataflow to build and maintain cloud-native solutions.
● Handle schema conversion and data transformation tasks as part of migration and modernization efforts.

Required Skills & Experience
● 5+ years of hands-on experience with C#, .NET Core, and .NET Framework.
● Proven experience in application modernization and cloud-native development.
● Strong knowledge of containerization (Docker) and orchestration tools like Kubernetes/GKE.
● Expertise in implementing and managing CI/CD pipelines.
● Solid understanding of relational databases and experience in SQL Server to PostgreSQL migrations.
● Familiarity with cloud infrastructure, especially GCP services relevant to application hosting and data processing.
● Excellent problem-solving and communication skills.

Skills: C#, .NET, .NET Compact Framework, SQL, Microsoft Windows Azure, CI/CD, Google Cloud Platform (GCP), React.js, and data-flow analysis
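
To illustrate the schema conversion work mentioned in this listing, here is a small, hypothetical sketch of a SQL Server to PostgreSQL type-mapping helper (shown in Python for brevity, although the role itself is .NET-focused). The mapping table is a partial, illustrative subset and the column examples are invented; real migrations combine dedicated tooling with manual review.

"""Illustrative sketch only: mapping a few SQL Server column types to
PostgreSQL equivalents during schema conversion. Partial and hypothetical."""

# Common SQL Server -> PostgreSQL type equivalences (not exhaustive).
TYPE_MAP = {
    "BIT": "BOOLEAN",
    "DATETIME": "TIMESTAMP",
    "DATETIME2": "TIMESTAMP",
    "MONEY": "NUMERIC(19,4)",
    "NVARCHAR(MAX)": "TEXT",
    "NVARCHAR": "VARCHAR",
    "TINYINT": "SMALLINT",
    "UNIQUEIDENTIFIER": "UUID",
    "VARBINARY(MAX)": "BYTEA",
}

def convert_column(name: str, mssql_type: str) -> str:
    """Return a PostgreSQL column definition for a SQL Server column."""
    base = mssql_type.upper().strip()
    if base in TYPE_MAP:
        pg_type = TYPE_MAP[base]
    elif "(" in base:
        # Map the base type and keep the length, e.g. NVARCHAR(50) -> VARCHAR(50).
        head, args = base.split("(", 1)
        pg_type = TYPE_MAP.get(head, head) + "(" + args
    else:
        pg_type = base  # types that match directly (INT, DATE, ...)
    return f'"{name.lower()}" {pg_type}'

if __name__ == "__main__":
    for col, typ in [("OrderId", "UNIQUEIDENTIFIER"),
                     ("CustomerName", "NVARCHAR(200)"),
                     ("CreatedAt", "DATETIME2"),
                     ("IsActive", "BIT")]:
        print(convert_column(col, typ))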

Azure Data Engineer

Pune, Maharashtra, India

6 years

Not disclosed

On-site

Full Time

Data Engineer (Azure)
Hybrid work mode: Pune / Bangalore / Noida
Minimum 6 years of experience. Max CTC: 16 LPA.

(Azure) EDW experience loading star-schema data warehouses using framework architectures, including experience loading type 2 dimensions. Ingesting data from various sources (structured and semi-structured), with hands-on experience ingesting via APIs into lakehouse architectures.

Key Skills: Azure Databricks, Azure Data Factory, Azure Data Lake Gen 2 Storage, SQL (expert), Python (intermediate), Azure Cloud Services knowledge, data analysis (SQL), data warehousing, documentation (BRD, FRD, user story creation).
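
For context on loading type 2 dimensions, the sketch below shows the core slowly changing dimension (type 2) logic with plain Python dicts. Column names and sample rows are hypothetical; in the Databricks stack listed above this would more commonly be expressed as a Delta Lake MERGE.

"""Illustrative sketch only: type 2 dimension maintenance with plain dicts.
Keys, columns, and sample data are hypothetical."""
from datetime import date

HIGH_DATE = date(9999, 12, 31)   # open-ended validity marker

def apply_scd2(dim_rows, incoming, business_key, tracked_cols, load_date):
    """Expire changed current rows and append new versions (type 2 semantics)."""
    current = {r[business_key]: r for r in dim_rows if r["is_current"]}
    for rec in incoming:
        key = rec[business_key]
        existing = current.get(key)
        if existing and all(existing[c] == rec[c] for c in tracked_cols):
            continue                                  # no change: keep current row
        if existing:                                  # attribute changed: expire old version
            existing["is_current"] = False
            existing["effective_to"] = load_date
        dim_rows.append({                             # insert new current version
            business_key: key,
            **{c: rec[c] for c in tracked_cols},
            "effective_from": load_date,
            "effective_to": HIGH_DATE,
            "is_current": True,
        })
    return dim_rows

if __name__ == "__main__":
    dim = [{"customer_id": 1, "city": "Pune",
            "effective_from": date(2024, 1, 1), "effective_to": HIGH_DATE,
            "is_current": True}]
    feed = [{"customer_id": 1, "city": "Bengaluru"},  # changed attribute
            {"customer_id": 2, "city": "Noida"}]      # brand-new key
    for row in apply_scd2(dim, feed, "customer_id", ["city"], date(2025, 1, 1)):
        print(row)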

Azure Data Engineer

Pune, Maharashtra, India

0 years

INR 10.0 - 18.0 Lacs P.A.

On-site

Full Time

Hybrid work mode.

(Azure) EDW experience loading star-schema data warehouses using framework architectures, including experience loading type 2 dimensions. Ingesting data from various sources (structured and semi-structured), with hands-on experience ingesting via APIs into lakehouse architectures.

Key Skills: Azure Databricks, Azure Data Factory, Azure Data Lake Gen 2 Storage, SQL (expert), Python (intermediate), Azure Cloud Services knowledge, data analysis (SQL), data warehousing, documentation (BRD, FRD, user story creation).

Skills: Windows Azure, SQL Azure, SQL, Data Warehouse (DWH), Data Analytics, Python, Star Schema, and Data Warehousing
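
For context on ingesting via APIs into lakehouse architectures, here is an illustrative sketch that pulls a paginated REST API and lands the raw JSON into date-partitioned folders as a stand-in for a lakehouse raw/bronze zone. The endpoint, pagination scheme, and local path are hypothetical; on Azure the files would typically land in ADLS Gen2 via Azure Data Factory or the Azure SDK rather than a local directory.

"""Illustrative sketch only: land raw API payloads for a lakehouse raw zone.
Endpoint, pagination, and paths are hypothetical."""
import json
import pathlib
from datetime import datetime, timezone

import requests  # third-party: pip install requests

API_URL = "https://api.example.com/v1/trades"   # hypothetical source API
RAW_ZONE = pathlib.Path("raw/trades")           # stand-in for an abfss:// path

def ingest(page_size: int = 500) -> None:
    run_ts = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    out_dir = RAW_ZONE / f"ingest_date={run_ts[:8]}"   # partition folder by load date
    out_dir.mkdir(parents=True, exist_ok=True)
    page = 1
    while True:
        resp = requests.get(API_URL, params={"page": page, "limit": page_size}, timeout=30)
        resp.raise_for_status()
        records = resp.json()
        if not records:
            break                                       # no more pages
        # Land the payload untouched; cleansing happens downstream (silver layer).
        (out_dir / f"{run_ts}_page{page:05d}.json").write_text(json.dumps(records))
        page += 1

if __name__ == "__main__":
    ingest()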

