
Greentick Value Services

6 Job openings at Greentick Value Services
Consultant - Data Engineer (with Fabric) | Bengaluru | 5 - 10 years | INR 25.0 - 30.0 Lacs P.A. | Work from Office | Full Time

JOB DESCRIPTION

We are looking for a highly skilled API & Pixel Tracking Integration Engineer to lead the development and deployment of server-side tracking and attribution solutions across multiple platforms. The ideal candidate brings deep expertise in CAPI integrations (Meta, Google, and other platforms), secure data handling using cryptographic techniques, and experience working within privacy-first environments like Azure Clean Rooms. This role requires strong hands-on experience in C# development, Azure cloud services, OCI (Oracle Cloud Infrastructure), and marketing technology stacks including Adobe Tag Management and Pixel Management. You will work closely with engineering, analytics, and marketing teams to deliver scalable, compliant, and secure data tracking solutions that drive business insights and performance.

Key Responsibilities:
- Design, implement, and maintain CAPI integrations across Meta, Google, and all major platforms, ensuring real-time and accurate server-side event tracking.
- Utilize Fabric and OCI environments as needed for data integration and marketing intelligence workflows.
- Develop and manage custom tracking solutions leveraging Azure Clean Rooms, ensuring user NFAs are respected and privacy-compliant logic is implemented.
- Implement cryptographic hashing (e.g., SHA-256) for secure handling of user identifiers.
- Use Azure Data Lake Gen1 & Gen2 (ADLS), Cosmos DB, and Azure Functions to build and host scalable backend systems.
- Integrate with Azure Key Vault to securely manage secrets and sensitive credentials.
- Design and execute data pipelines in Azure Data Factory (ADF) for processing and transforming tracking data.
- Lead pixel and tag management initiatives using Adobe Tag Manager, including pixel governance and QA across properties.
- Collaborate with security teams to ensure all data-sharing and processing complies with Azure's data security standards and enterprise privacy frameworks.
- Monitor, troubleshoot, and optimize existing integrations using logs, diagnostics, and analytics tools.

EXPERTISE AND QUALIFICATIONS

Required Skills:
- Strong hands-on experience with Fabric and building scalable APIs.
- Experience in implementing Meta CAPI, Google Enhanced Conversions, and other platform-specific server-side tracking APIs.
- Knowledge of Azure Clean Rooms, with experience developing custom logic and code for clean data collaborations.
- Proficiency with Azure Cloud technologies, especially Cosmos DB, Azure Functions, ADF, Key Vault, ADLS, and Azure security best practices.
- Familiarity with OCI for hybrid-cloud integration scenarios.
- Understanding of cryptography and secure data handling (e.g., hashing email addresses with SHA-256).
- Experience with Adobe Tag Management, specifically in pixel governance and lifecycle.
- Proven ability to collaborate across functions, especially with marketing and analytics teams.

Soft Skills:
- Strong communication skills to explain technical concepts to non-technical stakeholders.
- Proven ability to collaborate across teams, especially with marketing, product, and data analytics.
- Adaptable and proactive in learning and applying evolving technologies and regulatory changes.
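The email hashing this role calls for (normalizing an address before hashing it with SHA-256, so the same user matches across systems) can be sketched in Java; the class and method names below are illustrative, not from any particular codebase:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.Locale;

public class EmailHasher {
    // Normalize (trim + lowercase) then SHA-256 hash an email address,
    // the typical preparation before sending identifiers to a
    // server-side tracking API.
    public static String sha256Email(String email) {
        try {
            String normalized = email.trim().toLowerCase(Locale.ROOT);
            MessageDigest digest = MessageDigest.getInstance("SHA-256");
            byte[] hash = digest.digest(normalized.getBytes(StandardCharsets.UTF_8));
            StringBuilder hex = new StringBuilder(hash.length * 2);
            for (byte b : hash) {
                hex.append(String.format("%02x", b));
            }
            return hex.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException("SHA-256 not available", e);
        }
    }
}
```

Normalizing before hashing matters: without it, " Jane@Example.com " and "jane@example.com" would hash to different values and never match.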

Consultant - Software Engineer (with C#) | Bengaluru | 5 - 10 years | INR 25.0 - 30.0 Lacs P.A. | Work from Office | Full Time

Consultant - Software Engineer (with C#)

JOB DESCRIPTION:

We are looking for a highly skilled API & Pixel Tracking Integration Engineer to lead the development and deployment of server-side tracking and attribution solutions across multiple platforms. The ideal candidate brings deep expertise in CAPI integrations (Meta, Google, and other platforms), secure data handling using cryptographic techniques, and experience working within privacy-first environments like Azure Clean Rooms. This role requires strong hands-on experience in C# development, Azure cloud services, OCI (Oracle Cloud Infrastructure), and marketing technology stacks including Adobe Tag Management and Pixel Management. You will work closely with engineering, analytics, and marketing teams to deliver scalable, compliant, and secure data tracking solutions that drive business insights and performance.

Key Responsibilities:
- Design, implement, and maintain CAPI integrations across Meta, Google, and all major platforms, ensuring real-time and accurate server-side event tracking.
- Develop and manage custom tracking solutions leveraging Azure Clean Rooms, ensuring user NFAs are respected and privacy-compliant logic is implemented.
- Architect and develop secure REST APIs in C# to support advanced attribution models and marketing analytics pipelines.
- Implement cryptographic hashing (e.g., SHA-256) for secure handling of user identifiers.
- Use Azure Data Lake Gen1 & Gen2 (ADLS), Cosmos DB, and Azure Functions to build and host scalable backend systems.
- Integrate with Azure Key Vault to securely manage secrets and sensitive credentials.
- Design and execute data pipelines in Azure Data Factory (ADF) for processing and transforming tracking data.
- Lead pixel and tag management initiatives using Adobe Tag Manager, including pixel governance and QA across properties.
- Collaborate with security teams to ensure all data-sharing and processing complies with Azure's data security standards and enterprise privacy frameworks.
- Utilize Fabric and OCI environments as needed for data integration and marketing intelligence workflows.
- Monitor, troubleshoot, and optimize existing integrations using logs, diagnostics, and analytics tools.

EXPERTISE AND QUALIFICATIONS

Required Skills:
- Strong hands-on experience with C# and building scalable APIs.
- Experience in implementing Meta CAPI, Google Enhanced Conversions, and other platform-specific server-side tracking APIs.
- Knowledge of Azure Clean Rooms, with experience developing custom logic and code for clean data collaborations.
- Proficiency with Azure Cloud technologies, especially Cosmos DB, Azure Functions, ADF, Key Vault, ADLS, and Azure security best practices.
- Familiarity with OCI for hybrid-cloud integration scenarios.
- Understanding of cryptography and secure data handling (e.g., hashing email addresses with SHA-256).
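For context on the CAPI integrations both roles describe: a server-side event is posted as JSON in which user identifiers are pre-hashed. A minimal sketch of such a payload, here in Java for illustration (field names follow Meta's documented Conversions API schema; the class is hypothetical and no HTTP call is made):

```java
public class CapiPayload {
    // Build a minimal Conversions API style payload. The field names
    // (event_name, event_time, action_source, user_data.em) follow
    // Meta's documented schema; hashedEmail is assumed to already be
    // a SHA-256 hex digest of the normalized address.
    public static String build(String eventName, long eventTime, String hashedEmail) {
        return String.format(
            "{\"data\":[{\"event_name\":\"%s\",\"event_time\":%d," +
            "\"action_source\":\"website\"," +
            "\"user_data\":{\"em\":[\"%s\"]}}]}",
            eventName, eventTime, hashedEmail);
    }
}
```

In a real C# or Java integration a JSON library would build this object rather than string formatting; the sketch only shows the shape of the event.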

Inside Sales Executive | Bengaluru | 1 - 3 years | INR 4.5 - 6.5 Lacs P.A. | Work from Office | Full Time

Role & responsibilities

We are looking for an Inside Sales & CRM Representative who will be responsible for prospecting, qualifying, fixing demo appointments, and audience generation to support the regional sales teams across India. Must be comfortable making calls to generate interest, qualifying prospects, and assigning them to the sales team. Should be able to identify new prospects from multiple sources including inbound marketing leads, outbound cold calls, events, emails, prospect lists, discovery, and individual research.

Required Candidate Profile:
- Cold calling
- Understand customer needs and requirements and profile prospects accordingly
- Generate new leads through data mining & online research
- Route qualified leads to Sales Engineers for further development and closure
- Ability to handle inbound and outbound calls
- Ability to work on quotations when required
- Maintain and expand the database of prospects PAN India
- Work with all teams to build a strong pipeline for sales closures
- Support the team with event calling and registrations when required
- Help with report preparation & data segregation
- Achieve lead and opportunity targets
- Only female candidates may apply

Requirements:
- Graduates, preferably in a Diploma/Engineering stream, or candidates who have worked in similar companies
- Experience: 1-3 years
- Good communication skills

Lead Big Data Engineer | Bengaluru | 5 - 9 years | INR 25.0 - 27.5 Lacs P.A. | Remote | Full Time

Role & responsibilities:
- Sound knowledge of Spark architecture, distributed computing, and Spark Streaming.
- Proficient in Spark, including RDD and DataFrame core functions, troubleshooting, and performance tuning.
- Good understanding of object-oriented concepts and hands-on experience in Kotlin/Scala/Java with excellent programming logic and technique.
- Strong grasp of functional programming and OOP concepts in Kotlin/Scala/Java.
- Good experience in SQL.
- Manage a team of Associates and Senior Associates and ensure utilization is maintained across the project.
- Able to mentor new members during onboarding to the project.
- Understand client requirements and be able to design, develop from scratch, and deliver.
- AWS cloud experience would be preferable.
- Experience in analyzing, re-architecting, and re-platforming on-premises data warehouses to data platforms on cloud (AWS preferred).
- Lead client calls to flag any delays, blockers, and escalations, and collate all requirements.
- Manage project timing and client expectations, and meet deadlines.
- Should have played project and team management roles.
- Facilitate meetings within the team on a regular basis.
- Understand business requirements, analyze different approaches, and plan deliverables and milestones for the project.
- Optimization, maintenance, and support of pipelines.
- Strong analytical and logical skills.
- Ability to comfortably tackle new challenges and keep learning.

EXPERTISE AND QUALIFICATIONS

Experience: 5 to 9 years

Must-have Skills:
- Kotlin/Scala/Java
- Spark SQL
- Spark Streaming
- Any cloud (AWS preferable)
- Kafka/Kinesis/any streaming service
- Object-Oriented Programming
- Hive, ETL/ELT design experience
- CI/CD experience (ETL pipeline deployment)
- Data Modeling experience

Good-to-Have Skills:
- Git or a similar version control tool
- Knowledge of CI/CD, Microservices
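The Spark Streaming work this role centers on mostly comes down to windowed aggregations over event time. Spark itself needs a cluster runtime, so here is a plain-Java sketch of the tumbling-window count that a Structured Streaming job would express with window() and count() (class and method names are illustrative):

```java
import java.util.Map;
import java.util.TreeMap;

public class TumblingWindow {
    // Assign each event timestamp (epoch seconds) to a fixed-size,
    // non-overlapping window and count events per window -- the same
    // aggregation a Spark Structured Streaming job performs, minus the
    // incremental state management Spark adds on top.
    public static Map<Long, Long> countPerWindow(long[] timestamps, long windowSeconds) {
        Map<Long, Long> counts = new TreeMap<>();
        for (long ts : timestamps) {
            long windowStart = (ts / windowSeconds) * windowSeconds;
            counts.merge(windowStart, 1L, Long::sum);
        }
        return counts;
    }
}
```

For example, with a 60-second window, events at t = 0, 5, and 10 land in the window starting at 0, while an event at t = 61 starts a new window at 60.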

Java Developer (with Spark SQL), Remote | Bengaluru | 4 - 8 years | INR 20.0 - 27.5 Lacs P.A. | Remote | Full Time

Role & responsibilities:

Key Responsibilities:
- Design, develop, and optimize Java-based backend services (Spring Boot / Microservices) for API integrations.
- Develop and maintain Spark SQL queries and data processing pipelines for large-scale data ingestion.
- Build Spark batch and streaming jobs to land raw data from multiple vendor APIs into data lakes or warehouses.
- Implement robust error handling, logging, and monitoring for data pipelines.
- Collaborate with cross-functional teams across geographies to define integration requirements and deliverables.
- Troubleshoot and optimize Spark SQL for performance and cost efficiency.
- Participate in Agile ceremonies, daily standups, and client discussions.

EXPERTISE AND QUALIFICATIONS

Required Skills:
- 4 to 8 years of relevant experience.
- Core Java (Java 8 or above) with proven API development experience.
- Apache Spark (Core, SQL, DataFrame APIs) for large-scale data processing.
- Spark SQL: strong ability to write and optimize queries for complex joins, aggregations, and transformations.
- Experience with API integration (RESTful APIs, authentication, payload handling, and rate limiting).
- Hands-on experience with data ingestion frameworks and ETL concepts.
- Experience with MySQL or other RDBMS for relational data management.
- Proficiency in Git for version control.
- Strong debugging, performance tuning, and problem-solving skills.
- Ability to work with minimal supervision in a short-term, delivery-focused engagement.

Nice to Have:
- Experience with Apache Kafka for real-time streaming integrations.
- Familiarity with AWS data services (S3, EMR, Glue).
- Exposure to NoSQL databases like Cassandra or MongoDB.
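The rate limiting mentioned under API integration is commonly implemented client-side as a token bucket: each request consumes a token, and tokens refill at the vendor's allowed rate. A minimal Java sketch (the clock is passed in explicitly for testability; the class name is illustrative):

```java
public class TokenBucket {
    private final long capacity;      // maximum burst size
    private final double refillPerSec; // sustained requests per second
    private double tokens;
    private long lastRefillNanos;

    public TokenBucket(long capacity, double refillPerSec, long nowNanos) {
        this.capacity = capacity;
        this.refillPerSec = refillPerSec;
        this.tokens = capacity;        // start full: allow an initial burst
        this.lastRefillNanos = nowNanos;
    }

    // Returns true if a request may proceed at time nowNanos,
    // consuming one token; false means the caller should back off.
    public synchronized boolean tryAcquire(long nowNanos) {
        double elapsedSec = (nowNanos - lastRefillNanos) / 1e9;
        tokens = Math.min(capacity, tokens + elapsedSec * refillPerSec);
        lastRefillNanos = nowNanos;
        if (tokens >= 1.0) {
            tokens -= 1.0;
            return true;
        }
        return false;
    }
}
```

In production one would usually reach for a library limiter rather than hand-rolling this, but the bucket model is what most vendor API quotas assume.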

Consultant - Data Engineer | Bengaluru | 3 - 8 years | INR 25.0 - 27.5 Lacs P.A. | Remote | Full Time

Role & responsibilities

Responsibilities:
- Lead and participate in the development of high-quality software solutions for client projects, using modern programming languages and frameworks.
- Contribute to system architecture and technical design decisions, ensuring that solutions are scalable, secure, and meet client requirements.
- Work closely with clients to understand their technical needs and business objectives, offering expert advice on software solutions and best practices.
- Provide guidance and mentorship to junior developers, assisting with code reviews, troubleshooting, and fostering a culture of technical excellence.
- Work with project managers, business analysts, and other engineers to ensure that technical milestones are achieved and client expectations are met.
- Ensure the quality of software through testing, code optimization, and identifying potential issues before deployment.
- Stay up to date with industry trends, new technologies, and best practices to continuously improve development processes and software quality.
- Other duties as assigned and directed.

EXPERTISE AND QUALIFICATIONS

Required Technical Skills:
- Data engineering experience with Azure technologies and familiarity with modern data platform technologies such as Azure Data Factory, Azure Databricks, Azure Synapse, and Fabric
- Understanding of Agile engineering practices
- Deep familiarity and experience in the following areas: data warehouse and Lakehouse methodologies, including medallion architecture; data ETL/ELT processes; data profiling and anomaly detection; data modeling (Dimensional/Kimball); SQL
- Strong background in relational database platforms
- DevOps / continuous integration & continuous delivery

Work Location: India - Remote
Shift Timings: 2:00 pm IST to 11:00 pm IST
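The data profiling and anomaly detection called out above often starts with something as simple as flagging values that sit far from a column's mean. A minimal Java sketch of a z-score check (class name and threshold are illustrative, not any specific platform's API):

```java
import java.util.ArrayList;
import java.util.List;

public class Profiler {
    // Flag values more than `threshold` standard deviations from the
    // mean -- a basic data-quality check that catches gross outliers
    // (e.g., a negative age or an order total 100x the norm) before
    // data is promoted to the next pipeline layer.
    public static List<Double> outliers(double[] values, double threshold) {
        double mean = 0;
        for (double v : values) mean += v;
        mean /= values.length;

        double var = 0;
        for (double v : values) var += (v - mean) * (v - mean);
        double std = Math.sqrt(var / values.length);

        List<Double> flagged = new ArrayList<>();
        for (double v : values) {
            // std == 0 means all values are identical: nothing to flag
            if (std > 0 && Math.abs(v - mean) / std > threshold) flagged.add(v);
        }
        return flagged;
    }
}
```

Real profiling tools add per-column statistics, null-rate checks, and historical baselines on top, but the z-score test above is the core idea.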