Wonder Global Technology Centre

3 Job openings at Wonder Global Technology Centre
Principal Data Architect | Hyderabad, Bengaluru | 8 - 13 years | INR 15.0 - 30.0 Lacs P.A. | Hybrid | Full Time

Essential Responsibilities:

Architecture & Design
- Define and document the overall data platform architecture in GCP, including ingestion (Pub/Sub, Dataflow), storage (BigQuery, Cloud Storage), and orchestration (Composer, Workflows).
- Establish data modeling standards (star/snowflake schemas, partitioning, clustering) to optimize performance and cost.

Platform Implementation
- Build scalable, automated ETL/ELT pipelines for IoT telemetry and events.
- Implement streaming analytics and CDC where required to support real-time dashboards and alerts.

Data Products & Exchange
- Collaborate with data scientists and product managers to package curated datasets and ML feature tables as consumable data products.
- Architect and enforce a secure, governed data exchange layer, leveraging BigQuery Authorized Views, Data Catalog, and IAM, to monetize data externally.

Cost Management & Optimization
- Design cost-control measures: table partitioning/clustering, query cost monitoring, budget alerts, and committed-use discounts (a short sketch follows this listing).
- Continuously analyze query performance and storage utilization to drive down TCO.

Governance & Security
- Define and enforce data governance policies (cataloging, lineage, access controls) using Cloud Data Catalog and Cloud IAM.
- Ensure compliance with privacy, security, and regulatory requirements for internal and external data sharing.

Stakeholder Enablement
- Partner with business stakeholders to understand data needs and translate them into platform capabilities and SLAs.
- Provide documentation, training, and self-service tooling (Data Studio templates, APIs, notebooks) to democratize data access.

Mentorship & Leadership
- Coach and mentor engineers on big data best practices, SQL optimization, and cloud-native architecture patterns.
- Lead architecture reviews, proof-of-concepts, and pilot projects to evaluate emerging technologies (e.g., BigQuery Omni, Vertex AI).

What You’ll Bring to Our Team

Minimum Qualifications
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- 8+ years designing and operating large-scale data platforms, with at least 5 years of hands-on experience in GCP (BigQuery, Dataflow, Pub/Sub).
- Deep expertise in BigQuery performance tuning, data partitioning/clustering, and cost-control techniques.
- Proven track record building streaming and batch pipelines (Apache Beam, Dataflow, Spark).
- Strong SQL skills and experience with data modeling for analytics.
- Familiarity with data governance tools: Data Catalog, IAM, VPC Service Controls.
- Experience with Python or Java for ETL/ELT development.
- Excellent communication skills, with the ability to translate technical solutions for non-technical stakeholders.
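
For candidates unfamiliar with the cost-control techniques this role names, here is a minimal, illustrative sketch (not part of the role definition) of BigQuery partitioning, clustering, and a query cost cap, using the google-cloud-bigquery Python client. The project, dataset, table, and column names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

# Day-partitioned, clustered telemetry table: partitioning lets BigQuery
# prune scans by date; clustering co-locates rows by device for cheaper
# filtered lookups.
table = bigquery.Table(
    "example-project.iot.telemetry",
    schema=[
        bigquery.SchemaField("device_id", "STRING"),
        bigquery.SchemaField("event_ts", "TIMESTAMP"),
        bigquery.SchemaField("reading", "FLOAT64"),
    ],
)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_ts",
    expiration_ms=90 * 24 * 60 * 60 * 1000,  # auto-drop partitions after 90 days
)
table.clustering_fields = ["device_id"]
client.create_table(table, exists_ok=True)

# Guardrail: refuse to run any query that would bill more than 1 GiB.
job_config = bigquery.QueryJobConfig(maximum_bytes_billed=1024**3)
query = """
    SELECT device_id, AVG(reading) AS avg_reading
    FROM `example-project.iot.telemetry`
    WHERE event_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
    GROUP BY device_id
"""
for row in client.query(query, job_config=job_config).result():
    print(row.device_id, row.avg_reading)
```

Partition pruning plus a maximum_bytes_billed guard are two of the levers the "query cost monitoring" duty above refers to; budget alerts and committed-use discounts are configured at the billing level rather than in code.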

Principal Data Architect | Hyderabad, Bengaluru | 8 - 12 years | INR 20.0 - 35.0 Lacs P.A. | Hybrid | Full Time

Essential Responsibilities:

Architecture & Design
- Define and document the overall data platform architecture in GCP, including ingestion (Pub/Sub, Dataflow), storage (BigQuery, Cloud Storage), and orchestration (Composer, Workflows).
- Establish data modeling standards (star/snowflake schemas, partitioning, clustering) to optimize performance and cost.

Platform Implementation
- Build scalable, automated ETL/ELT pipelines for IoT telemetry and events.
- Implement streaming analytics and CDC where required to support real-time dashboards and alerts.

Data Products & Exchange
- Collaborate with data scientists and product managers to package curated datasets and ML feature tables as consumable data products.
- Architect and enforce a secure, governed data exchange layer, leveraging BigQuery Authorized Views, Data Catalog, and IAM, to monetize data externally (see the sketch after this listing).

Cost Management & Optimization
- Design cost-control measures: table partitioning/clustering, query cost monitoring, budget alerts, and committed-use discounts.
- Continuously analyze query performance and storage utilization to drive down TCO.

Governance & Security
- Define and enforce data governance policies (cataloging, lineage, access controls) using Cloud Data Catalog and Cloud IAM.
- Ensure compliance with privacy, security, and regulatory requirements for internal and external data sharing.

Stakeholder Enablement
- Partner with business stakeholders to understand data needs and translate them into platform capabilities and SLAs.
- Provide documentation, training, and self-service tooling (Data Studio templates, APIs, notebooks) to democratize data access.

Mentorship & Leadership
- Coach and mentor engineers on big data best practices, SQL optimization, and cloud-native architecture patterns.
- Lead architecture reviews, proof-of-concepts, and pilot projects to evaluate emerging technologies (e.g., BigQuery Omni, Vertex AI).

Additional Job Description

Minimum Qualifications
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- 8+ years designing and operating large-scale data platforms, with at least 5 years of hands-on experience in GCP (BigQuery, Dataflow, Pub/Sub).
- Deep expertise in BigQuery performance tuning, data partitioning/clustering, and cost-control techniques.
- Proven track record building streaming and batch pipelines (Apache Beam, Dataflow, Spark).
- Strong SQL skills and experience with data modeling for analytics.
- Familiarity with data governance tools: Data Catalog, IAM, VPC Service Controls.
- Experience with Python or Java for ETL/ELT development.
- Excellent communication skills, with the ability to translate technical solutions for non-technical stakeholders.
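
The "governed data exchange layer" duty centers on BigQuery Authorized Views: a view in a consumer-facing dataset is authorized against the raw dataset, so consumers can query curated results without any direct access to the underlying tables. A minimal sketch with the Python client, assuming hypothetical `raw` and `shared` datasets:

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

# 1. Create a curated view in the consumer-facing dataset.
view = bigquery.Table("example-project.shared.device_daily_stats")
view.view_query = """
    SELECT device_id, DATE(event_ts) AS day, AVG(reading) AS avg_reading
    FROM `example-project.raw.telemetry`
    GROUP BY device_id, day
"""
client.create_table(view, exists_ok=True)

# 2. Authorize the view against the raw dataset, so consumers who can only
#    see `shared` still get results while `raw` stays locked down.
raw = client.get_dataset("example-project.raw")
entries = list(raw.access_entries)
entries.append(
    bigquery.AccessEntry(
        role=None,  # authorized views carry no role of their own
        entity_type="view",
        entity_id={
            "projectId": "example-project",
            "datasetId": "shared",
            "tableId": "device_daily_stats",
        },
    )
)
raw.access_entries = entries
client.update_dataset(raw, ["access_entries"])
```

External consumers would then be granted IAM roles on `shared` only; Data Catalog and VPC Service Controls, also named in this listing, layer discovery and perimeter controls on top of this pattern.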

Salesforce Developer | Hyderabad, Bengaluru | 6 - 8 years | INR 5.0 - 13.0 Lacs P.A. | Hybrid | Full Time

Essential Duties & Responsibilities:

Design and Develop Custom Salesforce Applications and Integrations
- Build scalable, secure, and performant solutions using Apex, Lightning Web Components (LWC), and Flows.
- Integrate Salesforce with Oracle ERP, MuleSoft, and third-party platforms.
- Translate business requirements into technical designs.

Deliver and Support Multi-Cloud CRM Capabilities
- Configure and enhance functionality in Sales Cloud (Opportunities, Forecasting), Service Cloud (Case Management, FSL), Manufacturing Cloud (Warranty Claim Management, Sales Agreements, Forecasts), and Marketing Cloud (Journeys, Email Studio).
- Build data models and automation flows to drive digital outcomes.

Implement DevOps and Testing Practices
- Utilize tools such as Git, Copado, and Salesforce DX for source control and deployment.
- Create unit and integration tests to ensure system quality and performance.
- Perform code reviews and participate in Agile sprint ceremonies.

Other Related Duties as Assigned
- Participate in Salesforce release evaluations and feature adoption.
- Provide production support and troubleshoot issues.
- Collaborate with Enterprise Architects to align with solution design standards.

Additional Job Description

Minimum Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Minimum 4 years of hands-on Salesforce development experience.
- Proficiency in Apex, LWC, SOQL, Flow Builder, and REST/SOAP APIs (a REST sketch follows this listing).
- Experience with CI/CD tools and agile methodologies.
- Experience with integrations to Oracle ERP and/or MuleSoft.

Preferred Qualifications:
- Salesforce certifications: Platform Developer I/II, Service Cloud Consultant, Marketing Cloud Developer, Commerce Cloud Developer, Manufacturing Cloud Accredited Professional.
- Experience with Salesforce Shield, Einstein GPT/AI features, or Copado.
- Knowledge of Oracle SCM or EBS systems.
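
To picture the REST API requirement: the role's integration code would normally live in Apex or MuleSoft, but to keep this posting's examples in one language, here is an illustrative Python client that pages through a SOQL query via the Salesforce REST API. The instance URL and access token are placeholders, obtained through whichever OAuth 2.0 flow the org uses.

```python
import requests

# Assumed inputs: an OAuth access token and the org's instance URL.
INSTANCE_URL = "https://example.my.salesforce.com"  # hypothetical org
ACCESS_TOKEN = "<access token>"                     # placeholder


def run_soql(soql: str) -> list[dict]:
    """Run a SOQL query through the Salesforce REST API, collecting all pages."""
    url = f"{INSTANCE_URL}/services/data/v59.0/query"
    headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
    records, params = [], {"q": soql}
    while url:
        resp = requests.get(url, headers=headers, params=params)
        resp.raise_for_status()
        body = resp.json()
        records.extend(body["records"])
        # 'nextRecordsUrl' is present whenever the result set spans pages.
        next_path = body.get("nextRecordsUrl")
        url = f"{INSTANCE_URL}{next_path}" if next_path else None
        params = None  # the next-page URL already encodes the query
    return records


if __name__ == "__main__":
    for opp in run_soql("SELECT Id, Name, StageName FROM Opportunity LIMIT 5"):
        print(opp["Id"], opp["Name"], opp["StageName"])
```

Following nextRecordsUrl matters because the REST query endpoint returns results in batches (typically 2,000 records per page), so a single GET rarely holds the full result set.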