
Wavicle Data Solutions

At Wavicle we provide advanced internal communications solutions through cloud-based, mobile-first technology. Our customers depend on Wavicle to align teams to a shared vision, whether addressing a select few or a large, dispersed workforce. Wavicle delivers your most important core messages with velocity and impact, inspiring teams and driving measurable performance. We meet and exceed the realities our enterprise customers face, including cloud, mobility, BYOD, and gamification, and we strive to express technological advancement as ease of use. The Wavicle Platform is a cloud-based, turnkey solution for game-based learning course authoring, delivery, and analytics. Our easy-to-use platform renders your content into multimedia game-based learning courses, delivers them to smartphones, tablets, and browsers, and returns a robust real-time analytics dashboard to the administrator. We build on the substantial body of research showing that game-based, mobile, and blended learning drives higher retention rates in knowledge transfer, creating happier learners and more productive organizations overall.

14 Job openings at Wavicle Data Solutions
Data Architect Tamil Nadu,India 11 years Not disclosed Remote Full Time

Job Title: Data Architect – AWS & GCP
Experience: 11+ Years
Location: Remote / Hybrid
Job Type: Full-time

About the Role:
We are seeking an experienced Data Architect with deep expertise in AWS and Google Cloud Platform (GCP). The ideal candidate will have a proven track record of designing and implementing scalable, secure, and high-performance data architectures across cloud platforms. You will play a key role in shaping our cloud data strategy, ensuring data quality, and enabling analytics at scale.

Key Responsibilities:
- Design and implement end-to-end data architecture and pipelines on AWS and GCP.
- Define data models, data flows, and data storage strategies aligned with business needs.
- Architect and implement data lakes, data warehouses, and real-time streaming platforms.
- Develop and maintain technical documentation, data dictionaries, and architecture blueprints.
- Work closely with business stakeholders, data scientists, and engineering teams to define data strategies and solutions.
- Ensure data quality, security, governance, and compliance across platforms.
- Lead cloud data migration and modernization projects.
- Evaluate new tools and technologies to improve data architecture performance and reliability.
- Collaborate with DevOps and Security teams to enforce best practices in deployment and access control.
- Mentor junior team members and provide architectural oversight.

Required Skills & Experience:
- 11+ years of IT experience, with at least 5 years in data architecture.
- Strong hands-on experience with AWS (Redshift, S3, Glue, RDS, Lambda, etc.) and GCP (BigQuery, Dataflow, Cloud Storage, Cloud Composer, etc.).
- Experience in data modeling, ETL/ELT pipelines, data integration, and data warehouse design.
- Expertise with SQL, Python, Spark, and other data processing frameworks.
- Strong experience with data governance, security, and compliance on cloud platforms.
- Proficiency in Infrastructure as Code (IaC) using tools such as Terraform or CloudFormation.
- Familiarity with streaming data tools (Kafka, Pub/Sub, etc.).
- Experience in real-time analytics, batch processing, and big data ecosystems.
- Understanding of CI/CD practices, DevOps tools, and agile methodologies.

Senior Data Engineer Chennai,Coimbatore,Bengaluru 6 - 10 years INR 15.0 - 25.0 Lacs P.A. Work from Office Full Time

Hi Professionals, we are looking for a Senior Data Engineer for a permanent role. Experience: 6 to 10 Years. Work Location: Hybrid – Chennai, Coimbatore, or Bangalore. Notice Period: 0 to 15 days or immediate joiner. Skills: 1. Python 2. PySpark 3. AWS or GCP. Interested candidates can send their resume to gowtham.veerasamy@wavicledata.com

Senior Data Engineer (AWS & GCP) Tamil Nadu,India 0 years Not disclosed On-site Full Time

Job Title: Senior Data Engineer (AWS & GCP)
Experience: 6+ Years

We are seeking an experienced Senior Data Engineer with expertise in AWS, and preferably GCP, to join our data engineering team. The ideal candidate will be skilled in building, optimizing, and managing data pipelines and infrastructure in cloud environments. You'll work closely with cross-functional teams including data scientists, analysts, and architects to ensure efficient and secure data operations.

Key Responsibilities:
- Design, develop, and maintain robust ETL/ELT pipelines using AWS services such as Glue, Lambda, and EMR, and/or GCP equivalents such as Dataflow, Cloud Functions, and BigQuery.
- Build scalable and efficient data storage and warehousing solutions using AWS S3, Redshift, and RDS, or GCP Cloud Storage, BigQuery, and Cloud SQL.
- Optimize data architecture for performance and cost across cloud platforms.
- Implement and manage data governance, security policies, and access controls using IAM and cloud-native tools.
- Collaborate with analytics and business intelligence teams to ensure data availability and reliability.
- Monitor and manage cloud costs, resource utilization, and performance.
- Troubleshoot and resolve issues related to data ingestion, transformation, and performance bottlenecks.

Qualifications:
- 6+ years of experience in data engineering, with at least 4 years on AWS and familiarity or hands-on experience with GCP (preferred).
- Proficiency in Python, SQL, and data modeling best practices.
- Strong experience with ETL tools, data pipelines, and cloud-native services.
- Working knowledge of data warehousing, distributed computing, and data lakes.
- Experience with Infrastructure-as-Code tools such as Terraform or CloudFormation (a plus).
- AWS Certification required; GCP Certification is a plus.
- Strong problem-solving skills and the ability to work in a fast-paced environment.

Senior Data Engineer Tamil Nadu,India 6 years Not disclosed On-site Full Time

We are seeking a highly skilled Senior Azure Databricks Data Engineer to design, develop, and optimize data solutions on Azure. The ideal candidate will have expertise in Azure Data Factory (ADF), Databricks, SQL, and Python, and experience working with SAP IS-Auto as a data source. This role involves data modeling, systematic layer modeling, and ETL/ELT pipeline development to enable efficient data processing and analytics.

Experience: 6+ years

Key Responsibilities:
- Develop & Optimize ETL Pipelines: Build robust and scalable data pipelines using ADF, Databricks, and Python for data ingestion, transformation, and loading.
- Data Modeling & Systematic Layer Modeling: Design logical, physical, and systematic data models for structured and unstructured data.
- Integrate SAP IS-Auto: Extract, transform, and load data from SAP IS-Auto into Azure-based data platforms.
- Database Management: Develop and optimize SQL queries, stored procedures, and indexing strategies to enhance performance.
- Big Data Processing: Work with Azure Databricks for distributed computing, Spark for large-scale processing, and Delta Lake for optimized storage.
- Data Quality & Governance: Implement data validation, lineage tracking, and security measures for high-quality, compliant data.
- Collaboration: Work closely with business analysts, data scientists, and DevOps teams to ensure data availability and usability.

Required Skills:
- Azure Cloud Expertise: Strong experience in Azure Data Factory (ADF), Databricks, and Azure Synapse.
- Programming: Proficiency in Python for data processing, automation, and scripting.
- SQL & Database Skills: Advanced knowledge of SQL, T-SQL, or PL/SQL for data manipulation.
- SAP IS-Auto Data Handling: Experience integrating SAP IS-Auto as a data source into data pipelines.
- Data Modeling: Hands-on experience in dimensional modeling, systematic layer modeling, and entity-relationship modeling.
- Big Data Frameworks: Strong understanding of Apache Spark, Delta Lake, and distributed computing.
- Performance Optimization: Expertise in query optimization, indexing, and performance tuning.
- Data Governance & Security: Knowledge of RBAC, encryption, and data privacy standards.

Preferred Qualifications:
- Experience with CI/CD for data pipelines using Azure DevOps.
- Knowledge of Kafka/Event Hub for real-time data processing.
- Experience with Power BI/Tableau for data visualization (not mandatory, but a plus).

Data Engineer Chennai,Coimbatore,Bengaluru 6 - 11 years INR 15.0 - 30.0 Lacs P.A. Work from Office Full Time

Hi Professionals, we are looking for a Data Engineer for a permanent role. Work Location: Hybrid – Chennai, Coimbatore, or Bangalore. Experience: 6 to 12 Years. Notice Period: 0 to 15 days or immediate joiner. Skills: 1. Python 2. PySpark 3. SQL 4. Azure Databricks 5. AWS. Interested candidates can send their resume to gowtham.veerasamy@wavicledata.com.

Data Engineer Chennai,Coimbatore,Bengaluru 6 - 11 years INR 15.0 - 25.0 Lacs P.A. Work from Office Full Time

Hi Professionals, we are looking for a Data Engineer for a permanent role. Work Location: Hybrid – Chennai, Coimbatore, or Bangalore. Experience: 6 to 11 Years. Notice Period: 0 to 15 days or immediate joiner. Skills: 1. Python 2. PySpark 3. SQL 4. Azure Databricks 5. AWS. Interested candidates can send their resume to gowtham.veerasamy@wavicledata.com.

AWS Quicksight Developer Chennai,Coimbatore,Bengaluru 6 - 11 years INR 12.0 - 20.0 Lacs P.A. Work from Office Full Time

Job Summary:
We are seeking an experienced Amazon QuickSight Developer to design and develop interactive dashboards, business intelligence (BI) reports, and data visualizations. The ideal candidate will have hands-on experience with Amazon QuickSight, a strong background in data analytics, and the ability to work closely with stakeholders to transform business requirements into actionable insights.

Key Responsibilities:
- Design, develop, and maintain BI dashboards, reports, and visualizations using Amazon QuickSight.
- Integrate QuickSight with AWS data services such as Amazon Redshift, Athena, S3, and RDS.
- Optimize dashboards and visualizations for performance, usability, and scalability.
- Gather, analyze, and translate business requirements into technical specifications for BI solutions.
- Implement security settings, row-level security, and user access controls in QuickSight.
- Collaborate with cross-functional teams including data engineers, data scientists, and business analysts.

Interested candidates can send their resume to gowtham.veerasamy@wavicledata.com.

Senior Data Engineer Chennai,Coimbatore,Bengaluru 6 - 11 years INR 15.0 - 25.0 Lacs P.A. Work from Office Full Time

Hi Professionals, we are looking for a Senior Data Engineer for a permanent role. Work Location: Hybrid – Chennai, Coimbatore, or Bangalore. Experience: 6 to 12 Years. Notice Period: 0 to 15 days or immediate joiner. Skills: 1. Python 2. PySpark 3. SQL 4. AWS 5. GCP 6. MLOps. Interested candidates can send their resume to gowtham.veerasamy@wavicledata.com.

Business Intelligence Architect Chennai,Coimbatore,Bengaluru 13 - 17 years INR 20.0 - 30.0 Lacs P.A. Work from Office Full Time

Hi Professionals, we are looking for a BI Architect for a permanent role. Work Location: Hybrid – Chennai, Coimbatore, or Bangalore. Experience: 13 to 17 Years. Notice Period: 0 to 15 days or immediate joiner.

Skills:
1. Develop and lead the BI strategy, acting as a key decision-maker for BI initiatives.
2. Expertise in BI tools such as Power BI or Tableau.
3. Evaluate, recommend, and implement BI tools based on project requirements.
4. Proficiency in cloud platforms, leveraging cloud services for BI implementations.
5. AWS or Azure certifications are a plus.
6. Excellent communication skills to effectively interact with customers and stakeholders.
7. Ability to convey complex BI concepts in a clear and understandable manner.
8. Proficiency in programming languages (e.g., Python, Java) and scripting languages for BI customization and automation.
9. Knowledge of ETL processes, data warehousing, and data modeling.
10. Collaborate with Data Engineers and Architects to ensure data integrity and optimal performance.
11. Interact with customers, understanding their BI needs and providing tailored solutions.
12. Prior experience working with customers in the US or UK is preferred.

Interested candidates can send their resume to gowtham.veerasamy@wavicledata.com

Java Developer Bengaluru 8 - 12 years INR 25.0 - 30.0 Lacs P.A. Work from Office Full Time

Hi Professionals, we are looking for a Java Developer in Bangalore for a permanent role. Work Location: Bangalore. Experience: 8 to 13 Years. Notice Period: 0 to 15 days or immediate joiner. Skills: 1. Java 2. Multi-threading 3. Message queues 4. PostgreSQL 5. Azure. Note: Candidates should be based in or around the Bangalore area and must be willing to visit the client site once or twice a week. Interested candidates can send their resume to gowtham.veerasamy@wavicledata.com

Senior GCP DevOps Engineer Tamil Nadu,India 8 years None Not disclosed On-site Full Time

Job Summary:
We are seeking a skilled DevOps Engineer with over 8 years of experience and a strong foundation in Google Cloud Platform (GCP), along with hands-on experience in AWS and Azure. The ideal candidate will be responsible for building, automating, and maintaining cloud-native and hybrid infrastructure, CI/CD pipelines, and cloud operations with a focus on scalability, security, and reliability.

Key Responsibilities:
- Build and manage scalable and secure infrastructure using Compute Engine, Cloud Run, Cloud Functions, and Cloud Load Balancing.
- Lead infrastructure automation using Terraform and Cloud Deployment Manager to ensure consistent, version-controlled infrastructure.
- Design and improve CI/CD pipelines using Cloud Build, GitLab CI/CD, Jenkins, or similar tools, and integrate them with Artifact Registry.
- Set up monitoring, logging, and alerting using the Cloud Operations Suite (Cloud Monitoring, Cloud Logging, Cloud Trace, Cloud Profiler), and integrate with Prometheus, Grafana, or Datadog.
- Configure and secure network architecture using VPC, subnets, Cloud NAT, Cloud VPN, Interconnect, firewall rules, and Private Service Connect to support hybrid and multi-cloud environments.
- Apply security best practices across GCP using IAM policies, VPC Service Controls, Cloud Armor, Binary Authorization, and Secret Manager.
- Manage and optimize cloud databases such as Cloud SQL, Cloud Spanner, Firestore, and Bigtable for performance, availability, and scalability.
- Lead production support efforts, including incident management, root cause analysis, and post-mortems, to improve reliability and reduce downtime.
- Provide strategic guidance on cost management, performance tuning, and cloud governance for large-scale environments on Google Cloud Platform.

Required Skills & Qualifications:
- Strong hands-on experience with core GCP services such as Compute Engine, VPC, IAM, Cloud Storage, and Cloud SQL.
- Experience with serverless and observability tools, including Cloud Run, Cloud Functions, Cloud Monitoring, and Cloud Logging.
- Proficient in managing GCP database services such as Cloud SQL, Cloud Spanner, Firestore, and Bigtable.
- Advanced CI/CD pipeline development using Cloud Build, Cloud Deploy, GitLab CI/CD, or Jenkins.
- Solid experience with Kubernetes, Docker, Helm, and container orchestration using GKE.
- Proficient in Infrastructure as Code (IaC) using Terraform or Cloud Deployment Manager.
- Scripting skills in Python, Bash, or PowerShell.
- Strong understanding of Git, branching strategies, and version control workflows.
- Experience deploying microservices architectures in Agile and DevSecOps environments.

Nice to Have:
- GCP certification.
- Experience with multi-cloud environments.
- Knowledge of cost optimization and budgeting in the cloud.
- Security and compliance best practices for cloud infrastructure.

Senior MLOps Engineer Tamil Nadu,India 0 years None Not disclosed On-site Full Time

We are looking for a seasoned Senior MLOps Engineer to join our Data Science team. The ideal candidate will have a strong background in Python development, machine learning operations, and cloud technologies. You will be responsible for operationalizing ML/DL models and managing the end-to-end machine learning lifecycle, from model development to deployment and monitoring, while ensuring high-quality and scalable solutions.

Mandatory Skills:

Python Programming:
- Expert in OOP concepts and testing frameworks (e.g., PyTest)
- Strong experience with ML/DL libraries (e.g., Scikit-learn, TensorFlow, PyTorch, Prophet, NumPy, Pandas)

MLOps & DevOps:
- Proven experience in executing data science projects with MLOps implementation
- CI/CD pipeline design and implementation
- Docker (mandatory)
- Experience with ML lifecycle tracking tools such as MLflow, Weights & Biases (W&B), or cloud-based ML monitoring tools
- Experience with version control (Git) and infrastructure-as-code (Terraform or CloudFormation)
- Familiarity with code linting, test coverage, and quality tools such as SonarQube

Cloud & Orchestration:
- Hands-on experience with AWS SageMaker or GCP Vertex AI
- Proficiency with orchestration tools such as Apache Airflow or Astronomer
- Strong understanding of cloud technologies (AWS or GCP)

Software Engineering:
- Experience building backend APIs using Flask, FastAPI, or Django
- Familiarity with distributed systems for model training and inference
- Experience working with feature stores
- Deep understanding of the ML/DL lifecycle, from ideation and experimentation through deployment to model sunsetting
- Understanding of software development best practices, including automated testing and CI/CD integration

Agile Practices:
- Proficient in working within a Scrum/Agile environment using tools like JIRA

Cross-Functional Collaboration:
- Ability to collaborate effectively with product managers, domain experts, and business stakeholders to align ML initiatives with business goals

Preferred Skills:
- Experience building ML solutions in any one of: sales forecasting, marketing mix modeling, or demand forecasting
- Certification in machine learning or cloud platforms (e.g., AWS or GCP)
- Strong communication and documentation skills

Senior/Lead Java Developer coimbatore,tamil nadu,india 0 years None Not disclosed Remote Contractual

Company Description
At Wavicle Data Solutions, we empower enterprises by unlocking the potential of their data, allowing them to gain actionable insights and make confident decisions. Through our Wavicle Intelligent Transformation (WIT), we transform businesses into DataAI enterprises, harnessing data, analytics, and AI for industry advantage. Serving sectors such as QSR, retail, healthcare, and manufacturing, we offer tailored solutions in data strategy, cloud migration, AI analytics, and business intelligence. As partners of AWS, Databricks, Google Cloud, and Microsoft Azure, we drive data modernization and ensure robust data governance.

Role Description
This is a contract role for a Senior/Lead Java Developer. The role is hybrid, located in Coimbatore, with some work from home allowed. The Senior/Lead Java Developer will be responsible for back-end web development, software development, and programming. They will also employ object-oriented programming techniques, collaborate with cross-functional teams, and ensure software solutions meet business needs and performance standards.

Qualifications
Strong skills in Java programming, Object-Oriented Programming (OOP), and software development.

What we're looking for:
✔ Strong expertise in Java (8 and above)
✔ Solid experience with multithreading and message queues
✔ Proficiency in PostgreSQL
✔ Exposure to Azure (cloud-based deployments, services, or solutions)
✔ Experience in a Senior or Lead role with a track record of mentoring and team leadership

Senior/Lead Java Developer coimbatore,tamil nadu 5 - 9 years INR Not disclosed On-site Full Time

As a Senior/Lead Java Developer at Wavicle Data Solutions, you will play a crucial role in back-end web development, software development, and programming. Your expertise in Java programming, Object-Oriented Programming (OOP), and software development will be essential in ensuring that software solutions meet business needs and performance standards. You will leverage your skills in Java (8 and above) to develop robust solutions, along with experience in multithreading and message queues to optimize performance. Proficiency in PostgreSQL is required, along with exposure to Azure for cloud-based deployments, services, or solutions. As a Senior/Lead Java Developer, you will be expected to take on a mentoring and team leadership role, drawing on your experience in similar positions to guide and support your colleagues. This hybrid role based in Coimbatore offers the flexibility of some work from home, allowing you to contribute effectively to the team while balancing personal commitments. Join us at Wavicle Data Solutions and be part of our mission to transform businesses into DataAI enterprises, leveraging data, analytics, and AI for industry advantage.
