We are seeking a highly skilled Technical Architect, Product Engineering, with deep expertise in designing scalable, intelligent systems powered by modern software and deep learning architectures. This hybrid role demands strong system design skills across full-stack software engineering, cloud-native infrastructure, and AI/ML, especially in building, deploying, and optimizing deep learning models for real-world applications. You will play a key role in shaping the architectural direction of an AI product platform, integrating classical software design with cutting-edge machine learning and deep learning innovations.

Requirements

Key Responsibilities:
- Design end-to-end software solutions covering frontend, backend, middleware, and integration layers.
- Define and recommend technology stacks, frameworks, platforms, and tools aligned with project and organizational needs.
- Design and implement microservices architectures with service discovery, fault tolerance, load balancing, and inter-service communication patterns.
- Architect RESTful APIs, GraphQL endpoints, and event-driven interfaces for seamless internal and external integrations.
- Leverage cloud platforms (AWS, Azure, GCP) to design scalable and secure infrastructure using compute, storage, networking, and IAM services.
- Utilize Docker, Kubernetes, and container orchestration for scalable, continuous-deployment-ready systems.
- Design robust data storage solutions:
  - RDBMS: MySQL, PostgreSQL, Oracle
  - NoSQL: MongoDB, Cassandra, DynamoDB
  - Caching: Redis, Memcached
  - Search: Elasticsearch, Solr
- Implement asynchronous communication using Kafka, RabbitMQ, or ActiveMQ.
- Work with DevOps to build CI/CD pipelines and infrastructure-as-code (Terraform, CloudFormation, Ansible).
- Conduct performance tuning, caching strategies, and load balancing to meet SLA targets.
- Embed security best practices: secure coding, OAuth2, JWT, encryption, vulnerability scanning, OWASP compliance, GDPR readiness.
- Implement monitoring, logging, and alerting using Prometheus, Grafana, the ELK Stack, and Splunk.
- Perform regular code and architecture reviews to maintain standards and code quality.
- Create and maintain architecture diagrams, design documentation, and technical specifications.
- Lead proofs of concept (PoCs) and drive adoption of emerging technologies and frameworks.

AI/ML & Deep Learning Responsibilities:
- Architect and implement AI/ML pipelines and deploy production-grade models using TensorFlow, PyTorch, Keras, ONNX, or JAX.
- Deploy models using platforms such as SageMaker, Vertex AI, Kubeflow, or Triton Inference Server.
- Design deep learning systems for NLP, computer vision, forecasting, or speech processing using modern architectures (Transformers, CNNs, LSTMs, GANs).
- Integrate MLOps practices: model versioning, continuous training, drift detection, and automated deployment.
- Ensure model explainability (e.g., LIME, SHAP), performance optimization, and compliance with ethical AI standards.
Required Skills:
- Programming: Python, Java, C#, Go, JavaScript/TypeScript
- Architecture patterns: Microservices, Event-driven, SOA, Layered
- Cloud: Deep experience with AWS, Azure, or GCP (including serverless and managed services)
- Containers & orchestration: Docker, Kubernetes, Helm, Istio, Linkerd
- Data: RDBMS (MySQL, Oracle, PostgreSQL), NoSQL (MongoDB, Cassandra), Redis, Elasticsearch
- Messaging/Event systems: Kafka, RabbitMQ, ActiveMQ
- API design & security: REST, GraphQL, Apigee, Kong, OAuth2, JWT, SAML
- DevOps: CI/CD tools (Jenkins, GitLab CI, CircleCI), IaC (Terraform, CloudFormation)
- Observability: Prometheus, Grafana, ELK Stack, Splunk
- AI/ML: TensorFlow, PyTorch, MLflow, SageMaker, Kubeflow, ONNX, Triton
Greetings from Cloud Destinations! We hope this mail finds you well. We are reaching out regarding an opportunity based in Chennai. If you are keen on exploring this opportunity, kindly review the detailed JD below and revert with your updated CV for the further interview process.

Company Overview:
Cloud Destinations is a Silicon Valley-headquartered, well-established technology organization focused on digital transformation, enterprise application development, infrastructure projects, and professional services related to large-scale cloud migrations, multi-cloud operations (AWS, Azure, Google Cloud, VMware, Rackspace, etc.), DevOps, RNOC, security operations, datacenter, UC collaboration, and IoT. We pride ourselves on deep domain skills in retail, healthcare, financial, travel, and high technology. We have a strong technical and leadership team, with offices located globally. We believe our successes come from our teamwork and mutual respect for each other's talents and unique perspectives.
Company website: https://clouddestinations.com/
LinkedIn Profile: https://www.linkedin.com/company/cloud-destinations

Role: Full Stack Developer
Location: Chennai (Work From Office, 5 days)
Experience: 4+ Years
Notice Period: Immediate to 30 Days

Requirements:
- Strong programming knowledge in JavaScript (ES6+), TypeScript, HTML5, and CSS3.
- Understanding of HTML, CSS, and responsive web design.
- Familiarity with OOP concepts and design principles.
- Knowledge of SQL fundamentals for relational and non-relational databases.
- Experience with Node.js and React.js.
- Awareness of AI-assisted development tools (e.g., GitHub Copilot, ChatGPT).
- Ability to learn quickly and adapt to new technologies.
Company Description
Cloud Destinations, headquartered in Silicon Valley, specializes in digital transformation and enterprise application development for the Healthcare, Retail, and Technology sectors. We excel in cloud migrations, multi-cloud operations, DevOps, security operations, and IoT. Our globally distributed team includes skilled technical experts and effective leaders who thrive on teamwork and respect for diverse viewpoints. For more details, visit our website: www.clouddestinations.com.

Role Description
This is a full-time, on-site role for a Data Engineering Manager located in Coimbatore. We are looking for a highly skilled Data Engineering Manager to lead and grow our data engineering team, driving the design, development, and maintenance of scalable data pipelines and architecture. In this role, you will work closely with cross-functional teams, including data science, analytics, software engineering, and business stakeholders, to ensure the delivery of high-quality data solutions that support business objectives. If you're passionate about data, cloud technologies, and leading high-performing teams, we'd love to hear from you!

Responsibilities:
- Team Leadership: Lead, mentor, and manage a team of data engineers, ensuring effective collaboration, professional growth, and high-quality output.
- Data Architecture: Design, build, and maintain robust data pipelines and architectures that efficiently support data-driven processes and systems.
- Data Strategy: Develop and execute data engineering strategies in alignment with business goals and evolving data needs.
- Data Integration: Oversee the integration of data from various sources, including third-party APIs, databases, and streaming platforms.
- Optimization & Maintenance: Ensure high performance, scalability, and reliability of data processing systems, continuously optimizing workflows and improving data quality.
- Collaboration: Work closely with data scientists, analysts, and business stakeholders to understand their data needs and deliver solutions accordingly.
- Cloud Technologies: Oversee the adoption and integration of cloud-based solutions (AWS, Azure, Google Cloud) to manage and process large-scale datasets efficiently.
- Governance & Security: Implement and maintain best practices for data governance, data privacy, and security in line with industry standards and regulations.
- Stakeholder Communication: Regularly communicate progress, challenges, and outcomes to stakeholders across the organization, providing insights and solutions as needed.
- Project Management: Plan, prioritize, and oversee data engineering projects, ensuring timely delivery of solutions that meet both business and technical requirements.

Requirements:
- Technical Expertise: Strong proficiency in programming languages such as Python, Java, or Scala. Experience with data processing frameworks like Apache Spark, Kafka, Hadoop, or similar technologies. Hands-on experience with ETL tools and techniques, including workflow orchestration (e.g., Airflow, Luigi). Strong SQL skills and familiarity with NoSQL databases (e.g., MongoDB, Cassandra). Proven experience with cloud platforms (AWS, Google Cloud, Azure) and related data storage solutions (S3, Redshift, BigQuery, etc.).
- Leadership: Proven experience leading and mentoring data engineering teams, fostering a culture of continuous learning and collaboration.
- Data Architecture: Expertise in designing scalable and efficient data architectures, pipelines, and workflows.
- Problem Solving: Strong analytical and problem-solving skills, with the ability to translate business requirements into technical solutions.
- Communication: Excellent communication skills, capable of collaborating with both technical and non-technical stakeholders.
- Project Management: Experience managing multiple projects and deadlines in a fast-paced environment.
- Familiarity with machine learning pipelines and big data platforms.
- Certification in cloud platforms (e.g., AWS Certified Data Analytics, Google Cloud Professional Data Engineer).
- Knowledge of data governance, compliance, and security best practices (e.g., GDPR, CCPA).

Interested and suitable applicants, kindly share your resume with kavyaga@clouddestinations.com.