We are looking for an experienced Data Architect – Analytics to design and deliver scalable, cloud-based data solutions. You will lead data architecture, build high-performance pipelines, and drive innovation in big data engineering across domains. Strong expertise in distributed systems, cloud platforms, and advanced analytics is essential to transform data into actionable insights.

Responsibilities:
- Act as the primary customer-facing architect for designing data solutions.
- Build solutions and systems to manage data and analytics workloads.
- Improve the reliability, quality, and time-to-market of our suite of software solutions.
- Measure and optimize system performance, with an eye toward pushing our capabilities forward, getting ahead of customer needs, and innovating for continual improvement.
- Provide primary architectural and engineering support for multiple large-scale distributed software applications.

Requirements:
- Bachelor's degree (or equivalent) in computer science or a related discipline.
- Ability to program (structured and OOP) in one or more high-level languages, such as Python, Java, Scala, or .NET.
- 15 years of experience in data engineering. Formative years may be in development projects involving data warehouses, data marts, ETL, data modelling, data-rich middleware, and distributed computing solutions.
- The last 5 years must be in big data engineering projects involving Hadoop, HBase, Kafka, Hive, Spark, Spark Streaming, in-memory database systems, column-oriented database systems, and document databases (NoSQL and SQL).
- Extensive programming experience in Python, Scala, and Java.
- Good hands-on experience with open-source kernels and a solid understanding of distributed compute, distributed storage, serverless, and highly scalable architectures.
- Should have managed an industry-standard cloud program building data pipelines, data migrations, or analytics pipelines.
- Exposure to Databricks, Azure Data Factory, ADLS, Amazon Redshift, Amazon S3, AWS Glue, Snowflake, and Apache Airflow is an added advantage.
- Should have worked as a software solution provider at the level of solution architect or technology evangelist in a medium-to-large ISV.
- At least 3 years of direct experience in the design and delivery of big data solutions in the BFSI vertical.
- Experience in the OTT/BFSI/Retail domain and familiarity with OTT data constructs would be a big plus.
- Experience with distributed storage technologies such as NFS, HDFS, Ceph, and Amazon S3, as well as messaging and dynamic resource management frameworks (Apache Kafka, Kubernetes, YARN).
- Proactive approach to identifying problems, performance bottlenecks, and areas for improvement.
- Expert understanding of data systems and platforms, especially data pipelines, data lakes, data models, and warehousing.
- Exposure to Snowflake and Databricks is a must.
- Hands-on experience with Amazon Web Services.

Quality Compliance: Compliance with Quality and Information Security requirements is critical to ensuring the integrity, confidentiality, and availability of data and the consistent delivery of high-quality services, and is an important aspect of hiring for this position.
As a skilled Software Engineer with expertise in Golang, Kubernetes-based microservices, and cloud platforms, you will play a crucial role in designing, building, and scaling production-grade systems. Your responsibilities will include writing clean and efficient code, troubleshooting complex issues, and delivering high-quality solutions in fast-paced environments.

Key Responsibilities:
- Build stable and scalable systems deployed in production.
- Write clean code following Golang best practices.
- Troubleshoot production issues by reviewing source code, logs, operational metrics, stack traces, etc., to identify and resolve specific problems.
- Demonstrate data-driven decision making and innovation to solve challenging issues.
- Consistently deliver results with high quality in a fast-paced setting.
- Collaborate with peers, share knowledge, and contribute to technical decisions.
- Write complex queries and scripts, analyze datasets, and efficiently pinpoint issues.
- Effectively communicate with global partners and stakeholders.

Qualifications Required:
- Bachelor's degree with 4-8 years of experience as a software developer with proficiency in Golang.
- Experience in building and monitoring large, global-scale platform services in non-prod and prod environments.
- Ability to collaborate effectively with remote peers across different geographies and time zones.
- Excellent written and verbal communication skills with a focus on technical documentation.
- Experience with on-call rotation, incident response, and playbooks.
- Strong understanding of CS fundamentals and technical knowledge of Kubernetes-based microservice architectures, caching solutions, messaging services, DB services, API gateways, service mesh, and infrastructure-as-code technologies/processes.
- Experience with at least one cloud provider (AWS, GCP, Azure, or other).
- Ability to implement and interpret metrics and logging using tools like Prometheus, CloudWatch, Kibana, and PagerDuty (see the illustrative sketch below).

Please note that Quality and Information Security compliance are crucial aspects of this position, ensuring data integrity, confidentiality, availability, and the consistent delivery of high-quality services.
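As a rough illustration of the Golang and observability skills listed above, the sketch below shows a minimal Kubernetes-style HTTP service in Go that exposes a health probe and Prometheus metrics. It is a minimal sketch under assumed details: the metric name, the /healthz endpoint, and port 8080 are illustrative choices, not requirements taken from this posting.

```go
// Minimal, hypothetical sketch of an instrumented Go microservice.
// Assumes the Prometheus Go client (github.com/prometheus/client_golang).
package main

import (
	"log"
	"net/http"
	"time"

	"github.com/prometheus/client_golang/prometheus"
	"github.com/prometheus/client_golang/prometheus/promauto"
	"github.com/prometheus/client_golang/prometheus/promhttp"
)

// requestDuration records per-path request latency; Prometheus scrapes it via /metrics.
var requestDuration = promauto.NewHistogramVec(
	prometheus.HistogramOpts{
		Name: "http_request_duration_seconds",
		Help: "Latency of HTTP requests by path.",
	},
	[]string{"path"},
)

// instrument wraps a handler and observes how long each request takes.
func instrument(path string, next http.HandlerFunc) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		start := time.Now()
		next(w, r)
		requestDuration.WithLabelValues(path).Observe(time.Since(start).Seconds())
	}
}

func main() {
	mux := http.NewServeMux()

	// Liveness endpoint of the kind typically wired to Kubernetes probes.
	mux.HandleFunc("/healthz", instrument("/healthz", func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK)
		_, _ = w.Write([]byte("ok"))
	}))

	// Prometheus scrape endpoint.
	mux.Handle("/metrics", promhttp.Handler())

	log.Println("listening on :8080")
	log.Fatal(http.ListenAndServe(":8080", mux))
}
```

In a real deployment, the same handler wrapper would typically be extended with labels for status code and method, and the service packaged with readiness probes and infrastructure-as-code manifests; those details are omitted here for brevity.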