10.0 - 12.0 years
0 Lacs
bengaluru, karnataka, india
On-site
Who are we: Turbostart is not just a startup fund and accelerator; we are a catalyst for builders and a powerhouse of innovation. Our mission is to propel early-stage startups into the future by providing unparalleled support in technology, marketing, strategy, and beyond. We're in the business of building tomorrow's leaders - today. After 5 years and 5 funds we have supported over 50 startups, spanning sectors, stages, and geographies - and this is just the beginning! Turbostart spans India, the Middle East, the US, and Singapore, giving you the opportunity to gain exposure and see the impact of your work ripple across regions. Turbostart has also launched 5 Centers of Excellence across Tech, Marketing, Sales, UI/UX, and Investment Banking to support the growth of our startup network. Know more about us at https://turbostart.co/

Our portfolio company, Climaty AI, is reshaping digital advertising with sustainability and innovation at the core. Its cutting-edge agentic AI tool helps brands and agencies optimize campaigns across social, search, and digital platforms, powered by real-time data and automation. Combined with premium programmatic inventory, it offers advertisers scalable, efficient, and impactful ad solutions. Know more about them at https://climaty.ai/

What we are looking for:
Role: Head of Engineering
Experience Required: 10+ years
Location: Bangalore

What you'll do:
Technology Leadership: Own the technical vision and architecture of AI-driven marketing automation platforms. Build scalable, secure, and high-performance systems integrated with the APIs of platforms like Meta (Facebook/Instagram), Google Ads, and YouTube. Evaluate and implement ML/AI models for optimization, targeting, and performance prediction.
Product Development & Innovation: Drive product ideation with the CEO and Product teams, especially in campaign automation, predictive optimization, and real-time analytics.
Prioritize features, experiments, and iterations to deliver impact at speed.
Team Leadership: Recruit, mentor, and manage cross-functional teams: engineers, AI/ML experts, DevOps, and QA. Instill a strong culture of execution, technical excellence, and continuous learning.
Platform & Data Integration: Lead integration with ad platforms (Google Ads API, Meta Ads API, etc.) and ensure compliance with their policies and data protocols. Design systems to collect, clean, and process large-scale media and performance data in real time.
Security, Scalability, and Infrastructure: Ensure robust cloud architecture, infrastructure reliability, and data security. Own infrastructure cost optimization and ensure readiness for scale.
Stakeholder Collaboration: Collaborate with founders, investors, clients, and sales teams to align tech priorities with business goals. Translate complex technical concepts for non-technical stakeholders.

Our ideal candidate:
Proven experience: 10+ years in software engineering, ideally with 3+ years in a leadership role at a product or SaaS company.
Strong understanding of AdTech/MarTech platforms; experience building tools for digital advertising is a must.
Deep expertise in backend systems, cloud (AWS/GCP), API integration (Meta/Google), and modern tech stacks.
Solid experience with AI/ML applications in digital marketing: campaign automation, media mix modeling, audience segmentation, etc.
Startup mindset: comfortable working in a fast-paced, iterative, and high-growth environment.
Strong leadership, communication, and strategic thinking abilities.

Why join Climaty:
Be part of a fast-growing startup at the intersection of AI and advertising. Own your market and make a direct impact on business outcomes. Collaborate with industry veterans and innovative thinkers.
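The campaign-optimization work described above can be illustrated with a toy budget-reallocation heuristic. This is a minimal sketch, not Climaty's actual algorithm; the channel names, spend figures, and proportional-to-ROAS rule are all invented for illustration.

```python
# Hypothetical sketch: reallocate a fixed budget across ad channels in
# proportion to each channel's return on ad spend (ROAS = revenue / spend).
# Channel names and figures are illustrative only.

def reallocate_budget(campaigns, total_budget):
    """Split total_budget across campaigns proportionally to ROAS."""
    roas = {name: c["revenue"] / c["spend"] for name, c in campaigns.items()}
    total_roas = sum(roas.values())
    return {name: round(total_budget * r / total_roas, 2)
            for name, r in roas.items()}

campaigns = {
    "meta_feed":     {"spend": 1000.0, "revenue": 4000.0},  # ROAS 4.0
    "google_search": {"spend": 1000.0, "revenue": 6000.0},  # ROAS 6.0
}
plan = reallocate_budget(campaigns, total_budget=5000.0)
print(plan)  # meta_feed gets 2000.0 (40%), google_search gets 3000.0 (60%)
```

Real systems would add constraints (minimum spends, pacing, attribution windows), but the proportional split shows the basic shape of performance-driven allocation.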
Posted 1 day ago
5.0 - 7.0 years
0 Lacs
mumbai, maharashtra, india
On-site
About Us: Fluent Health is a dynamic healthcare startup revolutionizing how you manage your healthcare and that of your family. The company provides customers with high-quality, personalized options, credible information through trustworthy content, and absolute privacy. To support our growth journey, we are seeking a highly motivated and experienced Senior Data Engineer to play a pivotal role in our future success. Company website: https://fluentinhealth.com/

Job Description: We're looking for a Senior Data Engineer to lead the design, implementation, and optimization of our analytical and real-time data platform. In this hybrid role, you'll combine hands-on data engineering with high-level architectural thinking to build scalable data infrastructure, with ClickHouse as the cornerstone of our analytics and data warehousing strategy. You'll work closely with engineering, product, analytics, and compliance teams to establish data best practices, ensure data governance, and unlock insights for internal teams and future data monetization initiatives.

Responsibilities:
Architecture & Strategy: Own and evolve the target data architecture, with a focus on ClickHouse for large-scale analytical and real-time querying workloads. Define and maintain a scalable and secure data platform architecture that supports use cases including real-time analytics, reporting, and ML applications. Set data governance and modeling standards, and ensure data lineage, integrity, and security practices are followed. Evaluate and integrate complementary technologies into the data stack (e.g., message queues, data lakes, orchestration frameworks).
Data Engineering: Design, develop, and maintain robust ETL/ELT pipelines to ingest and transform data from diverse sources into our data warehouse. Optimize ClickHouse schema and query performance for real-time and historical analytics workloads.
Build data APIs and interfaces for product and analytics teams to interact with the data platform. Implement monitoring and observability tools to ensure pipeline reliability and data quality.
Collaboration & Leadership: Collaborate with data consumers (e.g., product managers, data analysts, ML engineers) to understand data needs and translate them into scalable solutions. Work with security and compliance teams to implement data privacy, classification, retention, and access control policies. Mentor junior data engineers and contribute to hiring efforts as we scale the team.

Qualifications:
5-7 years of experience in Data Engineering, with at least 2-4 years in a senior or architectural role.
Expert-level proficiency in ClickHouse or similar columnar databases (e.g., BigQuery, Druid, Redshift).
Proven experience designing and operating scalable data warehouse and data lake architectures.
Deep understanding of data modeling, partitioning, indexing, and query optimization techniques.
Strong experience building ETL/ELT pipelines using tools like Airflow, dbt, or custom frameworks.
Familiarity with stream processing and event-driven architecture (e.g., Kafka, Pub/Sub).
Proficiency with SQL and at least one programming language such as Python, Scala, or Java.
Experience with data governance, compliance frameworks (e.g., HIPAA, GDPR), and data cataloging tools.
Knowledge of real-time analytics use cases and streaming architectures.
Familiarity with machine learning pipelines and integrating data platforms with ML workflows.
Experience working in regulated or high-security domains like healthtech, fintech, or enterprise SaaS.
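One pattern behind the ClickHouse work described above: a ReplacingMergeTree table keeps only the latest row per key at merge time, and ELT transforms often mimic that deduplication before or after load. A minimal sketch in plain Python, with invented field names rather than any real Fluent Health schema:

```python
# ELT transform sketch: keep only the latest version of each record per
# primary key, mirroring what a ClickHouse ReplacingMergeTree table does
# during background merges. Field names here are illustrative.

def latest_per_key(rows, key="id", version="updated_at"):
    """Deduplicate rows, keeping the row with the highest version per key."""
    best = {}
    for row in rows:
        k = row[key]
        if k not in best or row[version] > best[k][version]:
            best[k] = row
    return sorted(best.values(), key=lambda r: r[key])

raw = [
    {"id": 1, "updated_at": "2024-01-01", "status": "pending"},
    {"id": 1, "updated_at": "2024-01-03", "status": "complete"},  # newer
    {"id": 2, "updated_at": "2024-01-02", "status": "pending"},
]
print(latest_per_key(raw))  # one row per id; id 1 shows status "complete"
```

In production this logic would live in the warehouse itself (ReplacingMergeTree plus `FINAL`/`argMax` queries) rather than in application code; the sketch only shows the semantics.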
Posted 6 days ago
7.0 - 9.0 years
0 Lacs
bengaluru, karnataka, india
Remote
About Sibros Technologies
Who We Are: Sibros is accelerating the future of SDV excellence with its Deep Connected Platform, which orchestrates full vehicle software update management, vehicle analytics, and remote commands in one integrated system. Adaptable to any vehicle architecture, Sibros' platform meets stringent safety, security, and compliance standards, propelling OEMs to innovate new connected vehicle use cases across fleet management, predictive maintenance, data monetization, and beyond. Learn more at www.sibros.tech.
Our Mission: Our mission is to help our customers get the most value out of their connected devices. Follow us on LinkedIn | YouTube | Instagram.

About The Role
Job Title: Senior Software Engineer
Experience: 6 - 9 years

At Sibros, we are building the foundational data infrastructure that powers the software-defined future of mobility. One of our most impactful products, Deep Logger, enables rich, scalable, and intelligent data collection from connected vehicles, unlocking insights that were previously inaccessible. Our platform ingests high-frequency telemetry, diagnostic signals, user behavior, and system health data from vehicles across the globe. We transform this into actionable intelligence through real-time analytics, geofence-driven alerting, and predictive modeling for use cases like trip intelligence, fault detection, battery health, and driver safety. We're looking for a Senior Software Engineer to help scale the backend systems that support Deep Logger's data pipeline: from ingestion and streaming analytics to long-term storage and ML model integration. You'll play a key role in designing high-throughput, low-latency systems that operate reliably in production, even as data volumes scale to billions of events per day.
In this role, you'll collaborate across firmware, data science, and product teams to deliver solutions that are not only technically robust but also critical to safety, compliance, and business intelligence for OEMs and fleet operators. This is a unique opportunity to shape the real-time intelligence layer of connected vehicles, working at the intersection of event-driven systems, cloud-native infrastructure, and automotive-grade reliability.

What You'll Do:
Lead the Design and Evolution of Scalable Data Systems: Architect end-to-end real-time and batch data processing pipelines that power mission-critical applications such as trip intelligence, predictive diagnostics, and geofence-based alerts. Drive system-level design decisions and guide the team through technology tradeoffs.
Mentor and Uplift the Engineering Team: Act as a technical mentor to junior and mid-level engineers. Conduct design reviews, help grow data engineering best practices, and champion engineering excellence across the team.
Partner Across the Stack and the Org: Collaborate cross-functionally with firmware, frontend, product, and data science teams to align on roadmap goals. Translate ambiguous business requirements into scalable, fault-tolerant data systems with high availability and performance guarantees.
Drive Innovation and Product Impact: Shape the technical vision for real-time and near-real-time data applications. Identify and introduce cutting-edge open-source or cloud-native tools that improve system reliability, observability, and cost efficiency.
Operationalize Systems at Scale: Own the reliability, scalability, and performance of the pipelines you and the team build. Lead incident postmortems, drive long-term stability improvements, and establish SLAs/SLOs that balance customer value with engineering complexity.
Contribute to Strategic Technical Direction: Provide thought leadership on evolving architectural patterns, such as transitioning from streaming-first to hybrid batch-stream systems for cost and scale efficiency. Proactively identify bottlenecks, tech debt, and scalability risks.

What You Should Know:
7+ years of experience in software engineering with a strong emphasis on building and scaling distributed systems in production environments.
Deep understanding of computer science fundamentals, including data structures, algorithms, concurrency, and distributed computing principles.
Proven expertise in designing, building, and maintaining large-scale, low-latency data systems for real-time and batch processing.
Hands-on experience with event-driven architectures and messaging systems like Apache Kafka, Pub/Sub, or equivalent technologies.
Strong proficiency in stream processing frameworks such as Apache Beam, Flink, or Google Cloud Dataflow, with a deep appreciation for time and windowing semantics, backpressure, and checkpointing.
Demonstrated ability to write production-grade code in Go or Java, following clean architecture principles and best practices in software design.
Solid experience with cloud-native infrastructure, including Kubernetes, serverless compute (e.g., AWS Lambda, GCP Cloud Functions), and containerized deployments using CI/CD pipelines.
Proficiency with cloud platforms, especially Google Cloud Platform (GCP) or Amazon Web Services (AWS), and services like BigQuery, S3/GCS, IAM, and managed Kubernetes (GKE/EKS).
Familiarity with observability stacks (e.g., Prometheus, Grafana, OpenTelemetry) and an understanding of operational excellence in production environments.
Ability to balance pragmatism with technical rigor, navigating ambiguity to design scalable and cost-effective solutions.
Passionate about building platforms that empower internal teams and deliver meaningful insights to customers, especially within the automotive, mobility, or IoT domains.
Strong communication and collaboration skills, with experience working closely across product, firmware, and analytics teams.

Preferred Qualifications:
Experience architecting and building systems for large-scale IoT or telemetry-driven applications, including ingestion, enrichment, storage, and real-time analytics.
Deep expertise in both streaming and batch data processing paradigms, using tools such as Apache Kafka, Apache Flink, Apache Beam, or Google Cloud Dataflow.
Hands-on experience with cloud-native architectures on platforms like Google Cloud Platform (GCP), AWS, or Azure, leveraging services such as Pub/Sub, BigQuery, Cloud Functions, Kinesis, etc.
Experience working with high-performance time-series or analytical databases such as ClickHouse, Apache Druid, or InfluxDB, optimized for millisecond-level insights at scale.
Proven ability to design resilient, fault-tolerant pipelines that ensure data quality, integrity, and observability in high-throughput environments.
Familiarity with schema evolution, data contracts, and streaming-first data architecture patterns (e.g., Change Data Capture, event sourcing).
Experience working with geospatial data, telemetry, or real-time alerting systems is a strong plus.
Contributions to open-source projects in the data or infrastructure ecosystem, or active participation in relevant communities, are valued.

What We Offer:
Competitive compensation package and benefits.
A dynamic work environment with a flat hierarchy and the opportunity for rapid career advancement.
Collaborate with a dynamic team that's passionate about solving complex problems in the automotive IoT space.
Access to continuous learning and development opportunities.
Flexible working hours to accommodate different time zones.
Comprehensive benefits package including health insurance and wellness programs. A culture that values innovation and promotes work-life balance.
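The "time and windowing semantics" this role emphasizes can be shown with a tumbling (fixed) window in plain Python; frameworks like Beam and Flink formalize the same grouping with watermarks and triggers. The events, window size, and vehicle IDs below are invented for illustration.

```python
# Tumbling-window aggregation sketch: bucket timestamped vehicle events
# into fixed 60-second windows and count events per (window, vehicle).
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """events: iterable of (epoch_seconds, vehicle_id) pairs."""
    counts = defaultdict(int)
    for ts, vehicle in events:
        window_start = ts - (ts % window_secs)  # floor to window boundary
        counts[(window_start, vehicle)] += 1
    return dict(counts)

events = [(5, "v1"), (59, "v1"), (61, "v1"), (30, "v2")]
print(tumbling_window_counts(events))
# v1 has 2 events in window [0, 60) and 1 in window [60, 120); v2 has 1 in [0, 60)
```

Real stream processors must additionally decide when a window is complete despite late or out-of-order events, which is exactly what watermarks and checkpointing address.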
Posted 1 week ago
4.0 - 7.0 years
4 - 6 Lacs
navi mumbai, maharashtra, india
On-site
Responsibilities:
Work experience with IoT systems handling huge traffic, big data, and real-time analytics.
Basic understanding of a distributed messaging queue (Kafka, MQTT, RabbitMQ, etc.).
Experience with a web/application server (Apache, JBoss, Tomcat, WebLogic, etc.).
Good to have: knowledge of NoSQL stores (MongoDB, HBase, etc.) and understanding of Spark/Hadoop environments.
SQL/Oracle database and XML file operations.
Knowledge of JMeter/LoadRunner for performance testing.
Experience in REST API testing.
Cloud computing fundamentals (i.e., types of services, pros/cons, etc.).
Knowledge of automation strategies and frameworks.
Linux commands and Linux-based OS debugging skills.
Bug life cycle/defect logging (hands-on with defect tracking tools such as HP ALM, IBM Rational ClearQuest, Jira, etc.).
Experience in application testing on various platforms (Android, iOS) and device types (mobile / STB / notebook / tablet).
Knowledge of the software development lifecycle (Agile, merge requests, etc.) and Agile software development methodologies.

Good to Have:
Demonstrated knowledge of 3G/LTE/NB-IoT technology, with knowledge of NAS/RRC layer log analysis w.r.t. call flow from the UE side.
Hands-on device-side testing, debugging, and issue analysis.
Basic knowledge of UE testing, with debugging skills for device troubleshooting and log analysis.
Ability to understand the SIM-modem interface and message exchange.
Understanding of NB-IoT/IoT (3GPP Release 13 and onwards).
Database/ETL/backend testing is an advantage.
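The JMeter/LoadRunner-style performance testing listed above boils down to firing many requests, checking each response, and reporting latency statistics. A tiny sketch of that harness; `fake_endpoint` is an invented stand-in for a real HTTP call so the example is self-contained.

```python
# Minimal load-test harness sketch: call an endpoint N times, assert each
# response is functionally correct, and report latency percentiles.
import time
import statistics

def fake_endpoint():
    time.sleep(0.001)          # stand-in for network + server time
    return {"status": 200}

def run_load_test(endpoint, n=50):
    latencies = []
    for _ in range(n):
        start = time.perf_counter()
        resp = endpoint()
        latencies.append(time.perf_counter() - start)
        assert resp["status"] == 200   # per-request functional check
    return {"p50": statistics.median(latencies), "max": max(latencies)}

report = run_load_test(fake_endpoint)
print(report)
```

A real tool adds concurrency, ramp-up, think time, and richer percentiles (p95/p99), but the measure-and-assert loop is the core idea.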
Posted 2 weeks ago
0.0 years
0 Lacs
india
On-site
Microsoft is a company where passionate innovators come to collaborate, envision what can be, and take their careers further. This is a world of more possibilities, more innovation, more openness, and sky's-the-limit thinking in a cloud-enabled world. Microsoft's Azure Data engineering team is leading the transformation of analytics in the world of data with products like databases, data integration, big data analytics, messaging & real-time analytics, and business intelligence. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture. We are looking to hire Software Development Engineers to join our team.

Within Azure Data, the databases team builds and maintains Microsoft's operational database systems. We store and manage data in a structured way to enable a multitude of applications across various industries. We are the Azure SQL Database Storage/IO resource governance team, part of Microsoft Azure's C+AI organization. Our team is responsible for ensuring highly efficient operations when interacting with the storage that persists the data. In addition to maintaining an efficient Quality of Service (QoS) with respect to latency and bandwidth, the team is responsible for ensuring each customer gets the right kind of storage medium based on their needs and historical usage patterns. Based on customer needs, the team places databases on the right tier of storage, moves them to a different storage medium when desired with minimal downtime, and analyzes and rectifies customer workloads for efficient storage utilization. Azure SQL has more than 100 million databases, and for efficient operation it has an even higher number of storage resources allocated. The team is responsible for the efficient functioning of storage on all parameters and for all services.
At your disposal will be SQL's state-of-the-art management system, a sophisticated engine, and terabytes of telemetry to make informed decisions. With this unprecedented growth comes added complexity in our database storage ecosystem, at scales we never imagined before. As a Senior Software Engineer, this is your opportunity to take on the challenge of building and improving database storage infrastructure with cutting-edge technologies. This team is on a mission to define and deliver world-class storage infrastructure for relational databases, positioning infrastructure as a key differentiator for customer value and business margins. We do not just value differences or different perspectives; we seek them out and invite them in so we can tap into the collective power of everyone in the company. As a result, our customers are better served. Microsoft's mission is to empower every person and every organization on the planet to achieve more. As employees, we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.
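The tiering decision described above, placing a database on a storage medium based on its historical IO usage, can be sketched as a simple heuristic. The thresholds, metrics, and tier names below are invented for illustration and are not Azure's actual policy.

```python
# Illustrative sketch of usage-based storage tiering: map a database's
# historical IO profile to a storage tier. All thresholds and tier names
# are hypothetical, not Azure SQL's real governance rules.

def choose_tier(avg_iops, p99_latency_need_ms):
    """Pick a tier from average IOPS and the latency the workload needs."""
    if avg_iops > 5000 or p99_latency_need_ms < 2:
        return "premium_ssd"       # hot, latency-sensitive workloads
    if avg_iops > 500:
        return "standard_ssd"      # moderate, steady workloads
    return "standard_hdd"          # cold or bursty-but-light workloads

tier = choose_tier(avg_iops=8000, p99_latency_need_ms=5)
print(tier)  # premium_ssd
```

A production system would base this on telemetry over time (and hysteresis to avoid flapping between tiers), but the decision shape is the same.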
Posted 2 weeks ago
3.0 - 5.0 years
0 Lacs
india
On-site
DESCRIPTION
Are you interested in building high-performance, globally scalable financial systems that support Amazon's current and future growth? Are you seeking an environment where you can drive innovation, leveraging the scalability and innovation of Amazon's AWS cloud services? Do you have a passion for ensuring a positive customer experience? This is the job for you.

Amazon's Finance Technology organization (FinTech) is responsible for building and maintaining the critical finance technology applications that enable new business growth, ensure compliance with financial and tax reporting obligations, and provide deep analysis of Amazon's financial data. This function is of paramount importance to the company, as it underpins Amazon's ability to effectively manage its finances and drive continued expansion.

At the heart of FinTech's mission is the General Ledger team, which builds and operates the technologies to account for and post millions of financial transactions daily to support accurate internal and external financial reporting. This team processes on average 371MM+ transactions per month, servicing the accounting needs of Finance, Accounting, and Tax teams worldwide. The work of the General Ledger team is absolutely essential to meeting Amazon's critical close timelines and maintaining the integrity of the company's financial data.

The Amazon Financial Technology team is looking for a results-oriented, driven software development engineer who can help us create the next generation of distributed, scalable financial systems. Our ideal candidate thrives in a fast-paced environment and enjoys the challenge of highly complex business contexts that are typically being defined in real time. We need someone to design and develop services that facilitate global financial transactions worth billions (USD) annually. This is a unique opportunity to be part of a mission-critical initiative with significant organizational visibility and impact.
Design Foundational Greenfield Services: You will collaborate with your team to architect and implement the core services that will form the backbone of this new accounting software. Your technical expertise and innovative thinking will be instrumental in ensuring the foundational services are designed with scalability, reliability, and performance in mind for Amazon.
Adopting the Latest Technology: You will have the chance to work with the latest technologies, frameworks, and tools to build these foundational services. This includes leveraging advancements in areas such as cloud computing, distributed systems, data processing, and real-time analytics.
Solving High-Scale Processing Challenges: This project will involve handling millions of transactions per day, presenting you with the unique challenge of designing and implementing robust, high-performance solutions that can handle this volume efficiently. You will be challenged to tackle complex problems related to data processing, queuing, and real-time analytics.
Cross-Functional and Senior Engineer Collaboration: You will work closely with cross-functional teams, including product managers, data engineers, and accountants. You will also work directly with multiple Principal Engineers and present your work to Senior Principal Engineers. This experience will give you the opportunities and visibility to help build the leadership skills needed to advance your career.

Key job responsibilities
- Define high-level and low-level designs for software solutions using the latest AWS technology in a large distributed environment.
- Take the lead on defining and implementing engineering best practices and using data to define and improve operational best practices.
- Help drive the architecture and technology choices for FinTech accounting products.
- Design, develop, and deploy medium to large software solutions for Amazon accounting needs.
- Raise the bar on code quality, including security, readability, consistency, and maintainability.

About the team
At the heart of FinTech's mission is the General Ledger team, which builds and operates the technologies to account for and post millions of financial transactions daily to support accurate internal and external financial reporting. This team processes on average 371MM+ transactions per month, servicing the accounting needs of Finance, Accounting, and Tax teams worldwide. The work of the General Ledger team is absolutely essential to meeting Amazon's critical close timelines and maintaining the integrity of the company's financial data.

BASIC QUALIFICATIONS
- 3+ years of non-internship professional software development experience
- 2+ years of non-internship design or architecture (design patterns, reliability, and scaling) experience with new and existing systems
- Experience programming with at least one software programming language
- Bachelor's degree

PREFERRED QUALIFICATIONS
- 3+ years of full software development life cycle experience, including coding standards, code reviews, source control management, build processes, testing, and operations
- Bachelor's degree in computer science or equivalent

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
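At the transaction volumes described above, messages are inevitably redelivered, so ledger pipelines typically post idempotently: each transaction carries a unique ID and replays are dropped. A minimal sketch of that pattern; the class, IDs, and account names are invented for illustration, not Amazon's actual design.

```python
# Sketch of idempotent transaction posting, a standard pattern for
# high-volume ledger pipelines. Names and structure are hypothetical.

class Ledger:
    def __init__(self):
        self.entries = []
        self._seen = set()          # idempotency keys already posted

    def post(self, txn_id, account, amount):
        """Post a transaction exactly once; duplicate deliveries are no-ops."""
        if txn_id in self._seen:
            return False            # retried/replayed message, safe to drop
        self._seen.add(txn_id)
        self.entries.append((account, amount))
        return True

ledger = Ledger()
assert ledger.post("t1", "revenue", 100) is True
assert ledger.post("t1", "revenue", 100) is False   # replay is ignored
print(len(ledger.entries))  # 1
```

In a real system the seen-set would be a durable store checked transactionally with the write, since an in-memory set does not survive restarts.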
Posted 3 weeks ago
3.0 - 5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
This job is with Amazon, an inclusive employer and a member of myGwork, the largest global platform for the LGBTQ+ business community. Please do not contact the recruiter directly.

Description
Are you interested in building high-performance, globally scalable financial systems that support Amazon's current and future growth? Are you seeking an environment where you can drive innovation, leveraging the scalability and innovation of Amazon's AWS cloud services? Do you have a passion for ensuring a positive customer experience? This is the job for you.

Amazon's Finance Technology organization (FinTech) is responsible for building and maintaining the critical finance technology applications that enable new business growth, ensure compliance with financial and tax reporting obligations, and provide deep analysis of Amazon's financial data. This function is of paramount importance to the company, as it underpins Amazon's ability to effectively manage its finances and drive continued expansion.

At the heart of FinTech's mission is the General Ledger team, which builds and operates the technologies to account for and post millions of financial transactions daily to support accurate internal and external financial reporting. This team processes on average 371MM+ transactions per month, servicing the accounting needs of Finance, Accounting, and Tax teams worldwide. The work of the General Ledger team is absolutely essential to meeting Amazon's critical close timelines and maintaining the integrity of the company's financial data.

The Amazon Financial Technology team is looking for a results-oriented, driven software development engineer who can help us create the next generation of distributed, scalable financial systems. Our ideal candidate thrives in a fast-paced environment and enjoys the challenge of highly complex business contexts that are typically being defined in real time.
We need someone to design and develop services that facilitate global financial transactions worth billions (USD) annually. This is a unique opportunity to be part of a mission-critical initiative with significant organizational visibility and impact.

Design Foundational Greenfield Services: You will collaborate with your team to architect and implement the core services that will form the backbone of this new accounting software. Your technical expertise and innovative thinking will be instrumental in ensuring the foundational services are designed with scalability, reliability, and performance in mind for Amazon.
Adopting the Latest Technology: You will have the chance to work with the latest technologies, frameworks, and tools to build these foundational services. This includes leveraging advancements in areas such as cloud computing, distributed systems, data processing, and real-time analytics.
Solving High-Scale Processing Challenges: This project will involve handling millions of transactions per day, presenting you with the unique challenge of designing and implementing robust, high-performance solutions that can handle this volume efficiently. You will be challenged to tackle complex problems related to data processing, queuing, and real-time analytics.
Cross-Functional and Senior Engineer Collaboration: You will work closely with cross-functional teams, including product managers, data engineers, and accountants. You will also work directly with multiple Principal Engineers and present your work to Senior Principal Engineers. This experience will give you the opportunities and visibility to help build the leadership skills needed to advance your career.

Key job responsibilities
Define high-level and low-level designs for software solutions using the latest AWS technology in a large distributed environment.
Take the lead on defining and implementing engineering best practices and using data to define and improve operational best practices.
Help drive the architecture and technology choices for FinTech accounting products.
Design, develop, and deploy medium to large software solutions for Amazon accounting needs.
Raise the bar on code quality, including security, readability, consistency, and maintainability.

About The Team
At the heart of FinTech's mission is the General Ledger team, which builds and operates the technologies to account for and post millions of financial transactions daily to support accurate internal and external financial reporting. This team processes on average 371MM+ transactions per month, servicing the accounting needs of Finance, Accounting, and Tax teams worldwide. The work of the General Ledger team is absolutely essential to meeting Amazon's critical close timelines and maintaining the integrity of the company's financial data.

Basic Qualifications
3+ years of non-internship professional software development experience
2+ years of non-internship design or architecture (design patterns, reliability, and scaling) experience with new and existing systems
Experience programming with at least one software programming language
Bachelor's degree

Preferred Qualifications
3+ years of full software development life cycle experience, including coding standards, code reviews, source control management, build processes, testing, and operations

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
Posted 1 month ago
3.0 - 7.0 years
12 - 15 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Work from Office
We are looking for an experienced Data Engineer/BI Developer with strong hands-on expertise in Microsoft Fabric technologies, including OneLake, Lakehouse, Data Lake, Warehouse, and Real-Time Analytics, along with proven skills in Power BI, Azure Synapse Analytics, and Azure Data Factory (ADF). The ideal candidate should also possess working knowledge of DevOps practices for data engineering and deployment automation.

Key Responsibilities:
Design and implement scalable data solutions using Microsoft Fabric components: OneLake, Data Lake, Lakehouse, Warehouse, and Real-Time Analytics.
Build and manage end-to-end data pipelines integrating structured and unstructured data from multiple sources.
Integrate Microsoft Fabric with Power BI, Synapse Analytics, and Azure Data Factory to enable modern data analytics solutions.
Develop and maintain Power BI datasets, dashboards, and reports using data from Fabric Lakehouses or Warehouses.
Implement data governance, security, and compliance policies within the Microsoft Fabric ecosystem.
Collaborate with stakeholders on requirements gathering, data modeling, and performance tuning.
Leverage Azure DevOps / Git for version control, CI/CD pipelines, and deployment automation of data artifacts.
Monitor, troubleshoot, and optimize data flows and transformations for performance and reliability.

Required Skills:
3-8 years of experience in data engineering, BI development, or similar roles.
Strong hands-on experience with the Microsoft Fabric ecosystem: OneLake, Data Lake, Lakehouse, Warehouse, Real-Time Analytics.
Proficient in Power BI for interactive reporting and visualization.
Experience with Azure Synapse Analytics, ADF (Azure Data Factory), and related Azure services.
Good understanding of data modeling, SQL, T-SQL, and Spark/Delta Lake concepts.
Working knowledge of DevOps tools and CI/CD processes for data deployment (Azure DevOps preferred).
Familiarity with DataOps and version control practices for data solutions.
Preferred Qualifications:
- Microsoft certifications (e.g., DP-203, PL-300, or Microsoft Fabric certifications) are a plus.
- Experience with Python, Notebooks, or KQL for Real-Time Analytics is advantageous.
- Knowledge of data governance tools (e.g., Microsoft Purview) is a plus.

Location: Remote - Bengaluru, Hyderabad, Delhi / NCR, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
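The pipeline responsibilities above (landing raw data in a Lakehouse, then cleansing it for Power BI reporting) follow the common bronze-to-silver pattern. A minimal, library-free Python sketch of that cleanse step is below; the column names, business key, and cleansing rules are illustrative assumptions, not taken from the posting:

```python
import csv
import io

# Hypothetical raw "bronze" records as they might land in a Lakehouse file;
# the schema (order_id, amount, region) is illustrative only.
BRONZE_CSV = """order_id,amount,region
1001,250.0,IN
1002,,IN
1003,99.5,US
1001,250.0,IN
"""

def bronze_to_silver(raw_csv: str) -> list[dict]:
    """Cleanse bronze rows into a 'silver' set: drop nulls, dedupe, cast types."""
    rows = list(csv.DictReader(io.StringIO(raw_csv)))
    seen, silver = set(), []
    for row in rows:
        if not row["amount"]:          # drop rows with a missing measure
            continue
        key = row["order_id"]
        if key in seen:                # deduplicate on the business key
            continue
        seen.add(key)
        silver.append({"order_id": int(key),
                       "amount": float(row["amount"]),
                       "region": row["region"]})
    return silver

silver = bronze_to_silver(BRONZE_CSV)
print(silver)
```

In a real Fabric workload this logic would typically run as Spark over Delta tables in OneLake (or as a Dataflow) rather than over an in-memory CSV, but the cleanse-and-promote shape is the same.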
Posted 2 months ago
0.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Our industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. Our breakthrough solutions, from large-scale models onward, tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today.

Inviting applications for the role of Assistant Vice President, Enterprise Architecture Consulting - GCP Delivery Lead

The Delivery Lead will be accountable for the effective execution of large-scale data transformation initiatives on Google Cloud Platform. This leadership position entails supervising both legacy-to-GCP migrations and new implementations, guaranteeing high-quality delivery, innovation, and business value. The suitable candidate should possess significant experience in GCP program management, as well as proficiency in data engineering, cloud platforms, and analytics solutions. They will be responsible for client engagement, team leadership, delivery governance, and strategic innovation in GCP-based solutions.

Key Responsibilities:
- Lead end-to-end delivery of GCP projects, including migrations from legacy systems and greenfield implementations.
- Define and enforce delivery governance frameworks, best practices, and methodologies for GCP programs.
- Act as the primary interface for clients, ensuring strong relationships and alignment with their data strategy.
- Offer expert guidance on GCP and contemporary data architectures such as data mesh/fabric methodology; bring substantial experience with SSOT (single source of truth) frameworks and guide clients on best practices.
- Knowledge of containerization architecture is essential, along with experience in Data Vault data modeling.
- Build, mentor, and manage a high-performing team of GCP architects, data engineers, and analysts.
- Drive team upskilling and certifications in GCP, data engineering, and analytics tools.
- Foster a strong DevOps and Agile culture, ensuring efficient execution through CI/CD automation.
- Stay ahead of emerging trends in GCP, cloud data engineering, and analytics to drive innovation.
- Must possess substantial experience with advanced GCP techniques such as BI Engine and history-based optimization, among others.
- Should have a comprehensive understanding and practical experience with GenAI and agentic AI.
- Review architecture decks and offer solutions for identified pain points, ensuring successful project delivery.
- Proficient in ELT solutioning using DBT and the native services of BigQuery (Dataform).
- Promote AI/ML, automation, and real-time analytics to enhance data platform capabilities.
- Develop accelerators, reusable frameworks, and best practices for efficient delivery.
- Ensure data security, compliance, and regulatory adherence in projects.
- Implement performance monitoring, cost optimization, and disaster recovery strategies for GCP solutions.

Minimum Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field (Master's or MBA preferred).
- Experience in IT Services, with experience in GCP and cloud-based data solutions.
Preferred Qualifications / Skills:
- Proven track record in managing large-scale GCP programs, including legacy data migrations and new implementations.
- Deep understanding of data engineering, ETL, and cloud-native architectures.
- Strong expertise in the GCP ecosystem, including streams, orchestration, ingestion, governance, stewardship, ELT/ETL, tasks, data sharing, and performance optimization.
- Experience with other cloud platforms (AWS, Azure).
- Proficiency in SQL, Python, Spark, and modern data processing frameworks.

Preferred Certifications:
- Certified GCP Solution Architect.
- Cloud certifications (AWS Certified Data Analytics, Azure Data Engineering, or equivalent).
- PMP, ITIL, or SAFe Agile certifications for delivery governance.

Why join Genpact?
- Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation.
- Make an impact: drive change for global enterprises and solve business challenges that matter.
- Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation.
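The ELT responsibilities above (DBT/Dataform-style incremental models) hinge on idempotent merges: rerunning a batch must leave the warehouse in the same state, with no duplicate rows. A miniature sketch using stdlib sqlite3 in place of BigQuery is below; the table, columns, and data are illustrative assumptions, not from the posting:

```python
import sqlite3

# Stand-in warehouse table for a DBT-style incremental merge;
# schema and rows are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT, tier TEXT)")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Acme', 'gold')")

def merge_batch(conn, batch):
    """Upsert a batch on the unique key so reruns are idempotent."""
    conn.executemany(
        "INSERT INTO dim_customer (id, name, tier) VALUES (?, ?, ?) "
        "ON CONFLICT(id) DO UPDATE SET name = excluded.name, tier = excluded.tier",
        batch,
    )
    conn.commit()

batch = [(1, 'Acme', 'platinum'), (2, 'Globex', 'silver')]
merge_batch(conn, batch)
merge_batch(conn, batch)   # rerun: same end state, no duplicates

rows = conn.execute("SELECT id, name, tier FROM dim_customer ORDER BY id").fetchall()
print(rows)
```

In BigQuery the same pattern is usually expressed as a MERGE statement, or declaratively as a DBT incremental model with a `unique_key`.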
Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 2 months ago
6.0 - 11.0 years
10 - 18 Lacs
Chennai
Work from Office
Design systems for local data processing. Implement solutions for IoT and real-time analytics. Ensure system security and performance.
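The real-time analytics requirement above typically reduces to windowed aggregation over a stream of sensor readings at the edge. A single-node Python sketch of a time-based sliding-window average is below; the window length and reading shape are illustrative assumptions:

```python
from collections import deque

class SlidingWindowAverage:
    """Running average over the last `window_s` seconds of readings.

    A minimal, single-node sketch of edge-side real-time aggregation;
    a hypothetical (timestamp, value) sensor payload is assumed.
    """
    def __init__(self, window_s: float):
        self.window_s = window_s
        self.readings = deque()   # (timestamp, value) pairs, oldest first
        self.total = 0.0

    def add(self, ts: float, value: float) -> float:
        self.readings.append((ts, value))
        self.total += value
        # Evict readings that have fallen out of the time window.
        while self.readings and ts - self.readings[0][0] > self.window_s:
            _, old = self.readings.popleft()
            self.total -= old
        return self.total / len(self.readings)

w = SlidingWindowAverage(window_s=10)
print(w.add(0, 20.0))    # 20.0  (single reading)
print(w.add(5, 22.0))    # 21.0  (average of both readings)
print(w.add(16, 30.0))   # 30.0  (readings at t=0 and t=5 evicted)
```

A production edge deployment would push these aggregates upstream (e.g., over MQTT or gRPC) and add back-pressure and fault handling, but the eviction logic is the core of the window.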
Posted 2 months ago
6.0 - 11.0 years
10 - 18 Lacs
Bengaluru
Work from Office
Design systems for local data processing. Implement solutions for IoT and real-time analytics. Ensure system security and performance.
Posted 2 months ago
6.0 - 11.0 years
10 - 18 Lacs
Hyderabad
Work from Office
Design systems for local data processing. Implement solutions for IoT and real-time analytics. Ensure system security and performance.
Posted 2 months ago
6.0 - 11.0 years
10 - 18 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Design systems for local data processing. Implement solutions for IoT and real-time analytics. Ensure system security and performance.
Posted 3 months ago