2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At eBay, we're more than a global ecommerce leader — we're changing the way the world shops and sells. Our platform empowers millions of buyers and sellers in more than 190 markets around the world. We're committed to pushing boundaries and leaving our mark as we reinvent the future of ecommerce for enthusiasts. Our customers are our compass, authenticity thrives, bold ideas are welcome, and everyone can bring their unique selves to work — every day. We're in this together, sustaining the future of our customers, our company, and our planet. Join a team of passionate thinkers, innovators, and dreamers — and help us connect people and build communities to create economic opportunity for all.

Machine Learning Engineer (T25), Product Knowledge

Do you love Big Data? Deploying machine learning models? Challenging optimization problems? Knowledgeable, collaborative co-workers? Come work at eBay and help us redefine global, online commerce!

Who Are We?
The Product Knowledge team is at the epicenter of eBay's tech-driven, customer-centric overhaul. Our team is entrusted with creating and using eBay's Product Knowledge — a vast Big Data system built from listings, transactions, products, knowledge graphs, and more. Our team has a mix of highly proficient people from multiple fields such as Machine Learning, Data Science, Software Engineering, Operations, and Big Data Analytics. We have a strong culture of collaboration, and plenty of opportunity to learn, make an impact, and grow!

What Will You Do
We are looking for exceptional engineers who take pride in creating simple solutions to apparently complex problems. Our engineering tasks typically involve at least one of the following:
- Building a pipeline that processes up to billions of items, frequently employing ML models on these datasets (see the sketch following this listing)
- Creating services that provide Search or other Information Retrieval capabilities at low latency on datasets of hundreds of millions of items
- Crafting sound API design and driving integration between our data layers and customer-facing applications and components
- Designing and running A/B tests in production experiences in order to vet and measure the impact of any new or improved functionality

If you love a good challenge and are good at handling complexity, we'd love to hear from you!

eBay is an amazing company to work for. Being on the team, you can expect to benefit from:
- A competitive salary, including stock grants and a yearly bonus
- A healthy work culture that promotes business impact and at the same time highly values your personal well-being
- Being part of a force for good in this world — eBay truly cares about its employees, its customers, and the world's population, and takes every opportunity to make this clearly apparent

Job Responsibilities
- Design, deliver, and maintain significant features in data pipelines, ML processing, and/or service infrastructure
- Optimize software performance to achieve the required throughput and/or latency
- Work with your manager, peers, and Product Managers to scope projects and features
- Come up with a sound technical strategy, taking into consideration the project goals, timelines, and expected impact
- Take point on some cross-team efforts, taking ownership of a business problem and ensuring the different teams are in sync and working towards a coherent technical solution
- Take an active part in knowledge sharing across the organization — both teaching and learning from others

Minimum Qualifications
- Passion and commitment for technical excellence
- B.Sc. or M.Sc. in Computer Science or equivalent professional experience
- 2+ years of software design and development experience, tackling non-trivial problems in backend services and/or data pipelines
- A solid foundation in data structures, algorithms, object-oriented programming, software design, and core statistics
- Experience in production-grade coding in Java and Python/Scala
- Experience in the close examination of data and computation of statistics
- Experience in using and operating Big Data processing pipelines such as Hadoop and Spark
- Good verbal and written communication and collaboration skills

Please see the Talent Privacy Notice for information regarding how eBay handles your personal data collected when you use the eBay Careers website or apply for a job with eBay. eBay is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, sex, sexual orientation, gender identity, veteran status, disability, or other legally protected status. If you have a need that requires accommodation, please contact us at talent@ebay.com. We will make every effort to respond to your request for accommodation as soon as possible. View our accessibility statement to learn more about eBay's commitment to ensuring digital accessibility for people with disabilities.
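Purely as an illustration of the first engineering task above (not eBay's actual code), here is a minimal Spark/Scala sketch of a batch job that applies a pre-trained ML pipeline to a large item dataset. The paths, column names, and model are hypothetical.

```scala
// Illustrative sketch only: a Spark batch job that applies a pre-trained
// ML model to a large listings dataset. Paths, columns, and the model
// itself are hypothetical.
import org.apache.spark.sql.SparkSession
import org.apache.spark.ml.PipelineModel

object ListingScoringJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("listing-scoring")
      .getOrCreate()

    // Read a (potentially billions-of-rows) dataset of items.
    val listings = spark.read.parquet("hdfs:///data/listings/current")

    // Load a previously trained Spark ML pipeline (featurization + model).
    val model = PipelineModel.load("hdfs:///models/category-classifier/v1")

    // Apply the model at scale; Spark distributes the scoring.
    val scored = model.transform(listings)

    scored
      .select("listing_id", "prediction")
      .write
      .mode("overwrite")
      .parquet("hdfs:///data/listings/scored")

    spark.stop()
  }
}
```

Because the model is applied inside a distributed DataFrame transformation, the same job scales from a sample to billions of items by adding executors rather than changing code.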
Posted 5 days ago
6.0 - 10.0 years
0 - 3 Lacs
Pune, Chennai
Hybrid
Greetings! We have an opening with one of our clients, Mphasis, for a Scala Developer position.

Role & responsibilities
- Bachelor's or Master's degree in Computer Science, Engineering, or a related quantitative field.
- Minimum 5 years of professional experience in Data Engineering, with a strong focus on big data technologies.
- Proficiency in Scala for developing big data applications and transformations, especially with Apache Spark (see the sketch following this listing).
- Expert-level proficiency in SQL; ability to write complex queries, optimize performance, and understand database internals.
- Extensive hands-on experience with Apache Spark (Spark SQL, DataFrames, RDDs) for large-scale data processing and analytics.
- Solid understanding of distributed computing concepts and experience with the Hadoop ecosystem (HDFS, Hive).
- Experience with building and optimizing ETL/ELT processes and data warehousing concepts.
- Strong understanding of data modeling techniques (e.g., Star Schema, Snowflake Schema).
- Familiarity with version control systems (e.g., Git).
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and collaboratively in an Agile team environment.

Work mode: Hybrid (3 days)
Location: Pune / Chennai
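As a hedged illustration of the Scala-with-Spark work this role describes, the sketch below shows both the DataFrame API and Spark SQL side by side. The table names, columns, and paths are invented for the example.

```scala
// Minimal sketch of typical Spark SQL / DataFrame transformation work.
// Table names, columns, and paths are hypothetical.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._

object OrdersEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("orders-etl").getOrCreate()

    val orders = spark.read.parquet("/data/raw/orders")

    // DataFrame API: latest order per customer via a window function.
    val w = Window.partitionBy("customer_id").orderBy(col("order_ts").desc)
    val latest = orders
      .withColumn("rn", row_number().over(w))
      .filter(col("rn") === 1)
      .drop("rn")

    // The same engine can run plain Spark SQL against a temp view.
    orders.createOrReplaceTempView("orders")
    val daily = spark.sql(
      """SELECT order_date, COUNT(*) AS order_cnt, SUM(amount) AS revenue
        |FROM orders
        |GROUP BY order_date""".stripMargin)

    latest.write.mode("overwrite").parquet("/data/curated/latest_orders")
    daily.write.mode("overwrite").parquet("/data/curated/daily_revenue")
    spark.stop()
  }
}
```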
Posted 5 days ago
4.0 - 9.0 years
9 - 13 Lacs
Pune
Work from Office
Your Position
You will work as a Data Engineer with Machine Learning expertise in the Predictive Maintenance team. This hybrid, multi-cultural team includes Data Scientists, Machine Learning Engineers, Data Engineers, a DevOps Engineer, a QA Engineer, an Architect, a UX Designer, a Scrum Master, and a Product Owner. The Digital Service Platform focuses on optimizing customer asset usage and maintenance, impacting performance, cost, and sustainability KPIs by extending component lifetimes.

In your role, you will:
- Participate in solution design discussions led by our Product Architect, where your input as a Data Engineer with ML expertise is highly valued.
- Collaborate with IT and business SMEs to ensure delivery of high-quality end-to-end data and machine learning pipelines.

Your Responsibilities

Data Engineering
- Develop, test, and document data collection and processing pipelines for Predictive Maintenance solutions, moving data from (IoT) sensors and control components to our data platform.
- Build scalable pipelines to transform, aggregate, and make data available for machine learning models (see the streaming sketch following this listing).
- Align implementation efforts with other back-end developers across multiple development teams.

Machine Learning Integration
- Collaborate with Data Scientists to integrate machine learning models into production pipelines, ensuring smooth deployment and scalability.
- Develop and optimize end-to-end machine learning pipelines (MLOps), from data preparation to model deployment and monitoring.
- Work on model inference pipelines, ensuring efficient real-time predictions from deployed models.
- Implement automated retraining workflows and ensure version control for datasets and models.

Continuous Improvement
- Contribute to the design and build of a CI/CD pipeline, including integration test automation for data and ML pipelines.
- Continuously improve and standardize data and ML services for customer sites to reduce project delivery time.
- Actively monitor model performance and ensure timely updates or retraining as needed.

Your Profile
- Minimum 4 years' experience building complex data pipelines and integrating machine learning solutions.
- Bachelor's or Master's degree in Computer Science, IT, Data Science, or equivalent.
- Hands-on experience with data modeling and machine learning workflows.
- Strong programming skills in Java, Scala, and Python (preferred for ML tasks).
- Experience with stream processing frameworks (e.g., Spark) and streaming storage (e.g., Kafka).
- Proven experience with MLOps practices, including data preprocessing, model deployment, and monitoring.
- Familiarity with ML frameworks and tools (e.g., TensorFlow, PyTorch, MLflow).
- Proficiency with cloud platforms (preferably Azure and Databricks).
- Experience with data quality management, monitoring, and ensuring robust pipelines.
- Knowledge of Predictive Maintenance model development is a strong plus.

What You'll Gain
- Opportunity to work at the forefront of data-driven innovation in a global organization.
- Collaborate with a talented and diverse team to design and implement cutting-edge solutions.
- Expand your expertise in data engineering and machine learning in a real-world industrial setting.

If you are passionate about leveraging data and machine learning to drive innovation, we'd love to hear from you!
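For a concrete sense of the sensor-to-platform pipeline work above, here is a minimal Spark Structured Streaming sketch that reads from Kafka and produces windowed features. The topic, schema, and paths are hypothetical, not taken from the posting.

```scala
// Hedged sketch of a sensor-data streaming pipeline (Kafka -> Spark ->
// storage). Topic, schema, and paths are hypothetical.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._

object SensorStream {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("sensor-stream").getOrCreate()

    val schema = new StructType()
      .add("asset_id", StringType)
      .add("ts", TimestampType)
      .add("vibration", DoubleType)

    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "sensor-readings")
      .load()

    // Kafka delivers bytes; parse the JSON payload into typed columns.
    val readings = raw
      .select(from_json(col("value").cast("string"), schema).as("r"))
      .select("r.*")

    // 5-minute aggregates per asset, ready to feed model features.
    val features = readings
      .withWatermark("ts", "10 minutes")
      .groupBy(window(col("ts"), "5 minutes"), col("asset_id"))
      .agg(avg("vibration").as("avg_vibration"))

    features.writeStream
      .format("parquet")
      .option("path", "/data/features/vibration")
      .option("checkpointLocation", "/chk/vibration")
      .outputMode("append")
      .start()
      .awaitTermination()
  }
}
```

The watermark bounds streaming state, so sensor events arriving more than ten minutes late are dropped instead of accumulating unbounded state across long-running jobs.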
Posted 5 days ago
1.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title: Associate Analyst - Product Data & Analytics

Overview
The Product Data & Analytics team builds internal analytic partnerships, strengthening focus on the health of the business, portfolio and revenue optimization opportunities, initiative tracking, new product development and go-to-market strategies. We are a hands-on global team providing scalable end-to-end data solutions. Working closely with the business, we influence decisions across Mastercard through data-driven insights. We are a team of analytics engineers, data architects, BI developers, data analysts and data scientists, and we fully manage our own data assets and solutions.

Are you excited about data assets and the value they bring to an organization? Are you an evangelist for data-driven decision making? Are you motivated to be part of a global analytics team that builds large-scale analytical capabilities supporting end users across six continents? Are you interested in proactively looking to improve data-driven decisions for a global corporation? The ideal candidate has a knack for seeing solutions in sprawling data sets and the business mindset to convert insights into strategic opportunities for our company.

Role and responsibilities
The Associate Analyst will be part of a strategic initiative to create a single-source-of-truth (SSOT) data platform for all transactional data assets within the organization. You will work alongside a team of analytics engineers, data analysts, and data engineers to evaluate current use cases across the organization; define the data platform design, including the logical/conceptual data model, data mappings, and other platform documentation; collaborate with the data architects and data engineers to ensure the platform build; and be responsible for UAT before implementation.
- Collaborate with team members to collect business requirements, define successful analytics outcomes, and design data models.
- Serve as the Directly Responsible Individual for major sections of the platform logical/conceptual data model.
- Define data mappings, data dictionaries, data quality and UAT documentation.
- Maintain the Data Catalog, a scalable resource to support self-service and single-source-of-truth analytics.
- Translate business requirements into tangible technical solution specifications and high-quality, on-time deliverables.
- Effectively use tools to manipulate large-scale databases, synthesizing data insights.
- Apply quality control, data validation, and cleansing processes to new and existing data sources (see the sketch below).
- Implement the DataOps philosophy in everything you do.
- Craft code that meets our internal standards for style, maintainability, and best practices for a high-scale database environment. Maintain and advocate for these standards through code review.
- Collaborate with cross-functional teams, external vendor teams and technology suppliers to ensure delivery of high-quality services.
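As a hedged illustration of the quality-control and validation responsibility above (not Mastercard's actual process), here is a minimal Spark/Scala sketch of pre-publication data checks. The dataset, columns, and rules are invented for the example.

```scala
// Illustrative data-validation sketch: completeness, uniqueness, and
// validity checks gating publication of a dataset. All names hypothetical.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object TxnQualityChecks {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("txn-quality").getOrCreate()
    val txns = spark.read.parquet("/data/landing/transactions")

    val total = txns.count()
    // Completeness: no nulls in required keys.
    val missingKeys = txns.filter(col("txn_id").isNull || col("txn_ts").isNull).count()
    // Uniqueness: txn_id must be unique.
    val dupes = total - txns.select("txn_id").distinct().count()
    // Validity: amounts must be non-negative.
    val badAmounts = txns.filter(col("amount") < 0).count()

    require(missingKeys == 0, s"$missingKeys rows missing required keys")
    require(dupes == 0, s"$dupes duplicate txn_ids")
    require(badAmounts == 0, s"$badAmounts negative amounts")

    // Only publish once all checks pass.
    txns.write.mode("overwrite").parquet("/data/validated/transactions")
    spark.stop()
  }
}
```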
All About You
- 6 months to 1 year's experience in data analysis, data mining, data analytics, data reporting and data product development. Financial institution or payments experience a plus.
- Proactive self-starter seeking initiatives to advance.
- Understanding of data architecture and some experience in building logical/conceptual data models or creating data mapping documentation.
- Experience with data validation, quality control and cleansing processes for new and existing data sources.
- Advanced SQL skills; ability to write optimized queries for large data sets.
- Experience with platforms/environments: Cloudera Hadoop, the Big Data technology stack, SQL Server, the Microsoft BI stack, and cloud.
- Exposure to Python, Scala, Spark, cloud and other related technologies a plus.
- Experience with data visualization tools such as Tableau, Domo, and/or Power BI is a plus.
- Excellent problem-solving, quantitative and analytical skills.
- In-depth technical knowledge, drive and ability to learn new technologies.
- Strong attention to detail and quality.
- Team player with excellent communication skills; must be able to interact with management and internal stakeholders and collect requirements; must be able to perform in a team, use judgment and operate under ambiguity.

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization. It is therefore expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard's security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.
Posted 5 days ago
7.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
About Agoda
Agoda is an online travel booking platform for accommodations, flights, and more. We build and deploy cutting-edge technology that connects travelers with a global network of 4.7M hotels and holiday properties worldwide, plus flights, activities, and more. Based in Asia and part of Booking Holdings, our 7,100+ employees representing 95+ nationalities in 27 markets foster a work environment rich in diversity, creativity, and collaboration. We innovate through a culture of experimentation and ownership, enhancing the ability for our customers to experience the world.

Our Purpose – Bridging the World Through Travel
We believe travel allows people to enjoy, learn and experience more of the amazing world we live in. It brings individuals and cultures closer together, fostering empathy, understanding and happiness. We are a skillful, driven and diverse team from across the globe, united by a passion to make an impact. Harnessing our innovative technologies and strong partnerships, we aim to make travel easy and rewarding for everyone.

Get to Know our Team
In Agoda's Back End Engineering department, we build the scalable, fault-tolerant systems and APIs that host our core business logic. Our systems cover all major areas of our business: inventory and pricing, product information, customer data, communications, partner data, booking systems, payments, and more. These mission-critical systems change frequently, with dozens of releases per day, so we must employ state-of-the-art CI/CD and testing techniques to make sure everything works without any downtime. We also ensure that our systems are self-healing, responding gracefully to extreme loads or unexpected input. To accomplish this, we use state-of-the-art languages like Scala and Go, data technologies like Kafka and Aerospike, and agile development practices. Most importantly, though, we hire great people from all around the world and empower them to be successful. Whether it's building new projects like Flights and Packages or reimagining our existing business, you'll make a big impact as part of the Back End Engineering team.

The Opportunity
Agoda is looking for developers to work on mission-critical systems involved in the design and development of APIs that serve millions of user search requests a day (an illustrative sketch follows this listing).

In this Role, you'll get to
- Lead development of features, experiments, technical projects and complex systems
- Be a technical architect, mentor, and driver towards the right technology
- Continue to evolve our architecture and build better software
- Be a major contributor to our agile and scrum practices
- Get involved with software engineering and collaborate with server, other client, and infrastructure technical team members to build the best solution
- Constantly look for ways to improve our products, code base and development practices
- Write great code and help others write great code
- Drive technical decisions in the organization

What You'll Need To Succeed
- 7+ years' experience developing performance-critical applications that run in a production environment using Scala, Java or C#
- Experience in leading projects, initiatives and/or teams, with full ownership of the systems involved
- Experience with data platforms like SQL, Cassandra or Hadoop, and an understanding that different applications have different data requirements
- Good understanding of algorithms and data structures
- Strong coding ability
- Passion for the craft of software development and constant work to improve your knowledge and skills
- Excellent verbal and written English communication skills

It's Great If You Have
- Experience with Scrum/Agile development methodologies
- Experience building large-scale distributed products
- Core engineering infrastructure tools like Git for source control, TeamCity for Continuous Integration and Puppet for deployment
- Hands-on experience with technologies like queueing systems (Kafka, RabbitMQ, ActiveMQ, MSMQ), Spark, Hadoop, NoSQL (Cassandra, MongoDB), the Play framework, and the Akka library

Equal Opportunity Employer
At Agoda, we pride ourselves on being a company represented by people of all different backgrounds and orientations. We prioritize attracting diverse talent and cultivating an inclusive environment that encourages collaboration and innovation. Employment at Agoda is based solely on a person's merit and qualifications. We are committed to providing equal employment opportunity regardless of sex, age, race, color, national origin, religion, marital status, pregnancy, sexual orientation, gender identity, disability, citizenship, veteran or military status, and other legally protected characteristics.

We will keep your application on file so that we can consider you for future vacancies, and you can always ask to have your details removed from the file. For more details please read our privacy policy.

Disclaimer
We do not accept any terms or conditions, nor do we recognize any agency's representation of a candidate, from unsolicited third-party or agency submissions. If we receive unsolicited or speculative CVs, we reserve the right to contact and hire the candidate directly without any obligation to pay a recruitment fee.
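Purely as an illustration of the domain (not Agoda's actual stack or code), here is a minimal Scala sketch of a search-style HTTP endpoint using Akka HTTP, one of the libraries the posting mentions. The route, parameters, and response shape are invented.

```scala
// Illustrative only: a minimal search-style API endpoint in Akka HTTP.
// Route, parameters, and the stub response are hypothetical.
import akka.actor.ActorSystem
import akka.http.scaladsl.Http
import akka.http.scaladsl.server.Directives._

object SearchApi {
  def main(args: Array[String]): Unit = {
    implicit val system: ActorSystem = ActorSystem("search-api")

    val route =
      path("search") {
        get {
          parameters("q", "limit".as[Int].withDefault(10)) { (q, limit) =>
            // A real system would query an index or data store here
            // (e.g., services backed by Kafka or Aerospike); we return a stub.
            complete(s"""{"query":"$q","limit":$limit,"results":[]}""")
          }
        }
      }

    // Bind the route; the actor system keeps the server running.
    Http().newServerAt("0.0.0.0", 8080).bind(route)
  }
}
```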
Posted 5 days ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title: Senior Software Engineer

Overview
We are the global technology company behind the world's fastest payments processing network. We are a vehicle for commerce, a connection to financial systems for the previously excluded, a technology innovation lab, and the home of Priceless®. We ensure every employee has the opportunity to be a part of something bigger and to change lives. We believe as our company grows, so should you. We believe in connecting everyone to endless, priceless possibilities.

Our Team Within Mastercard – Data & Services
The Data & Services team is a key differentiator for Mastercard, providing the cutting-edge services that are used by some of the world's largest organizations to make multi-million dollar decisions and grow their businesses. Focused on thinking big and scaling fast around the globe, this agile team is responsible for end-to-end solutions for a diverse global customer base. Centered on data-driven technologies and innovation, these services include payments-focused consulting, loyalty and marketing programs, business Test & Learn experimentation, and data-driven information and risk management services.

Data Analytics and AI Solutions (DAAI) Program
Within the D&S Technology team, the DAAI program is a relatively new program comprising a rich set of products that provide accurate perspectives on Portfolio Optimization and Ad Insights. Currently, we are enhancing our customer experience with new user interfaces, moving to API- and web-application-based data publishing to allow for seamless integration into other Mastercard products and externally, utilizing new data sets and algorithms to further analytic capabilities, and generating scalable big data processes.

We are looking for an innovative software engineering lead to lead the technical design and development of an Analytic Foundation: a suite of individually commercialized analytical capabilities (think prediction as a service, matching as a service, or forecasting as a service) that also includes a comprehensive data platform. These services will be offered through a series of APIs that deliver data and insights from various points along a central data store. This individual will partner closely with other areas of the business to build and enhance solutions that drive value for our customers.

Engineers work in small, flexible teams. Every team member contributes to designing, building, and testing features. The range of work you will encounter varies from building intuitive, responsive UIs to designing backend data models, architecting data flows, and beyond. There are no rigid organizational structures, and each team uses processes that work best for its members and projects.

Here are a few examples of products in our space: Portfolio Optimizer (PO) is a solution that leverages Mastercard's data assets and analytics to allow issuers to identify and increase revenue opportunities within their credit and debit portfolios.
Ad Insights uses anonymized and aggregated transaction insights to offer targeting segments with a high likelihood to make purchases within a category, allowing for more effective campaign planning and activation.

Help found a new, fast-growing engineering team!

Position Responsibilities
As a Senior Software Engineer, you will:
- Play a large role in scoping, design and implementation of complex features
- Push the boundaries of analytics and powerful, scalable applications
- Design and implement intuitive, responsive UIs that allow issuers to better understand data and analytics
- Build and maintain analytics and data models to enable performant and scalable products
- Ensure a high-quality code base by writing and reviewing performant, well-tested code
- Mentor junior software engineers and teammates
- Drive innovative improvements to team development processes
- Partner with Product Managers and Customer Experience Designers to develop a deep understanding of users and use cases, and apply that knowledge to scoping and building new modules and features
- Collaborate across teams with exceptional peers who are passionate about what they do

Ideal Candidate Qualifications
- 5+ years of full-stack engineering experience in an agile production environment
- Experience leading the design and implementation of large, complex features in full-stack applications
- Ability to easily move between business, data management, and technical teams; ability to quickly intuit the business use case and identify technical solutions to enable it
- Experience coaching and mentoring junior teammates
- Experience leading a large technical effort that spans multiple people and teams
- Proficiency with Java/Spring Boot, .NET/C#, SQL Server or other object-oriented languages, front-end frameworks, and/or relational database technologies
- Some proficiency with Python or Scala, Spark, Hadoop platforms and tools (Hive, Impala, Airflow, NiFi, Sqoop), and SQL to build Big Data products and platforms
- Some experience building and deploying production-level data-driven applications and data processing workflows/pipelines and/or implementing machine learning systems at scale in Java, Scala, or Python
- Strong technologist with a proven track record of learning new technologies and frameworks
- Customer-centric development approach
- Passion for analytical/quantitative problem solving
- Experience identifying and implementing technical improvements to development processes
- Collaboration skills with experience working with people across roles and geographies
- Motivation, creativity, self-direction, and desire to thrive on small project teams
- Superior academic record with a degree in Computer Science or a related technical field
- Strong written and verbal English communication skills

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization. It is therefore expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard's security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.
Posted 5 days ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title: Senior Software Engineer

Overview
Mastercard is a global technology company behind the world's fastest payments processing network. We are a vehicle for commerce, a connection to financial systems for the previously excluded, a technology innovation lab, and the home of Priceless®. We ensure every employee has the opportunity to be a part of something bigger and to change lives. We believe as our company grows, so should you. We believe in connecting everyone to endless, priceless possibilities.

As a Data Engineer in Data Platform and Engineering Services, you will have the opportunity to build high-performance data pipelines to load into the Mastercard Data Warehouse. Our Data Warehouse provides analytical capabilities to a number of business users who help different customers provide answers to their business problems through data. You will play a vital role within a rapidly growing organization, while working closely with experienced and driven engineers to solve challenging problems.

Role
- Develop high-quality, secure and scalable data pipelines using Spark and Java/Scala on object storage and Hadoop (see the sketch following this listing)
- Follow Mastercard Quality Assurance and Quality Control processes
- Leverage new technologies and approaches to innovate with increasingly large data sets
- Work with the project team to meet scheduled due dates, while identifying emerging issues and recommending solutions for problems
- Perform assigned tasks and handle production incidents independently
- Contribute ideas to help ensure that required standards and processes are in place, and actively look for opportunities to enhance standards and improve process efficiency

All About You
- 3 to 5 years of experience in Data Engineering and implementing multiple end-to-end DW projects in a Big Data environment
- Experience building data pipelines with Spark and Java/Scala on Hadoop or object storage
- Experience working with databases like Oracle and Netezza, and strong SQL knowledge
- Experience working with NiFi is an added advantage
- Experience working with APIs is an added advantage
- Experience working in Agile teams
- Strong analytical skills required for debugging production issues, providing root cause analysis and implementing mitigation plans
- Strong communication skills, both verbal and written, and strong relationship, collaboration and organizational skills
- Ability to multi-task across multiple projects, interface with external/internal resources and provide technical leadership to junior team members
- Ability to be high-energy, detail-oriented and proactive, and to function under pressure in an independent environment with a high degree of initiative and self-motivation to drive results
- Ability to quickly learn and implement new technologies, and perform POCs to explore the best solution for a problem statement
- Flexibility to work as a member of matrix-based, diverse and geographically distributed project teams

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization. It is therefore expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard's security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.
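As a hedged illustration of the Role section's pipeline work (warehouse loads from relational sources onto Hadoop/object storage), the sketch below pulls a table over JDBC and lands it as partitioned Parquet. The connection details, table, and paths are hypothetical.

```scala
// Hedged sketch: parallel JDBC extract from an Oracle source, landed as
// Parquet on HDFS/object storage. All connection details hypothetical.
import org.apache.spark.sql.SparkSession

object OracleToLake {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("oracle-to-lake").getOrCreate()

    val src = spark.read
      .format("jdbc")
      .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB")
      .option("dbtable", "SALES.DAILY_TXNS")
      .option("user", sys.env("DB_USER"))
      .option("password", sys.env("DB_PASSWORD"))
      // Parallelize the read by range-partitioning on a numeric column.
      .option("partitionColumn", "TXN_ID")
      .option("lowerBound", "1")
      .option("upperBound", "100000000")
      .option("numPartitions", "32")
      .load()

    src.write
      .mode("overwrite")
      .partitionBy("TXN_DATE")
      .parquet("hdfs:///warehouse/sales/daily_txns")

    spark.stop()
  }
}
```

Partitioning the JDBC read lets Spark issue parallel range queries instead of pulling the whole table through a single connection, which is usually the difference between hours and minutes on large extracts.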
Posted 5 days ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Hi All, Greeting! We have hirings at Gurugram Location for following role: Hands on in SQL and its Big Data variants (Hive-QL, Snowflake ANSI, Redshift SQL) Python and Spark and one or more of its API (PySpark, Spark SQL, Scala), Bash/Shell scripting Experience with Source code control - GitHub, VSTS etc. Knowledge and exposure to Big Data technologies Hadoop stack such as HDFS, Hive, Impala, Spark etc, and cloud Big Data warehouses - RedShift, Snowflake etc. Experience with UNIX command-line tools. Exposure to AWS technologies including EMR, Glue, Athena, Data Pipeline, Lambda, etc Understanding and ability to translate/physicalise Data Models (Star Schema, Data Vault 2.0 etc) Design, develop, test, deploy, maintain and improve software Develop flowcharts, layouts and documentation to identify requirements & solutions Skill: AWS+ SQL+ Python is Mandatory Experience- 4 to 12 years NOTE: _____Face to face______ interview Happening In Gurugram office on 2nd Augt- 2025--- <<<<<<>>>>>> NOTE: WE NEED PEOPLE WHO CAN JOIN BY September MAX. Apply at rashwinder.kaur@qmail.quesscorp.com
Posted 5 days ago
7.0 years
0 Lacs
New Delhi, Delhi, India
On-site
About Agoda
Agoda is an online travel booking platform for accommodations, flights, and more. We build and deploy cutting-edge technology that connects travelers with a global network of 4.7M hotels and holiday properties worldwide, plus flights, activities, and more. Based in Asia and part of Booking Holdings, our 7,100+ employees representing 95+ nationalities in 27 markets foster a work environment rich in diversity, creativity, and collaboration. We innovate through a culture of experimentation and ownership, enhancing the ability for our customers to experience the world.

Our Purpose – Bridging the World Through Travel
We believe travel allows people to enjoy, learn and experience more of the amazing world we live in. It brings individuals and cultures closer together, fostering empathy, understanding and happiness. We are a skillful, driven and diverse team from across the globe, united by a passion to make an impact. Harnessing our innovative technologies and strong partnerships, we aim to make travel easy and rewarding for everyone.

Get to Know our Team
In Agoda's Back End Engineering department, we build the scalable, fault-tolerant systems and APIs that host our core business logic. Our systems cover all major areas of our business: inventory and pricing, product information, customer data, communications, partner data, booking systems, payments, and more. These mission-critical systems change frequently, with dozens of releases per day, so we must employ state-of-the-art CI/CD and testing techniques to make sure everything works without any downtime. We also ensure that our systems are self-healing, responding gracefully to extreme loads or unexpected input. To accomplish this, we use state-of-the-art languages like Scala and Go, data technologies like Kafka and Aerospike, and agile development practices. Most importantly, though, we hire great people from all around the world and empower them to be successful. Whether it's building new projects like Flights and Packages or reimagining our existing business, you'll make a big impact as part of the Back End Engineering team.

The Opportunity
Agoda is looking for developers to work on mission-critical systems involved in the design and development of APIs that serve millions of user search requests a day.

In this Role, you'll get to
- Lead development of features, experiments, technical projects and complex systems
- Be a technical architect, mentor, and driver towards the right technology
- Continue to evolve our architecture and build better software
- Be a major contributor to our agile and scrum practices
- Get involved with software engineering and collaborate with server, other client, and infrastructure technical team members to build the best solution
- Constantly look for ways to improve our products, code base and development practices
- Write great code and help others write great code
- Drive technical decisions in the organization

What You'll Need To Succeed
- 7+ years' experience developing performance-critical applications that run in a production environment using Scala, Java or C#
- Experience in leading projects, initiatives and/or teams, with full ownership of the systems involved
- Experience with data platforms like SQL, Cassandra or Hadoop, and an understanding that different applications have different data requirements
- Good understanding of algorithms and data structures
- Strong coding ability
- Passion for the craft of software development and constant work to improve your knowledge and skills
- Excellent verbal and written English communication skills

It's Great If You Have
- Experience with Scrum/Agile development methodologies
- Experience building large-scale distributed products
- Core engineering infrastructure tools like Git for source control, TeamCity for Continuous Integration and Puppet for deployment
- Hands-on experience with technologies like queueing systems (Kafka, RabbitMQ, ActiveMQ, MSMQ), Spark, Hadoop, NoSQL (Cassandra, MongoDB), the Play framework, and the Akka library

Equal Opportunity Employer
At Agoda, we pride ourselves on being a company represented by people of all different backgrounds and orientations. We prioritize attracting diverse talent and cultivating an inclusive environment that encourages collaboration and innovation. Employment at Agoda is based solely on a person's merit and qualifications. We are committed to providing equal employment opportunity regardless of sex, age, race, color, national origin, religion, marital status, pregnancy, sexual orientation, gender identity, disability, citizenship, veteran or military status, and other legally protected characteristics.

We will keep your application on file so that we can consider you for future vacancies, and you can always ask to have your details removed from the file. For more details please read our privacy policy.

Disclaimer
We do not accept any terms or conditions, nor do we recognize any agency's representation of a candidate, from unsolicited third-party or agency submissions. If we receive unsolicited or speculative CVs, we reserve the right to contact and hire the candidate directly without any obligation to pay a recruitment fee.
Posted 5 days ago
7.0 years
0 Lacs
Greater Kolkata Area
On-site
About Agoda
Agoda is an online travel booking platform for accommodations, flights, and more. We build and deploy cutting-edge technology that connects travelers with a global network of 4.7M hotels and holiday properties worldwide, plus flights, activities, and more. Based in Asia and part of Booking Holdings, our 7,100+ employees representing 95+ nationalities in 27 markets foster a work environment rich in diversity, creativity, and collaboration. We innovate through a culture of experimentation and ownership, enhancing the ability for our customers to experience the world.

Our Purpose – Bridging the World Through Travel
We believe travel allows people to enjoy, learn and experience more of the amazing world we live in. It brings individuals and cultures closer together, fostering empathy, understanding and happiness. We are a skillful, driven and diverse team from across the globe, united by a passion to make an impact. Harnessing our innovative technologies and strong partnerships, we aim to make travel easy and rewarding for everyone.

Get to Know our Team
In Agoda's Back End Engineering department, we build the scalable, fault-tolerant systems and APIs that host our core business logic. Our systems cover all major areas of our business: inventory and pricing, product information, customer data, communications, partner data, booking systems, payments, and more. These mission-critical systems change frequently, with dozens of releases per day, so we must employ state-of-the-art CI/CD and testing techniques to make sure everything works without any downtime. We also ensure that our systems are self-healing, responding gracefully to extreme loads or unexpected input. To accomplish this, we use state-of-the-art languages like Scala and Go, data technologies like Kafka and Aerospike, and agile development practices. Most importantly, though, we hire great people from all around the world and empower them to be successful. Whether it's building new projects like Flights and Packages or reimagining our existing business, you'll make a big impact as part of the Back End Engineering team.

The Opportunity
Agoda is looking for developers to work on mission-critical systems involved in the design and development of APIs that serve millions of user search requests a day.

In this Role, you'll get to
- Lead development of features, experiments, technical projects and complex systems
- Be a technical architect, mentor, and driver towards the right technology
- Continue to evolve our architecture and build better software
- Be a major contributor to our agile and scrum practices
- Get involved with software engineering and collaborate with server, other client, and infrastructure technical team members to build the best solution
- Constantly look for ways to improve our products, code base and development practices
- Write great code and help others write great code
- Drive technical decisions in the organization

What You'll Need To Succeed
- 7+ years' experience developing performance-critical applications that run in a production environment using Scala, Java or C#
- Experience in leading projects, initiatives and/or teams, with full ownership of the systems involved
- Experience with data platforms like SQL, Cassandra or Hadoop, and an understanding that different applications have different data requirements
- Good understanding of algorithms and data structures
- Strong coding ability
- Passion for the craft of software development and constant work to improve your knowledge and skills
- Excellent verbal and written English communication skills

It's Great If You Have
- Experience with Scrum/Agile development methodologies
- Experience building large-scale distributed products
- Core engineering infrastructure tools like Git for source control, TeamCity for Continuous Integration and Puppet for deployment
- Hands-on experience with technologies like queueing systems (Kafka, RabbitMQ, ActiveMQ, MSMQ), Spark, Hadoop, NoSQL (Cassandra, MongoDB), the Play framework, and the Akka library

Equal Opportunity Employer
At Agoda, we pride ourselves on being a company represented by people of all different backgrounds and orientations. We prioritize attracting diverse talent and cultivating an inclusive environment that encourages collaboration and innovation. Employment at Agoda is based solely on a person's merit and qualifications. We are committed to providing equal employment opportunity regardless of sex, age, race, color, national origin, religion, marital status, pregnancy, sexual orientation, gender identity, disability, citizenship, veteran or military status, and other legally protected characteristics.

We will keep your application on file so that we can consider you for future vacancies, and you can always ask to have your details removed from the file. For more details please read our privacy policy.

Disclaimer
We do not accept any terms or conditions, nor do we recognize any agency's representation of a candidate, from unsolicited third-party or agency submissions. If we receive unsolicited or speculative CVs, we reserve the right to contact and hire the candidate directly without any obligation to pay a recruitment fee.
Posted 5 days ago
8.0 - 10.0 years
20 - 35 Lacs
Ahmedabad
Remote
We are seeking a talented and experienced Senior Data Engineer to join our team and contribute to building a robust data platform on Azure Cloud. The ideal candidate will have hands-on experience designing and managing data pipelines, ensuring data quality, and leveraging cloud technologies for scalable and efficient data processing. The Data Engineer will design, develop, and maintain scalable data pipelines and systems to support the ingestion, transformation, and analysis of large datasets. The role requires a deep understanding of data workflows, cloud platforms (Azure), and strong problem-solving skills to ensure efficient and reliable data delivery.

Key Responsibilities
- Data Ingestion and Integration: Develop and maintain data ingestion pipelines using tools like Azure Data Factory, Databricks, and Azure Event Hubs. Integrate data from various sources, including APIs, databases, file systems, and streaming data.
- ETL/ELT Development: Design and implement ETL/ELT workflows to transform and prepare data for analysis and storage in the data lake or data warehouse (see the sketch following this listing). Automate and optimize data processing workflows for performance and scalability.
- Data Modeling and Storage: Design data models for efficient storage and retrieval in Azure Data Lake Storage and Azure Synapse Analytics. Implement best practices for partitioning, indexing, and versioning in data lakes and warehouses.
- Quality Assurance: Implement data validation, monitoring, and reconciliation processes to ensure data accuracy and consistency. Troubleshoot and resolve issues in data pipelines to ensure seamless operation.
- Collaboration and Documentation: Work closely with data architects, analysts, and other stakeholders to understand requirements and translate them into technical solutions. Document processes, workflows, and system configurations for maintenance and onboarding purposes.
- Cloud Services and Infrastructure: Leverage Azure services like Azure Data Factory, Databricks, Azure Functions, and Logic Apps to create scalable and cost-effective solutions. Monitor and optimize Azure resources for performance and cost management.
- Security and Governance: Ensure data pipelines comply with organizational security and governance policies. Implement security protocols using Azure IAM, encryption, and Azure Key Vault.
- Continuous Improvement: Monitor existing pipelines and suggest improvements for better efficiency, reliability, and scalability. Stay updated on emerging technologies and recommend enhancements to the data platform.

Skills
- Strong experience with Azure Data Factory, Databricks, and Azure Synapse Analytics.
- Proficiency in Python, SQL, and Spark.
- Hands-on experience with ETL/ELT processes and frameworks.
- Knowledge of data modeling, data warehousing, and data lake architectures.
- Familiarity with REST APIs, streaming data (Kafka, Event Hubs), and batch processing.

Good to have:
- Experience with tools like Azure Purview, Delta Lake, or similar governance frameworks.
- Understanding of CI/CD pipelines and DevOps tools like Azure DevOps or Terraform.
- Familiarity with data visualization tools like Power BI.

Competencies: Analytical thinking, clear and effective communication, time management, team collaboration, technical proficiency, supervising others, problem solving, risk management, organizing and task management, creativity/innovation, honesty/integrity.

Education: Bachelor's degree in Computer Science, Data Science, or a related field. 8+ years of experience in a data engineering or similar role.
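As a hedged illustration of the ETL/ELT step referenced above, here is a minimal Spark/Scala sketch assuming a Databricks environment with Delta Lake available. The storage account, containers, and columns are hypothetical.

```scala
// Minimal sketch (assuming Databricks + Delta Lake): ingest raw JSON from
// ADLS Gen2, cleanse, and write a partitioned Delta table. All storage
// names and columns are hypothetical.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object EventIngest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("event-ingest").getOrCreate()

    // Raw JSON landed in Azure Data Lake Storage Gen2.
    val raw = spark.read.json(
      "abfss://raw@mydatalake.dfs.core.windows.net/events/2025/")

    // Basic cleansing and typing before the curated zone.
    val cleaned = raw
      .filter(col("event_id").isNotNull)
      .withColumn("event_date", to_date(col("event_ts")))
      .dropDuplicates("event_id")

    // Write to a partitioned Delta table for downstream Synapse/BI use.
    cleaned.write
      .format("delta")
      .mode("append")
      .partitionBy("event_date")
      .save("abfss://curated@mydatalake.dfs.core.windows.net/events_delta")

    spark.stop()
  }
}
```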
Posted 5 days ago
7.0 years
0 Lacs
Lucknow, Uttar Pradesh, India
On-site
About Agoda
Agoda is an online travel booking platform for accommodations, flights, and more. We build and deploy cutting-edge technology that connects travelers with a global network of 4.7M hotels and holiday properties worldwide, plus flights, activities, and more. Based in Asia and part of Booking Holdings, our 7,100+ employees representing 95+ nationalities in 27 markets foster a work environment rich in diversity, creativity, and collaboration. We innovate through a culture of experimentation and ownership, enhancing the ability for our customers to experience the world.

Our Purpose – Bridging the World Through Travel
We believe travel allows people to enjoy, learn and experience more of the amazing world we live in. It brings individuals and cultures closer together, fostering empathy, understanding and happiness. We are a skillful, driven and diverse team from across the globe, united by a passion to make an impact. Harnessing our innovative technologies and strong partnerships, we aim to make travel easy and rewarding for everyone.

Get to Know our Team
In Agoda's Back End Engineering department, we build the scalable, fault-tolerant systems and APIs that host our core business logic. Our systems cover all major areas of our business: inventory and pricing, product information, customer data, communications, partner data, booking systems, payments, and more. These mission-critical systems change frequently, with dozens of releases per day, so we must employ state-of-the-art CI/CD and testing techniques to make sure everything works without any downtime. We also ensure that our systems are self-healing, responding gracefully to extreme loads or unexpected input. To accomplish this, we use state-of-the-art languages like Scala and Go, data technologies like Kafka and Aerospike, and agile development practices. Most importantly, though, we hire great people from all around the world and empower them to be successful. Whether it's building new projects like Flights and Packages or reimagining our existing business, you'll make a big impact as part of the Back End Engineering team.

The Opportunity
Agoda is looking for developers to work on mission-critical systems involved in the design and development of APIs that serve millions of user search requests a day.

In this Role, you'll get to
- Lead development of features, experiments, technical projects and complex systems
- Be a technical architect, mentor, and driver towards the right technology
- Continue to evolve our architecture and build better software
- Be a major contributor to our agile and scrum practices
- Get involved with software engineering and collaborate with server, other client, and infrastructure technical team members to build the best solution
- Constantly look for ways to improve our products, code base and development practices
- Write great code and help others write great code
- Drive technical decisions in the organization

What You'll Need To Succeed
- 7+ years' experience developing performance-critical applications that run in a production environment using Scala, Java or C#
- Experience in leading projects, initiatives and/or teams, with full ownership of the systems involved
- Experience with data platforms like SQL, Cassandra or Hadoop, and an understanding that different applications have different data requirements
- Good understanding of algorithms and data structures
- Strong coding ability
- Passion for the craft of software development and constant work to improve your knowledge and skills
- Excellent verbal and written English communication skills

It's Great If You Have
- Experience with Scrum/Agile development methodologies
- Experience building large-scale distributed products
- Core engineering infrastructure tools like Git for source control, TeamCity for Continuous Integration and Puppet for deployment
- Hands-on experience with technologies like queueing systems (Kafka, RabbitMQ, ActiveMQ, MSMQ), Spark, Hadoop, NoSQL (Cassandra, MongoDB), the Play framework, and the Akka library

Equal Opportunity Employer
At Agoda, we pride ourselves on being a company represented by people of all different backgrounds and orientations. We prioritize attracting diverse talent and cultivating an inclusive environment that encourages collaboration and innovation. Employment at Agoda is based solely on a person's merit and qualifications. We are committed to providing equal employment opportunity regardless of sex, age, race, color, national origin, religion, marital status, pregnancy, sexual orientation, gender identity, disability, citizenship, veteran or military status, and other legally protected characteristics.

We will keep your application on file so that we can consider you for future vacancies, and you can always ask to have your details removed from the file. For more details please read our privacy policy.

Disclaimer
We do not accept any terms or conditions, nor do we recognize any agency's representation of a candidate, from unsolicited third-party or agency submissions. If we receive unsolicited or speculative CVs, we reserve the right to contact and hire the candidate directly without any obligation to pay a recruitment fee.
Posted 5 days ago
15.0 - 19.0 years
0 Lacs
Pune, Maharashtra
On-site
Are you an analytic thinker who enjoys creating valuable insights with data? Do you want to play a key role in transforming our firm into an agile organization? At UBS, we re-imagine the way we work, connect with each other - our colleagues, clients, and partners - and deliver value. Being agile will make us more responsive, adaptable, and ultimately more innovative.

We are looking for a Data Engineer to transform data into valuable insights that inform business decisions, utilizing our internal data platforms and applying appropriate analytical techniques. You will be responsible for engineering reliable data pipelines for sourcing, processing, distributing, and storing data in different ways, effectively using data platform infrastructure. Additionally, you will develop, train, and apply machine-learning models to make better predictions, automate manual processes, and solve challenging business problems. Ensuring the quality, security, reliability, and compliance of our solutions by applying digital principles and implementing both functional and non-functional requirements is a key aspect of this role. You will also be involved in building observability into our solutions, monitoring production health, helping to resolve incidents, and remediating the root cause of risks and issues, while understanding, representing, and advocating for client needs.

The WMA Data Foundational Platforms & Services Crew is the fuel for the WMA CDIO, providing the foundational, disruptive, and modern platforms and technologies. The mission is rooted in the value proposition of a shared, foundational platform across UBS to maximize business value.

To be successful in this role, you should have a bachelor's degree in Computer Science, Engineering, or a related field, along with 15+ years of experience and strong proficiency with Azure cloud services related to data and analytics (Azure SQL, Data Lake, Data Factory, Databricks, etc.). Experience with SQL and data modeling, as well as familiarity with NoSQL databases, is essential. Proficiency in programming languages such as Python or Scala, and knowledge of data warehousing and data lake concepts and technologies, are also required.

UBS, the world's largest and only truly global wealth manager, operates through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management, and the Investment Bank. With a presence in all major financial centers in more than 50 countries, our global reach and expertise set us apart from our competitors.

At UBS, we value our people and their diverse skills, experiences, and backgrounds as the driving force behind our ongoing success. We are dedicated to our craft, passionate about putting our people first, and offer new challenges, a supportive team, opportunities to grow, and flexible working options when possible. Our inclusive culture brings out the best in our employees at every stage of their career journey. Collaboration is at the heart of everything we do because together, we are more than ourselves.

UBS is committed to disability inclusion; if you need reasonable accommodations or adjustments throughout our recruitment process, you can always contact us. UBS is an Equal Opportunity Employer that respects and seeks to empower each individual, supporting diverse cultures, perspectives, skills, and experiences within our workforce.
Posted 5 days ago
3.0 years
10 - 15 Lacs
Bengaluru, Karnataka, India
On-site
Key Responsibilities
- Partner with product managers, engineers, and business stakeholders to define KPIs and success metrics for Creator Success
- Create comprehensive dashboards and self-service analytics tools using QuickSight, Tableau, or similar BI platforms
- Perform deep-dive analysis on customer behavior, content performance, and livestream engagement patterns
- Design, build, and maintain robust ETL/ELT pipelines to process large volumes of streaming and batch data from the Creator Success platform (see the sketch following this listing)
- Develop and optimize data warehouses, data lakes, and real-time analytics systems using AWS services (Redshift, S3, Kinesis, EMR, Glue)
- Implement data quality frameworks and monitoring systems to ensure data accuracy and reliability

Qualifications
- Bachelor's degree in Computer Science, Engineering, Mathematics, Statistics, or a related quantitative field
- 3+ years of experience in business intelligence/analytics roles with proficiency in SQL, Python, and/or Scala
- Strong experience with AWS cloud services (Redshift, S3, EMR, Glue, Lambda, Kinesis)
- Expertise in building and optimizing ETL pipelines and data warehousing solutions
- Proficiency with big data technologies (Spark, Hadoop) and distributed computing frameworks
- Experience with business intelligence tools (QuickSight, Tableau, Looker) and data visualization best practices
- High proficiency in SQL and Python

Skills: AWS Lambda, QuickSight, Power BI, AWS S3, AWS, Tableau, AWS Kinesis, ETL, SQL, AWS Redshift, Scala, AWS EMR, business intelligence, Hadoop, Spark, AWS Glue, data warehousing, Python
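As an illustration of the ETL pipeline work referenced above, a minimal Spark/Scala aggregation job follows; the S3 buckets, event schema, and KPI definitions are invented for the example.

```scala
// Illustrative only: roll up livestream engagement events from S3 into
// daily per-creator KPIs for BI tools. Bucket names and columns are
// hypothetical.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object EngagementRollup {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("engagement-rollup").getOrCreate()

    val events = spark.read.parquet("s3a://creator-events/livestream/")

    // Daily engagement KPIs per creator.
    val daily = events
      .withColumn("event_date", to_date(col("event_ts")))
      .groupBy("event_date", "creator_id")
      .agg(
        countDistinct("viewer_id").as("unique_viewers"),
        sum("watch_seconds").as("total_watch_seconds"),
        count(when(col("event_type") === "like", true)).as("likes"))

    // Partitioned Parquet output, queryable by Redshift Spectrum or Athena
    // and ready to back a QuickSight dashboard.
    daily.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3a://creator-analytics/daily_engagement/")

    spark.stop()
  }
}
```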
Posted 5 days ago
8.0 - 12.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
The Applications Development Senior Programmer Analyst position is a vital role in which you will participate in establishing and implementing new or revised application systems and programs in collaboration with the Technology team. Your main goal will be to contribute to applications systems analysis and programming activities.

Your responsibilities will include conducting feasibility studies, time and cost estimates, IT planning, risk technology, applications development, model development, and implementing new or revised applications systems and programs to meet specific business needs or user areas. You will monitor and control all phases of the development process, including analysis, design, construction, testing, and implementation, and provide user and operational support on applications to business users.

You will utilize your in-depth specialty knowledge of applications development to analyze complex problems and issues, evaluate business processes, system processes, and industry standards, and make evaluative judgments. You will recommend and develop security measures in post-implementation analysis of business usage to ensure successful system design and functionality. Consulting with users/clients and other technology groups on issues, recommending advanced programming solutions, and assisting in the installation of customer exposure systems will also be part of your duties. Additionally, you will ensure that essential procedures are followed, help define operating standards and processes, and serve as an advisor or coach to new or lower-level analysts. You will operate with a limited level of direct supervision, exercise independence of judgment and autonomy, and act as a Subject Matter Expert (SME) to senior stakeholders and/or other team members.

Your qualifications should include 8+ years of development experience, with expertise in the Hadoop ecosystem, Java server-side development, Scala programming, Spark, data analysis using SQL, Python, and Linux; a financial background; proficiency in reporting tools such as Tableau; stakeholder management skills; and a history of delivering against agreed objectives. You should also be able to multitask, work under pressure, pick up new concepts and apply knowledge, and demonstrate problem-solving skills, with an enthusiastic and proactive approach, a willingness to learn, and excellent analytical and process-based skills. Ideally, you should hold a Bachelor's degree or equivalent experience.

Please note that this job description provides a high-level review of the types of work performed, and other job-related duties may be assigned as required.
Posted 5 days ago
7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Role: Big Data Developer
Location: Chennai
Experience: 7+ years
Work Mode: Work from Office

Key Skills Required
Google Cloud Platform (GCP): BigQuery (BQ), Dataflow, Dataproc, Cloud Spanner
Strong knowledge of distributed systems, data processing frameworks, and big data architecture
Proficiency in programming languages like Python, Java, or Scala

Roles and Responsibilities
BigQuery (BQ): Design and develop scalable data warehouses using BigQuery. Optimize SQL queries for performance and cost-efficiency in BigQuery. Implement data partitioning and clustering strategies.
Dataflow: Build and maintain batch and streaming data pipelines using Apache Beam on GCP Dataflow. Ensure data transformation, enrichment, and cleansing as per business needs. Monitor and troubleshoot pipeline performance issues.
Dataproc: Develop and manage Spark and Hadoop jobs on GCP Dataproc. Perform ETL/ELT operations using PySpark, Hive, or other tools. Automate and orchestrate jobs for scheduled data workflows.
Cloud Spanner: Design and manage globally distributed, scalable transactional databases using Cloud Spanner. Optimize schema and query design for performance and reliability. Implement high availability and disaster recovery strategies.
General Responsibilities: Collaborate with data architects, analysts, and business stakeholders to understand data requirements. Implement data quality and data governance best practices. Ensure security and compliance with GCP data handling standards. Participate in code reviews, CI/CD deployments, and Agile development cycles.
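For the BigQuery partitioning and clustering strategies mentioned above, here is a sketch of the kind of DDL involved, submitted through the google-cloud-bigquery Java client from Scala; the dataset, table, and columns are hypothetical, and a real deployment would manage such DDL via migrations or Terraform.

```scala
import com.google.cloud.bigquery.{BigQueryOptions, QueryJobConfiguration}

object CreatePartitionedTable {
  def main(args: Array[String]): Unit = {
    val bigquery = BigQueryOptions.getDefaultInstance.getService

    // Partition by event date and cluster by the usual filter columns, so
    // date-bounded queries scan only the partitions (and blocks) they need.
    val ddl =
      """CREATE TABLE IF NOT EXISTS analytics.page_events
        |(
        |  event_ts TIMESTAMP,
        |  user_id  STRING,
        |  page     STRING
        |)
        |PARTITION BY DATE(event_ts)
        |CLUSTER BY user_id, page
        |""".stripMargin

    bigquery.query(QueryJobConfiguration.newBuilder(ddl).build())
  }
}
```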
Posted 5 days ago
1.0 - 5.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a skilled Data Engineer, you will leverage your expertise to contribute to the development of data modeling, ETL processes, and reporting systems. With over 3 years of hands-on experience in areas such as ETL, BigQuery, SQL, Python, or Alteryx, you will play a crucial role in enhancing data engineering processes. Your advanced knowledge of SQL programming and database management will be key to ensuring the efficiency of data operations.

In this role, you will use your solid experience with Business Intelligence reporting tools like Power BI, Qlik Sense, Looker, or Tableau to create insightful reports and analytics. Your understanding of data warehousing concepts and best practices will enable you to design robust data solutions. Your problem-solving skills and attention to detail will be instrumental in addressing data quality issues and proposing effective BI solutions.

Collaboration and communication are essential aspects of this role, as you will work closely with stakeholders to define requirements and develop data-driven insights. Your ability to work both independently and as part of a team will be crucial to the successful delivery of projects. Additionally, your proactive approach to learning new tools and techniques will help you stay ahead in a dynamic environment.

Preferred skills include experience with GCP cloud services, Python, Hive, Spark, Scala, JavaScript, and various BI/reporting tools. Strong oral, written, and interpersonal communication skills will enable you to effectively convey insights and solutions to stakeholders. A Bachelor's degree in Computer Science, Computer Information Systems, or a related field is required for this role.

Overall, as a Data Engineer, you will play a vital role in developing and maintaining data pipelines, reporting systems, and dashboards. Your expertise in SQL, BI tools, and data validation will contribute to ensuring data accuracy and integrity across all systems, and your analytical mindset and ability to perform root cause analysis will be key to identifying opportunities for improvement and driving data-driven decision-making within the organization.
Posted 5 days ago
10.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Location: Gurgaon
Office Address: Floor 22, Tower C, Epitome Building No. 5, DLF Cyber City, DLF Phase 2, Gurgaon - 122002, Haryana, India

TBO – Travel Boutique Online Group (www.tbo.com)
TBO is a global platform that aims to simplify all buying and selling travel needs of travel partners across the world. The proprietary technology platform aims to simplify the demands of the complex world of global travel by seamlessly connecting the highly distributed travel buyers and travel suppliers at scale. The TBO journey began in 2006 with a simple goal – to address the evolving needs of travel buyers and suppliers – and what started off as a single-product air-ticketing company has today become the leading B2A (Business to Agents) travel portal across the Americas, UK & Europe, Africa, the Middle East, India, and Asia Pacific. Today, TBO's products range across air, hotels, rail, holiday packages, car rentals, transfers, sightseeing, cruise, and cargo. Beyond these products, our proprietary platform relies heavily on AI/ML to offer unique listings and products that meet specific customer requirements, thus increasing conversions. TBO's approach has always been technology-first, and we continue to invest in new innovations and offerings to make travel easy and simple. TBO's travel APIs serve large travel ecosystems across the world, while the modular architecture of the platform enables new travel products and expansion across new geographies.

Why TBO:
• You will influence & contribute to "Building the World's Largest Technology-Led Travel Distribution Network" for a $9 trillion global travel business market.
• We are the emerging leaders in technology-led end-to-end travel management in the B2B space.
• Physical presence in 47 countries with business in 110 countries.
• We are reputed for our long-lasting trusted relationships. We stand by our ecosystem of suppliers and buyers to service the end customer.
• An open & informal start-up environment which cares.

What TBO offers to a Life Traveller in You:
• Enhance your leadership acumen. Join the journey to create global scale and 'World's Best'.
• Challenge yourself to do something path-breaking. Be empowered. The only thing to stop you will be your imagination.
• As we enter the last phase of the pandemic, the travel space is likely to see significant growth. Witness and shape this space. It will be one exciting journey.

We are a tech-driven organization focused on leveraging data, AI, and scalable cloud infrastructure to drive impactful business decisions. We are looking for a highly skilled and experienced Head of Data Science and Engineering with a strong background in machine learning, AI, and big data architecture, ideally from a top-tier engineering institute.

Key Responsibilities:
Design, develop, and maintain robust, scalable, and high-performance data pipelines and ETL processes.
Architect and implement large-scale data infrastructure using tools such as Spark, Kafka, Airflow, and cloud platforms (AWS/GCP/Azure).
Deploy machine learning models into production.
Optimize data workflows to handle structured and unstructured data across various sources.
Develop and maintain metadata management, data quality checks, and observability.
Drive best practices in data engineering, data governance, and model monitoring.
Mentor junior team members and contribute to strategic technology decisions.

Must-Have Qualifications:
10+ years of experience in data engineering/data science, data architecture, or a related domain.
Strong expertise in Python/Scala/Java and SQL.
Proven experience with big data tools (Spark, Hadoop, Hive), streaming systems (Kafka, Flink), and workflow orchestration tools (Airflow, Prefect).
Deep understanding of data modeling, data warehousing, and distributed systems.
Strong exposure to ML/AI pipelines, MLOps, and model lifecycle management.
Experience with cloud platforms such as AWS (S3, Redshift, Glue), GCP (BigQuery, Dataflow), or Azure (Data Lake, Synapse).
Graduate/Postgraduate from a premium engineering institute (IITs, NITs, BITS, etc.).
Exposure to statistical modeling around pricing and churn management is a plus.
Exposure to fine-tuning LLMs is a plus.
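As a sketch of the streaming-pipeline work these responsibilities describe, here is a minimal Spark Structured Streaming job in Scala that consumes a hypothetical Kafka topic and lands micro-batches with checkpointing; the broker address, topic, and output paths are illustrative only.

```scala
import org.apache.spark.sql.SparkSession

object BookingStream {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("booking-stream").getOrCreate()

    // Consume a hypothetical bookings topic; requires the spark-sql-kafka
    // package on the classpath.
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "bookings")
      .load()

    // Kafka delivers key/value as binary; cast before parsing downstream.
    val bookings = raw.selectExpr("CAST(value AS STRING) AS json")

    // Land micro-batches as Parquet; the checkpoint gives exactly-once file output.
    bookings.writeStream
      .format("parquet")
      .option("path", "gs://example-bucket/bookings/")
      .option("checkpointLocation", "gs://example-bucket/checkpoints/bookings/")
      .start()
      .awaitTermination()
  }
}
```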
Posted 5 days ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We are looking for an experienced Scala Developer to join our growing engineering team. You will be responsible for developing backend components and microservices that form the core of our Digital BSS platform. This includes working on product catalog, order management, customer management, charging, and other telecom-critical domains.

Responsibilities:
Design and implement scalable, high-performance microservices using Scala (and Akka or Pekko).
Develop RESTful APIs and event-driven systems using Kafka and Akka Streams.
Participate in continuous integration and agile software development practices.
Maintain and enhance existing services and contribute to performance tuning.
Write clean, maintainable, and testable code with a focus on reliability and observability.
Write unit tests.

Required Skills & Qualifications:
Experience with Scala and functional programming principles.
Experience with Akka (Akka HTTP, Akka Streams) or Pekko (Pekko HTTP, Pekko Streams).
Good knowledge of RESTful APIs, JSON, and (optionally) GraphQL.
Proficiency in handling JSON data using Scala-based libraries (e.g., spray-json, play-json).
Proficiency in working with Kafka, gRPC, or other messaging/eventing systems.
Experience with PostgreSQL or other relational databases.
Familiarity with CI/CD tools like Jenkins, GitLab CI, or GitHub Actions.
Experience working with Docker, Kubernetes, and cloud platforms (AWS, Azure, or GCP) is a plus.
Agile development experience (Scrum, Kanban).
Bachelor's or master's degree in computer science or a related field.

Nice to Have:
Familiarity with telecom BSS platforms and tools (e.g., catalog management, order handling, customer management) is a strong advantage.
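A minimal sketch of the kind of Akka HTTP endpoint with spray-json marshalling this role involves; the Offer type, route, and port are hypothetical and not part of the actual platform.

```scala
import akka.actor.typed.ActorSystem
import akka.actor.typed.scaladsl.Behaviors
import akka.http.scaladsl.Http
import akka.http.scaladsl.marshallers.sprayjson.SprayJsonSupport._
import akka.http.scaladsl.server.Directives._
import spray.json.DefaultJsonProtocol._
import spray.json.RootJsonFormat

// Hypothetical catalog entry for a product-catalog microservice.
final case class Offer(id: String, name: String, priceCents: Long)

object CatalogService {
  implicit val offerFormat: RootJsonFormat[Offer] = jsonFormat3(Offer)

  def main(args: Array[String]): Unit = {
    implicit val system: ActorSystem[Nothing] = ActorSystem(Behaviors.empty, "catalog")

    // GET /offers/{id} returns a JSON-marshalled Offer (lookup stubbed for brevity).
    val route =
      path("offers" / Segment) { id =>
        get {
          complete(Offer(id, "5G Unlimited", 49900L))
        }
      }

    Http().newServerAt("0.0.0.0", 8080).bind(route)
  }
}
```

The same route structure carries over to Pekko HTTP with the `org.apache.pekko` package names.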
Posted 5 days ago
4.0 - 8.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a GCP Developer, you will be responsible for maintaining the stability of production platforms, delivering new features, and minimizing technical debt across various technologies. You should have a minimum of 4 years of experience in the field, a strong commitment to maintaining high standards, and a genuine passion for ensuring quality in your work.

Proficiency in GCP, Python, Hadoop, Spark, cloud, Scala, streaming (Pub/Sub), Kafka, SQL, Dataproc, and Dataflow is essential for this role. Familiarity with data warehouses, distributed data platforms, and data lakes is also required, along with knowledge of database definition, schema design, and Looker Views and Models. An understanding of data structures and algorithms is crucial for success in this position, and experience with CI/CD practices would be advantageous.

This position involves working in a dynamic environment across multiple locations, including Chennai, Hyderabad, and Bangalore. A total of 20 positions are available for qualified candidates.
Posted 5 days ago
3.0 - 5.0 years
0 Lacs
Mumbai, Maharashtra
On-site
MS - Banking & FS | Mumbai
Posted On: 29 Jul 2025
End Date: 27 Sep 2025
Required Experience: 3 - 5 Years

Basic Section
No. of Openings: 1
Designation: Test Engineer
Closing Date: 27 Sep 2025

Organisational
Main BU: Quality Engineering
Sub BU: MS - Banking & FS
Country: India
Region: MEA
State: Maharashtra
City: Mumbai
Working Location: Mumbai
Client Location: NA
Skill: JAVA

JOB DESCRIPTION
Previous experience working as a QA automation engineer (2+ years of experience).
Advanced programming skills, including test automation tools and CI/CD integration.
Familiarity with scripting languages, including Python and Spark.
Expertise in data testing using Java/Scala, SQL, NoSQL, and ETL processes.
Databricks Delta Lake experience.
Strong grounding in database and data warehousing concepts.
Proficiency in statistical procedures, experiments, and machine learning techniques.
Knowledge of the basics of data analytics and data modelling.
Ability to develop test automation frameworks.
Ability to work as an individual contributor.
Strong attention to detail.
Familiarity with Git or other version control systems.
Understanding of Agile development methodologies.
Willingness to switch to manual testing whenever required.
Excellent analytical and problem-solving skills.
Detailed knowledge of application functions, bug fixing, and testing protocols.
Good written and verbal communication skills.
Azure Data Studio experience; able to run tests using Python/PySpark-based frameworks. Knowing Java will be an advantage.
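As an illustration of the data-testing skills listed, here is a small ScalaTest sketch that reconciles a curated table against its landing source with Spark; the table names and paths are invented, and a real suite would run against a managed test environment rather than production data.

```scala
import org.apache.spark.sql.SparkSession
import org.scalatest.funsuite.AnyFunSuite

// Reconciliation-style checks: the curated table should neither drop
// nor duplicate source records.
class TradeLoadSpec extends AnyFunSuite {

  private val spark = SparkSession.builder()
    .master("local[*]")
    .appName("trade-load-spec")
    .getOrCreate()

  test("curated table preserves source row count and key uniqueness") {
    val source  = spark.read.parquet("/data/landing/trades/")
    val curated = spark.table("curated.trades")

    // Row-count parity between landing and curated layers.
    assert(curated.count() === source.count())
    // Primary-key uniqueness in the curated layer.
    assert(curated.select("trade_id").distinct().count() === curated.count())
  }
}
```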
Posted 5 days ago
0.0 - 18.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Job Information
Date Opened: 07/28/2025
Job Type: Full time
Work Experience: 10-18 years
Industry: Technology
Number of Positions: 1
City: Chennai
State/Province: Tamil Nadu
Country: India
Zip/Postal Code: 600086

About Us
Why a career in Zuci is unique! Constant attention is the source of our perfection. We fundamentally believe that building a career is all about consistency. If you jog or walk for a few days, it won't bring in big results. If you do the right things every day for hundreds of days, you'll become lighter, more flexible, and you'll start enjoying your work and life more. Our customers trust us because of our unwavering consistency, enabling us to deliver high-quality work and thereby give our customers and Team Zuci the best shot at extraordinary outcomes. Do you see the big picture? Is Digital Engineering your forte?

Job Description
Solution Architect – Data & AI (GCP + AdTech Focus)
Experience: 15+ Years
Employment Type: Full Time

Role Overview:
We are seeking a highly experienced Solution Architect with deep expertise in Google Cloud Platform (GCP) and a proven track record in architecting data and AI solutions for the AdTech industry. This role will be pivotal in designing scalable, real-time, and privacy-compliant solutions for programmatic advertising, customer analytics, and AI-driven personalization. The ideal candidate should blend strong technical architecture capabilities with deep domain expertise in advertising technology and digital marketing ecosystems.

Key Responsibilities:
Architect and lead GCP-native data and AI solutions tailored to AdTech use cases such as real-time bidding, campaign analytics, customer segmentation, and lookalike modeling.
Design high-throughput data pipelines, audience data lakes, and analytics platforms leveraging GCP services like BigQuery, Dataflow, Pub/Sub, Cloud Storage, Vertex AI, etc.
Collaborate with ad operations, marketing teams, and digital product owners to understand business goals and translate them into scalable and performant solutions.
Integrate with third-party AdTech and MarTech platforms, including DSPs, SSPs, CDPs, DMPs, ad exchanges, and identity resolution systems.
Ensure architectural alignment with data privacy regulations (GDPR, CCPA) and support consent management and data anonymization strategies.
Drive technical leadership across multi-disciplinary teams (Data Engineering, MLOps, Analytics) and enforce best practices in data governance, model deployment, and cloud optimization.
Lead discovery workshops, solution assessments, and architecture reviews during pre-sales and delivery cycles.

GCP & AdTech Tech Stack Expertise:
BigQuery, Cloud Pub/Sub, Dataflow, Dataproc, Cloud Composer (Airflow), Vertex AI, AI Platform, AutoML, Cloud Functions, Cloud Run, Looker, Apigee, Dataplex, GKE.
Deep understanding of programmatic advertising (RTB, OpenRTB), cookie-less identity frameworks, and AdTech/MarTech data flows.
Experience integrating or building components such as Data Management Platforms (DMPs), Customer Data Platforms (CDPs), Demand-Side Platforms (DSPs), ad servers, attribution engines, and real-time bidding pipelines.
Event-driven and microservices architecture using APIs, streaming pipelines, and edge delivery networks.
Integration with platforms like Google Marketing Platform, Google Ads Data Hub, Snowplow, Segment, or similar.
Strong understanding of IAM, data encryption, PII anonymization, and regulatory compliance (GDPR, CCPA, HIPAA if applicable).
Experience with CI/CD pipelines (Cloud Build), Infrastructure as Code (Terraform), and MLOps pipelines using Vertex AI or Kubeflow.
Strong experience in Python and SQL; familiarity with Scala or Java is a plus.
Experience with version control (Git), Agile delivery, and architectural documentation tools.
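To ground the real-time event integration this stack implies, here is a minimal Pub/Sub subscriber sketch in Scala using the Google Cloud client library; the project, subscription, and payload handling are hypothetical, and a production bidder would hand messages to an enrichment pipeline rather than print them.

```scala
import com.google.cloud.pubsub.v1.{AckReplyConsumer, MessageReceiver, Subscriber}
import com.google.pubsub.v1.{ProjectSubscriptionName, PubsubMessage}

object BidEventSubscriber {
  def main(args: Array[String]): Unit = {
    // Hypothetical project and subscription carrying real-time bid events.
    val subscription = ProjectSubscriptionName.of("adtech-project", "bid-events-sub")

    // Single-method Java interface, so a Scala lambda converts directly.
    val receiver: MessageReceiver = (message: PubsubMessage, consumer: AckReplyConsumer) => {
      println(s"bid event: ${message.getData.toStringUtf8}")
      consumer.ack() // ack only after the event is safely handled
    }

    val subscriber = Subscriber.newBuilder(subscription, receiver).build()
    subscriber.startAsync().awaitRunning()
    subscriber.awaitTerminated()
  }
}
```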
Posted 5 days ago
0.0 - 18.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Job Information
Date Opened: 07/24/2025
Job Type: Full time
Work Experience: 10-18 years
Industry: IT Services
Number of Positions: 1
City: Chennai
State/Province: Tamil Nadu
Country: India
Zip/Postal Code: 600001

About Us
Why a career in Zuci is unique! Constant attention is the source of our perfection. We fundamentally believe that building a career is all about consistency. If you jog or walk for a few days, it won't bring in big results. If you do the right things every day for hundreds of days, you'll become lighter, more flexible, and you'll start enjoying your work and life more. Our customers trust us because of our unwavering consistency, enabling us to deliver high-quality work and thereby give our customers and Team Zuci the best shot at extraordinary outcomes. Do you see the big picture? Is Digital Engineering your forte?

Job Description
Solution Architect (Data & AI Focus)
Location: Chennai/Bangalore
Experience: 15+ Years
Employment Type: Full Time

Role Description:
We are seeking a highly experienced and strategic Solution Architect with a strong focus on Data and AI. This role is critical for designing comprehensive, scalable, and robust technical solutions that integrate data engineering, business intelligence, and data science capabilities. The ideal candidate will be a thought leader in enterprise architecture, capable of translating complex business requirements into technical blueprints, guiding cross-functional teams, and ensuring the successful implementation of end-to-end data and AI solutions.

Responsibilities
Define the overall technical architecture for data and AI solutions, ensuring alignment with business strategy and enterprise architectural principles.
Design end-to-end data pipelines, data warehouses, data lakes, and AI/ML platforms, considering scalability, security, performance, and cost-effectiveness.
Provide technical leadership and guidance to Data Engineering, Business Intelligence, and Data Science teams, ensuring adherence to architectural standards and best practices.
Collaborate extensively with pre-sales, sales, marketing, and other departments to understand business needs, define technical requirements, and present solution proposals to clients and internal stakeholders.
Evaluate and recommend appropriate technologies, tools, and platforms (open source, commercial, cloud) for various data and AI initiatives.
Identify potential technical risks and challenges, proposing mitigation strategies and ensuring solution resilience.
Create detailed architectural documentation, including design specifications, data flow diagrams, and integration patterns.
Stay updated with the latest architectural patterns, cloud services, and industry trends in data and AI, driving continuous improvement and innovation.

Tools & Technologies
Cloud Architecture: Deep expertise across at least one major cloud provider (AWS, Azure, GCP) with a strong understanding of their data, analytics, and AI services.
Data Platforms: Snowflake, Databricks Lakehouse, Google BigQuery, Amazon Redshift, MS Fabric.
Integration Patterns: API gateways, microservices, event-driven architectures, message queues (Kafka, RabbitMQ, SQS, Azure Service Bus).
Data Modeling: Advanced data modeling techniques (dimensional modeling, Data Vault, entity-relationship modeling).
Security & Compliance: Understanding of data security best practices, compliance regulations (GDPR, HIPAA), and cloud security frameworks.
DevOps/MLOps: CI/CD pipelines, Infrastructure as Code (Terraform, CloudFormation), containerization (Docker, Kubernetes).
Programming Languages: Proficiency in at least one of Python, Java, or Scala for prototyping and architectural validation.
Version Control: Git.
Posted 5 days ago
6.0 - 10.0 years
0 Lacs
Delhi
On-site
As a Principal Data Engineer, you will lead our data engineering efforts by designing and building robust data pipelines, optimizing data architecture, and collaborating with cross-functional teams to drive data-driven decision-making.

Your main responsibilities will include designing, building, and maintaining scalable data pipelines for large datasets; collaborating with data scientists and analysts to understand data requirements and ensure data quality; implementing data modeling techniques and maintaining data architecture best practices; optimizing data storage and retrieval processes to improve performance and efficiency; monitoring data flow and troubleshooting data-related issues in a timely manner; and documenting data engineering processes while maintaining data dictionaries.

To be successful in this role, you should have 6-8 years of experience in data engineering or a related field. You should be proficient in programming languages such as Python, Java, or Scala; have strong experience with SQL and NoSQL databases (e.g., PostgreSQL, MongoDB); and have hands-on experience with big data technologies like Hadoop, Spark, or Kafka. Additionally, familiarity with cloud platforms such as AWS, Azure, or Google Cloud, knowledge of data warehousing solutions and ETL processes, experience with data modeling and data architecture principles, strong problem-solving skills, and the ability to work in a fast-paced environment are essential.

If you are passionate about data engineering and have the skills and qualifications mentioned above, we encourage you to apply for this exciting opportunity to lead our data engineering efforts and make a significant impact on our data-driven decision-making processes.
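As a small illustration of the Kafka experience this role asks for, here is a minimal Scala producer that publishes a keyed event; the broker address, topic, and payload are hypothetical.

```scala
import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

object EventProducer {
  def main(args: Array[String]): Unit = {
    val props = new Properties()
    props.put("bootstrap.servers", "broker:9092")
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    // Wait for all in-sync replicas so a broker failure cannot silently drop events.
    props.put("acks", "all")

    val producer = new KafkaProducer[String, String](props)
    try {
      // Keying by entity id preserves per-entity ordering within a partition.
      producer.send(new ProducerRecord("orders", "order-42", """{"status":"created"}"""))
    } finally {
      producer.close()
    }
  }
}
```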
Posted 5 days ago
4.0 - 8.0 years
0 Lacs
Hyderabad, Telangana
On-site
A minimum of 5 years of experience in Scala is required for the role of Scala Play Framework Guru. The role entails developing, maintaining, and optimizing backend systems built on Scala Play, leveraging your expertise in Scala and functional programming principles to write clean, efficient, and maintainable code.

Collaboration with the DevOps team to manage infrastructure on Google Cloud Platform is essential, along with expertise in CI/CD pipelines to streamline development processes for continuous delivery. Ensuring high availability, performance, scalability, and reliability of systems is a key responsibility, as is effective collaboration with front-end developers to design and implement robust APIs and services for dynamic web applications. Database management and optimization skills, particularly with MySQL, are crucial for efficient data storage and retrieval.

The ideal candidate has 4+ years of experience as a Scala developer, in-depth knowledge of Scala and the Play framework, and proficiency in Java and object-oriented programming principles; experience with GCP and cloud infrastructure management is a plus. Familiarity with CI/CD pipelines and a broad understanding of applications and functionalities are also beneficial, as are strong communication and collaboration skills for working effectively with cross-functional teams, along with a passion for building high-quality, scalable web applications.
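A minimal sketch of a Play Framework (Scala) controller of the kind this role involves; the controller name, JSON body, and route are hypothetical.

```scala
import javax.inject.Inject
import play.api.libs.json.Json
import play.api.mvc.{AbstractController, ControllerComponents}

// Assumed route wiring in conf/routes:
//   GET /health   controllers.HealthController.health
class HealthController @Inject()(cc: ControllerComponents) extends AbstractController(cc) {

  // Returns a small JSON document; Play handles content negotiation and threading.
  def health = Action {
    Ok(Json.obj("status" -> "ok"))
  }
}
```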
Posted 5 days ago