Jobs
Interviews

12 Trino Jobs

Set Up a Job Alert
JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

As a software developer at Salesforce, you will have the opportunity to contribute lines of code that have a significant and measurable positive impact on users, the company's bottom line, and the industry. Working alongside a team of world-class engineers, you will play a key role in building breakthrough features that our customers will love, adopt, and use, all while ensuring the stability and scalability of our trusted CRM platform. Your responsibilities will include architecture, design, implementation, and testing to ensure that our products are built correctly and released with high quality. You will also have the opportunity to engage in code review, mentor junior engineers, and provide technical guidance to the team, depending on your seniority level. At Salesforce, we take pride in writing high-quality, maintainable code that enhances product stability and streamlines our processes.

As a Lead Engineer, you will be tasked with building new and innovative components in a rapidly evolving market technology landscape to enhance scale and efficiency. Your role will involve developing high-quality, production-ready code that will be utilized by millions of users, making design decisions based on performance, scalability, and future expansion, and contributing to all phases of the software development life cycle. Additionally, you will work within a Hybrid Engineering model and collaborate on building efficient components in a microservices multi-tenant SaaS cloud environment.

To excel in this role, you should possess mastery of multiple programming languages and platforms, have at least 10 years of software development experience, demonstrate proficiency in object-oriented programming and scripting languages such as Java, Python, Scala, C#, Go, Node.js, and C++, and exhibit strong SQL skills with experience in relational and non-relational databases. Experience with developing SaaS applications on public cloud infrastructure such as AWS, Azure, and GCP, as well as competency in queues, locks, scheduling, event-driven architecture, workload distribution, and software development best practices, are also essential requirements.

Salesforce offers a comprehensive benefits package, including well-being reimbursement, generous parental leave, adoption assistance, fertility benefits, and more. You will have access to world-class enablement and on-demand training through Trailhead.com, exposure to executive thought leaders, and regular coaching sessions with leadership. Additionally, you will have the opportunity to participate in volunteer activities and contribute to the community through Salesforce's 1:1:1 model. For further information regarding benefits and perks, please visit https://www.salesforcebenefits.com/.

Posted 20 hours ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

About the Team
As a part of the DoorDash organization, you will be joining a data-driven team that values timely, accurate, and reliable data to make informed business and product decisions. Data serves as the foundation of DoorDash's success, and the Data Engineering team is responsible for building database solutions tailored to various use cases such as reporting, product analytics, marketing optimization, and financial reporting. By implementing robust data structures and data warehouse architecture, this team plays a crucial role in facilitating decision-making processes at DoorDash. Additionally, the team focuses on enhancing the developer experience by developing tools that support the organization's high-velocity demands.

About the Role
DoorDash is seeking a dedicated Data Engineering Manager to lead the development of enterprise-scale data solutions. In this role, you will serve as a technical expert on all aspects of data architecture, empowering data engineers, data scientists, and DoorDash partners. Your responsibilities will include fostering a culture of engineering excellence, enabling engineers to deliver reliable and flexible solutions at scale. Furthermore, you will be instrumental in building and nurturing a high-performing team, driving innovation and success in a dynamic and fast-paced environment.

In this role, you will:
- Lead and manage a team of data engineers, focusing on hiring, building, growing, and nurturing impactful business-focused data teams.
- Drive the technical and strategic vision for embedded pods and foundational enablers to meet current and future scalability and interoperability needs.
- Strive for continuous improvement of data architecture and development processes.
- Balance quick wins with long-term strategy and engineering excellence, breaking down large systems into user-friendly data assets and reusable components.
- Collaborate cross-functionally with stakeholders, external partners, and peer data leaders.
- Utilize effective planning and execution tools to ensure short-term and long-term team and stakeholder success.
- Prioritize reliability and quality as essential components of data solutions.

Qualifications:
- Bachelor's, Master's, or Ph.D. in Computer Science or equivalent field.
- Over 10 years of experience in data engineering, data platform, or related domains.
- Minimum of 2 years of hands-on management experience.
- Strong communication and leadership skills, with a track record of hiring and growing teams in a fast-paced environment.
- Proficiency in programming languages such as Python, Kotlin, and SQL.
- Prior experience with technologies like Snowflake, Databricks, Spark, Trino, and Pinot.
- Familiarity with the AWS ecosystem and large-scale batch/real-time ETL orchestration using tools like Airflow, Kafka, and Spark Streaming.
- Knowledge of data lake file formats including Delta Lake, Apache Iceberg, Glue Catalog, and S3.
- Proficiency in system design and experience with AI solutions in the data space.

At DoorDash, we are dedicated to fostering a diverse and inclusive community within our company and beyond. We believe that innovation thrives in an environment where individuals from diverse backgrounds, experiences, and perspectives come together. We are committed to providing equal opportunities for all and creating an inclusive workplace where everyone can excel and contribute to our collective success.
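As an illustration of the batch/real-time orchestration this posting mentions (Airflow coordinating ETL steps), here is a minimal sketch. It assumes Airflow 2.4+, and the dag_id and task callables are hypothetical placeholders, not any actual DoorDash pipeline:

```python
# Illustrative sketch only: a minimal Airflow 2.4+ DAG wiring a daily
# extract -> load pipeline, the shape of orchestration the posting alludes to.
# The dag_id and callables are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull one day's worth of events (e.g. from Kafka or S3).
    print("extracting partition", context["ds"])


def load(**context):
    # Placeholder: write the curated output to the warehouse or lakehouse.
    print("loading partition", context["ds"])


with DAG(
    dag_id="orders_daily_etl",          # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```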

Posted 4 days ago

Apply

4.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Software Engineer, you will play a crucial role in developing various tools and features for Data and ML platforms at the organization. Your responsibilities will include working on projects related to data processing, the insights portal, data observability, data lineage, the model hub, and data visualization. You will have the opportunity to either create custom solutions from scratch or customize existing open-source products to meet the specific needs of the company. The ideal candidate for this role is someone who enjoys tackling challenges, solving problems creatively, excels in collaborative environments, and is capable of delivering high-quality software within tight deadlines and constraints. This position will involve the development of innovative tools and frameworks that can enhance the capabilities of third-party BI tools through APIs.

To qualify for this role, you should have a minimum of 4 years of hands-on experience with programming languages such as Java, Python, or Scala. Additionally, you should possess expertise in designing and implementing scalable microservices and REST APIs, as well as hands-on experience with SQL and NoSQL databases. Experience in building and deploying cloud-native applications on platforms like AWS, GCP, or others is essential. Proficiency in DevOps tools, containers, and Kubernetes is also required. Strong communication and interpersonal skills are necessary for effective collaboration with cross-functional teams, along with a strong sense of ownership towards project deliverables.

Preferred qualifications for this role include knowledge of Big Data technologies and platforms, familiarity with distributed computing frameworks like Spark, and experience with SQL query engines such as Trino and Hive. Previous exposure to AI/ML and Data Science domains would be advantageous, as well as proficiency in JavaScript libraries and frameworks like React. Experience with Business Intelligence (BI) platforms like Tableau, ThoughtSpot, and Business Objects is considered a plus.

In terms of education and experience, a relevant academic background coupled with hands-on experience in software development is preferred. The successful candidate should be proactive, detail-oriented, and capable of working effectively in a fast-paced environment. If you are passionate about developing cutting-edge solutions in the field of data and ML, this role offers an exciting opportunity to contribute to impactful projects and drive innovation within the organization.
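For a concrete sense of the query-engine integration this posting references, here is a minimal sketch of querying Trino from Python. It assumes the open-source trino client package; the host, catalog, schema, and table names are hypothetical placeholders:

```python
# Minimal sketch: issuing an analytical query against a Trino cluster from
# Python, the kind of query-engine integration the posting mentions.
# Host, catalog, schema, and table names are hypothetical placeholders.
import trino

conn = trino.dbapi.connect(
    host="trino.internal.example.com",
    port=8080,
    user="platform-svc",
    catalog="hive",
    schema="analytics",
)
cur = conn.cursor()
cur.execute(
    """
    SELECT event_date, count(*) AS views
    FROM page_views
    WHERE event_date >= DATE '2024-01-01'
    GROUP BY event_date
    ORDER BY event_date
    """
)
for event_date, views in cur.fetchall():
    print(event_date, views)
```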

Posted 4 days ago

Apply

7.0 - 9.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Teamwork makes the stream work.

Roku is changing how the world watches TV
Roku is the #1 TV streaming platform in the U.S., Canada, and Mexico, and we've set our sights on powering every television in the world. Roku pioneered streaming to the TV. Our mission is to be the TV streaming platform that connects the entire TV ecosystem. We connect consumers to the content they love, enable content publishers to build and monetize large audiences, and provide advertisers unique capabilities to engage consumers. From your first day at Roku, you'll make a valuable - and valued - contribution. We're a fast-growing public company where no one is a bystander. We offer you the opportunity to delight millions of TV streamers around the world while gaining meaningful experience across a variety of disciplines.

About the team
Roku runs one of the largest data lakes in the world. We store over 70 PB of data, run 10M+ queries per month, and scan over 100 PB of data per month. The Big Data team is responsible for building, running, and supporting the platform that makes this possible. We provide all the tools needed to acquire, generate, process, monitor, validate, and access the data in the lake, for both streaming and batch data. We are also responsible for generating the foundational data. The systems we provide include Scribe, Kafka, Hive, Presto, Spark, Flink, Pinot, and others. The team is actively involved in Open Source, and we are planning to increase our engagement over time.

About the Role
Roku is in the process of modernizing its Big Data Platform. We are working on defining the new architecture to improve user experience, minimize cost, and increase efficiency. Are you interested in helping us build this state-of-the-art big data platform? Are you an expert with Big Data technologies? Have you looked under the hood of these systems? Are you interested in Open Source? If you answered yes to these questions, this role is for you!

What you will be doing
- You will be responsible for streamlining and tuning existing Big Data systems and pipelines and building new ones. Making sure the systems run efficiently and with minimal cost is a top priority.
- You will be making changes to the underlying systems and, if an opportunity arises, you can contribute your work back into open source.
- You will also be responsible for supporting internal customers and on-call services for the systems we host. Making sure we provide a stable environment and a great user experience is another top priority for the team.

We are excited if you have
- 7+ years of production experience building big data platforms based upon Spark, Trino, or equivalent
- Strong programming expertise in Java, Scala, Kotlin, or another JVM language
- A robust grasp of distributed systems concepts, algorithms, and data structures
- Strong familiarity with the Apache Hadoop ecosystem: Spark, Kafka, Hive/Iceberg/Delta Lake, Presto/Trino, Pinot, etc.
- Experience working with at least 3 of the technologies/tools mentioned here: Big Data / Hadoop, Kafka, Spark, Trino, Flink, Airflow, Druid, Hive, Iceberg, Delta Lake, Pinot, Storm, etc.
- Extensive hands-on experience with public cloud: AWS or GCP
- BS/MS degree in CS or equivalent
- AI literacy / AI growth mindset

Benefits
Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive benefits include global access to mental health and financial wellness support and resources. Local benefits include statutory and voluntary benefits which may include healthcare (medical, dental, and vision), life, accident, disability, commuter, and retirement options (401(k)/pension). Our employees can take time off work for vacation and other personal reasons to balance their evolving work and life needs. It's important to note that not every benefit is available in all locations or for every role. For details specific to your location, please consult with your recruiter.

The Roku Culture
Roku is a great place for people who want to work in a fast-paced environment where everyone is focused on the company's success rather than their own. We try to surround ourselves with people who are great at their jobs, who are easy to work with, and who keep their egos in check. We appreciate a sense of humor. We believe a fewer number of very talented folks can do more for less cost than a larger number of less talented teams. We're independent thinkers with big ideas who act boldly, move fast, and accomplish extraordinary things through collaboration and trust. In short, at Roku you'll be part of a company that's changing how the world watches TV. We have a unique culture that we are proud of. We think of ourselves primarily as problem-solvers, which itself is a two-part idea. We come up with the solution, but the solution isn't real until it is built and delivered to the customer. That penchant for action gives us a pragmatic approach to innovation, one that has served us well since 2002. To learn more about Roku, our global footprint, and how we've grown, visit https://www.weareroku.com/factsheet.

By providing your information, you acknowledge that you have read our Applicant Privacy Notice and authorize Roku to process your data subject to those terms.
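To illustrate the pipeline streamlining and cost tuning this role describes, here is a minimal PySpark sketch; the table, column names, and tuning values are hypothetical placeholders, not Roku's actual configuration:

```python
# Illustrative sketch only: the flavor of pipeline-tuning work described
# above: a small PySpark job that prunes partitions and right-sizes shuffle
# parallelism so a daily aggregation scans (and costs) as little as possible.
# Table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("daily-playback-rollup")
    # Right-size shuffles instead of relying on the 200-partition default.
    .config("spark.sql.shuffle.partitions", "64")
    # Let Spark coalesce tiny partitions at runtime.
    .config("spark.sql.adaptive.enabled", "true")
    .getOrCreate()
)

events = spark.read.table("lake.playback_events")

daily = (
    events
    .where(F.col("event_date") == "2024-01-01")   # partition pruning
    .groupBy("event_date", "channel_id")
    .agg(F.count("*").alias("plays"))
)

daily.write.mode("overwrite").saveAsTable("lake.playback_daily_rollup")
```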

Posted 5 days ago

Apply

2.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

As a BI Engineer at DoorDash, you will play a crucial role in building and scaling data models, pipelines, and self-service analytics for the Finance, Legal, and Public Relations teams. Your responsibilities will include collaborating with business partners and stakeholders to understand diverse data requirements, working with engineering and product teams to collect necessary data, and designing and implementing large-scale, high-performance data models and pipelines. You will focus on developing ETL pipelines, delivering insightful reports, and building dashboards to meet the evolving business needs of DoorDash. Additionally, you will be responsible for implementing data quality checks, conducting QA, and enhancing the reliability and scalability of ETL processes. As a BI Engineer, you will manage a portfolio of data products to ensure the delivery of high-quality and trustworthy data to support decision-making across the organization.

To excel in this role, you should have at least 2 years of professional experience in Business Intelligence, Data Engineering, or a similar field. Proficiency in Python, expertise in database fundamentals, SQL, and performance tuning, and experience with reporting and dashboarding tools like Tableau, Sigma, Looker, and Superset are essential. You should also have experience working with data platforms such as Snowflake, Trino, Databricks, PostgreSQL, and other DBMS platforms.

Moreover, strong communication and documentation skills, the ability to work effectively with technical and non-technical teams, and a proactive and self-organizing approach are key traits for success in this role. You should be comfortable working in a fast-paced environment, possess strategic thinking capabilities, and have the aptitude to analyze and interpret market and consumer information.

This position is based in Pune, India, and requires either local residence or willingness to relocate. Knowledge of programming languages such as Python would be a plus. If you are looking to contribute to a dynamic and fast-growing tech company like DoorDash, and are passionate about leveraging data to drive business decisions, this role offers an exciting opportunity for professional growth and impact.
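As an example of the data quality checks and QA this posting mentions, here is a minimal sketch of validating a pipeline output before publishing it to dashboards; the dataframe and column names are hypothetical placeholders:

```python
# Minimal sketch of a pre-publish data-quality gate: validate a pipeline
# output batch before it feeds reports and dashboards.
# Column names and thresholds are hypothetical placeholders.
import pandas as pd


def check_output(df: pd.DataFrame) -> list:
    """Return a list of human-readable failures; empty means the batch passes."""
    failures = []
    if df.empty:
        failures.append("output is empty")
    if df["order_id"].isna().any():
        failures.append("null order_id values found")
    if not df["order_id"].is_unique:
        failures.append("duplicate order_id values found")
    if (df["amount"] < 0).any():
        failures.append("negative amounts found")
    return failures


if __name__ == "__main__":
    batch = pd.DataFrame({"order_id": [1, 2, 3], "amount": [120.0, 75.5, 10.0]})
    problems = check_output(batch)
    print("PASS" if not problems else f"FAIL: {problems}")
```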

Posted 1 week ago

Apply

9.0 - 20.0 years

0 Lacs

Hyderabad, Telangana

On-site

Salesforce is offering immediate opportunities for software developers who are passionate about creating impactful code that benefits users, the company, and the industry. Join a team of talented engineers to develop innovative features that customers will love, while ensuring the stability and scalability of our CRM platform. The software engineer role at Salesforce involves architecture, design, implementation, and testing to deliver high-quality products. You will have the chance to engage in code review, mentor junior engineers, and provide technical guidance to the team, depending on your seniority level. We prioritize writing maintainable code that enhances product stability and efficiency. Our team values individual strengths and encourages personal growth, believing that autonomous teams lead to empowered individuals who drive success for the product, company, and customers.

Responsibilities for Principal, Lead, or Senior Engineers include:
- Developing new components in a rapidly evolving market to enhance scalability and efficiency
- Creating high-quality code for millions of application users
- Making design decisions based on performance and scalability considerations
- Contributing to all phases of the software development life cycle in a Hybrid Engineering model
- Building efficient components in a multi-tenant SaaS cloud environment
- Providing code review, mentorship, and technical guidance to junior team members

Required Skills:
- Proficiency in multiple programming languages and platforms
- 9 to 20 years of software development experience
- Domain knowledge in CCaaS/CPaaS/UCaaS
- Experience with WebRTC, SIP, and telephony layer protocols
- Strong object-oriented programming and scripting language skills
- Proficiency in SQL and relational/non-relational databases
- Development experience with SaaS applications on public cloud infrastructure
- Knowledge of queues, locks, event-driven architecture, and workload distribution
- Understanding of software development best practices and leadership skills
- Degree or equivalent relevant experience required

Benefits & Perks:
- Comprehensive benefits package including well-being reimbursement, parental leave, adoption assistance, and more
- Access to on-demand training with Trailhead.com
- Opportunities for exposure to executive leadership and coaching
- Participation in volunteer activities and community giving initiatives

For more information on benefits and perks, please visit https://www.salesforcebenefits.com/.

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Hyderabad, Telangana

On-site

Salesforce is looking for software developers who are eager to make a significant impact with their code for users, the company, and the industry. You will collaborate with a team of top-notch engineers to create innovative features that our customers will appreciate, adopt, and utilize, all while maintaining the stability and scalability of our trusted CRM platform. The software engineer role at Salesforce involves architecture, design, implementation, and testing to ensure the delivery of high-quality products.

Your responsibilities as a Lead Engineer will include:
- Developing new components in a rapidly growing technology market to enhance scalability and efficiency.
- Writing high-quality, production-ready code that can cater to millions of users.
- Making design decisions based on performance, scalability, and future growth.
- Contributing to all stages of the software development life cycle, including design, implementation, code reviews, automation, and testing.
- Building efficient components and algorithms in a microservice multi-tenant SaaS cloud environment.
- Conducting code reviews, mentoring junior engineers, and offering technical guidance to the team.

Required Skills:
- Proficiency in multiple programming languages and platforms.
- Over 10 years of experience in software development.
- Profound understanding of object-oriented programming and various scripting languages such as Java, Python, Scala, C#, Go, Node.js, and C++.
- Strong SQL skills and familiarity with relational and non-relational databases like Postgres, Trino, Redshift, and MongoDB.
- Experience in developing SaaS applications on public cloud infrastructures like AWS, Azure, and GCP.
- Knowledge of queues, locks, scheduling, event-driven architecture, workload distribution, as well as relational and non-relational databases.
- Understanding of software development best practices and demonstration of leadership skills.
- Degree or equivalent relevant experience required. Experience will be assessed based on core competencies relevant to the role.

Benefits & Perks:
Salesforce offers a comprehensive benefits package, including well-being reimbursement, generous parental leave, adoption assistance, fertility benefits, and more. Access to world-class enablement and on-demand training through Trailhead.com. Opportunities to engage with executive thought leaders and receive regular 1:1 coaching with leadership. Participation in volunteer activities and Salesforce's 1:1:1 model for community outreach. For further information, please visit https://www.salesforcebenefits.com/.

Posted 1 week ago

Apply

5.0 - 8.0 years

15 - 30 Lacs

Gurugram

Hybrid

We are hiring a DevOps Engineer.
Skills: Hadoop, Kafka, Kubernetes, Docker, cloud platforms
Experience: 5-8 years
Immediate joiners required. Share your CV at chhavi@anprax.com

Posted 2 weeks ago

Apply

4.0 - 9.0 years

4 - 9 Lacs

Hyderabad, Telangana, India

On-site

- Good experience in Apache Iceberg, Apache Spark, and Trino
- Proficiency in SQL and data modeling
- Experience with an open Data Lakehouse using Apache Iceberg
- Experience with Data Lakehouse architecture with Apache Iceberg and Trino
- Design and implement scalable Data Lakehouse solutions using Apache Iceberg and Trino to optimize data storage and query performance.
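To make the Iceberg-plus-Trino lakehouse work above more concrete, here is a minimal sketch that creates and queries a partitioned Iceberg table through Trino's Python client. It assumes an Iceberg connector mounted as an `iceberg` catalog; all connection details and table names are hypothetical placeholders:

```python
# Illustrative sketch: creating a partitioned Iceberg table through Trino
# and querying it back, assuming an `iceberg` catalog is configured on the
# cluster. Connection details and table names are hypothetical placeholders.
import trino

conn = trino.dbapi.connect(
    host="trino.example.com", port=8080, user="data-eng",
    catalog="iceberg", schema="lakehouse",
)
cur = conn.cursor()

# Partitioning by month keeps time-range scans narrow and cheap.
cur.execute("""
    CREATE TABLE IF NOT EXISTS orders (
        order_id    BIGINT,
        customer_id BIGINT,
        amount      DOUBLE,
        order_ts    TIMESTAMP(6)
    )
    WITH (partitioning = ARRAY['month(order_ts)'])
""")
cur.fetchall()  # drain the result to make sure the DDL has completed

cur.execute("""
    SELECT date_trunc('month', order_ts) AS month, sum(amount) AS revenue
    FROM orders
    GROUP BY 1
    ORDER BY 1
""")
print(cur.fetchall())
```

Provided they share the same Iceberg catalog, other engines such as Spark can read and write the same table, which is the core of the lakehouse design the listing describes.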

Posted 1 month ago

Apply

8.0 - 13.0 years

40 - 65 Lacs

Bengaluru

Work from Office

About the team
When 5% of Indian households shop with us, it's important to build resilient systems to manage millions of orders every day. We've done this – with zero downtime! Sounds impossible? Well, that's the kind of engineering muscle that has helped Meesho become the e-commerce giant that it is today. We value speed over perfection, and see failures as opportunities to become better. We've taken steps to inculcate a strong 'Founder's Mindset' across our engineering teams, making us grow and move fast. We place special emphasis on the continuous growth of each team member – and we do this with regular 1-1s and open communication. As Engineering Manager, you will be part of self-starters who thrive on teamwork and constructive feedback. We know how to party as hard as we work! If we aren't building unparalleled tech solutions, you can find us debating the plot points of our favourite books and games – or even gossiping over chai. So, if a day filled with building impactful solutions with a fun team sounds appealing to you, join us.

About the role
We are looking for a seasoned Engineering Manager well-versed with emerging technologies to join our team. As an Engineering Manager, you will ensure consistency and quality by shaping the right strategies. You will keep an eye on all engineering projects and ensure all duties are fulfilled. You will analyse other employees' tasks and carry on collaborations effectively. You will also transform newbies into experts and build reports on the progress of all projects.

What you will do
- Design tasks for other engineers, keeping Meesho's guidelines and standards in mind
- Keep a close eye on various projects and monitor the progress
- Drive excellence in quality across the organisation and solutioning of product problems
- Collaborate with the sales and design teams to create new products
- Manage engineers and take ownership of the project while ensuring product scalability
- Conduct regular meetings to plan and develop reports on the progress of projects

What you will need
- Bachelor's / Master's in Computer Science
- At least 8+ years of professional experience
- At least 4+ years' experience in managing software development teams
- Experience in building large-scale distributed systems
- Experience in scalable platforms
- Expertise in Java/Python/Go-Lang and multithreading
- Good understanding of Spark and its internals
- Deep understanding of transactional and NoSQL DBs
- Deep understanding of messaging systems – Kafka
- Good experience with cloud infrastructure – AWS preferably
- Ability to drive sprints and OKRs with good stakeholder management experience
- Exceptional team managing skills
- Experience in managing a team of 4-5 junior engineers
- Good understanding of streaming and real-time pipelines
- Good understanding of data modelling concepts and data quality tools
- Good knowledge of Business Intelligence tools: Metabase, Superset, Tableau, etc.
- Good to have: knowledge of Trino, Flink, Presto, Druid, Pinot, etc.
- Good to have: knowledge of data pipeline building

Posted 1 month ago

Apply

10.0 - 15.0 years

13 - 17 Lacs

Bengaluru

Work from Office

About the Role:
We are looking for a Senior Engineering Manager with 10+ years of experience and 2 years of people management experience to help scale and modernize Myntra's data platform. The ideal candidate will have a strong background in building scalable data platforms using a combination of open-source technologies and enterprise solutions. The role demands deep technical expertise in data ingestion, processing, serving, and governance, with a strategic mindset to scale the platform 10x to meet the ever-growing data needs across the organization. This is a high-impact role requiring innovation, engineering excellence, and system stability, with an opportunity to contribute to OSS projects and build data products leveraging available data assets.

Key Responsibilities:
- Design and scale Myntra's data platform to support growing data needs across analytics, ML, and reporting.
- Architect and optimize streaming data ingestion pipelines using Debezium, Kafka (Confluent), Databricks Spark, and Flink.
- Lead improvements in data processing and serving layers, leveraging Databricks Spark, Trino, and Superset.
- Bring a good understanding of open table formats like Delta and Iceberg.
- Scale data quality frameworks to ensure data accuracy and reliability.
- Build data lineage tracking solutions for governance, access control, and compliance.
- Collaborate with engineering, analytics, and business teams to identify opportunities and build / enhance self-serve data platforms.
- Improve system stability, monitoring, and observability to ensure high availability of the platform.
- Work with open-source communities and facilitate contributing to OSS projects aligned with Myntra's tech stack.
- Implement cost-efficient, scalable architectures for handling 10B+ daily events in a cloud environment.

Management Responsibilities:
- Technical Guidance: This role will play the engineering lead role for teams within Myntra Data Platform. You will provide technical leadership to a team of excellent data engineers; this requires that you have the technical depth to make complex design decisions and the hands-on ability to lead by example.
- Execution and Delivery: You will be expected to instill and follow good software development practices and ensure timely delivery of high-quality products. You should be familiar with agile practices as well as be able to adapt these to the needs of the business, with a constant focus on product quality.
- Team Management: You will be responsible for hiring and mentoring your team; helping individuals grow in their careers, having constant dialogue about their aspirations, and sharing prompt, clear, and actionable feedback about performance.

Qualifications:
- Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Experience: 10+ years of experience in building large-scale data platforms.
- 2+ years of people management experience.
- Expertise in big data architectures using Databricks, Trino, and Debezium.
- Strong experience with streaming platforms, including Confluent Kafka.
- Experience in data ingestion, storage, processing, and serving in a cloud-based environment.
- Experience implementing data quality checks using Great Expectations.
- Deep understanding of data lineage, metadata management, and governance practices.
- Strong knowledge of query optimization, cost efficiency, and scaling architectures.
- Familiarity with OSS contributions and keeping up with industry trends in data engineering.

Soft Skills:
- Strong analytical and problem-solving skills with a pragmatic approach to technical challenges.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Ability to lead large-scale projects in a fast-paced, dynamic environment.
- Passion for continuous learning, open-source collaboration, and building best-in-class data products.
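As an illustration of the streaming ingestion work described above (Kafka/Debezium change events landed into lakehouse tables), here is a minimal Spark Structured Streaming sketch; the topic, paths, and sink format are hypothetical placeholders rather than Myntra's actual stack:

```python
# Illustrative sketch only: the shape of a streaming ingestion pipeline the
# role describes: Kafka change events landed into a lake table with Spark
# Structured Streaming. Requires the spark-sql-kafka connector package on
# the classpath. Topic, paths, and table names are hypothetical placeholders,
# and parquet stands in for a Delta/Iceberg sink.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-cdc-ingest").getOrCreate()

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "kafka.example.com:9092")
    .option("subscribe", "cdc.orders")          # e.g. a Debezium topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key/value as binary; keep the payload plus ingest metadata.
events = raw.select(
    F.col("key").cast("string").alias("key"),
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp").alias("kafka_ts"),
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "s3a://lake/raw/orders/")
    .option("checkpointLocation", "s3a://lake/checkpoints/orders/")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```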

Posted 2 months ago

Apply

4.0 - 9.0 years

10 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

JD:
- Good experience in Apache Iceberg, Apache Spark, and Trino
- Proficiency in SQL and data modeling
- Experience with an open Data Lakehouse using Apache Iceberg
- Experience with Data Lakehouse architecture with Apache Iceberg and Trino
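To complement the listing above, here is a minimal sketch of writing to and reading from an Apache Iceberg table with Spark; it assumes the matching iceberg-spark-runtime package is available, and the catalog, warehouse path, and table names are hypothetical placeholders:

```python
# Illustrative sketch only: writing to and reading from an Iceberg table
# with Spark, assuming the matching iceberg-spark-runtime jar is available
# (e.g. via --packages). Catalog, warehouse path, and table names are
# hypothetical placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("iceberg-lakehouse-demo")
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.type", "hadoop")
    .config("spark.sql.catalog.lake.warehouse", "/tmp/iceberg-warehouse")
    .getOrCreate()
)

spark.sql("""
    CREATE TABLE IF NOT EXISTS lake.db.orders (
        order_id BIGINT, amount DOUBLE, order_date DATE
    ) USING iceberg
    PARTITIONED BY (order_date)
""")

spark.sql("""
    INSERT INTO lake.db.orders VALUES
    (1, 120.0, DATE '2024-01-01'),
    (2,  75.5, DATE '2024-01-02')
""")

# The same table can then be queried from Trino via its Iceberg connector,
# provided both engines point at the same catalog.
spark.sql(
    "SELECT order_date, sum(amount) FROM lake.db.orders GROUP BY order_date"
).show()
```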

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
