7.0 - 12.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Scala
Good-to-have skills: Java Enterprise Edition, Java Full Stack Development, .NET Full Stack Development
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements in Mumbai. You will collaborate with teams to ensure successful project delivery and contribute to key decisions.
Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for your immediate team and across multiple teams
- Lead the development and implementation of scalable applications
- Conduct code reviews and provide technical guidance to team members
- Stay updated with industry trends and technologies to enhance application development
Professional & Technical Skills:
- Must-have: proficiency in Scala
- Good-to-have: experience with Java Enterprise Edition
- Strong understanding of software development principles
- Hands-on experience in building and optimizing applications
- Knowledge of database management systems
- Familiarity with agile methodologies
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Scala
- This position is based at our Mumbai office
- 15 years of full-time education is required
Posted 6 days ago
4.0 - 9.0 years
0 - 2 Lacs
Pune
Hybrid
Role & responsibilities
Experience Range: 4 to 8 years
Work Location: Pune
Candidate should have:
- Strong Scala skills, with a focus on the functional programming paradigm
- A solid grasp of software development methods
- Experience designing, creating, and maintaining Scala-based applications
- Ability to use Scala to create, update, and analyse specifications of Scala-based applications across multiple systems, providing easier-to-use, improved UIs
- A basic understanding of how data is organized in HBase, Hive, and Oracle
- Familiarity with CI/CD build pipelines and the toolchain: Git, Bitbucket, TeamCity, Artifactory, Jira
- Prior work in an Agile/DevOps environment
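The functional-programming emphasis above can be shown with a minimal Scala sketch: immutable case classes, pure transformations, and Option in place of nulls. All names here are hypothetical, not taken from any employer's codebase.

```scala
// Minimal sketch of the functional style referenced above: immutable data,
// pure transformations, and Option instead of nulls. Names are illustrative.
final case class SpecRecord(id: String, system: String, value: Option[Double])

object SpecAnalyzer {
  // Pure function: filters and aggregates without mutating shared state.
  def averageBySystem(records: List[SpecRecord]): Map[String, Double] =
    records
      .collect { case SpecRecord(_, system, Some(v)) => system -> v }
      .groupMap(_._1)(_._2)
      .view
      .mapValues(vs => vs.sum / vs.size)
      .toMap

  def main(args: Array[String]): Unit = {
    val sample = List(
      SpecRecord("r1", "hbase", Some(2.0)),
      SpecRecord("r2", "hive", Some(4.0)),
      SpecRecord("r3", "hbase", Some(6.0)),
      SpecRecord("r4", "oracle", None) // missing values are skipped, not null-checked
    )
    println(averageBySystem(sample)) // Map(hbase -> 4.0, hive -> 4.0)
  }
}
```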
Posted 6 days ago
5.0 - 10.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Scala
Good-to-have skills: Java Enterprise Edition, Java Full Stack Development, .NET Full Stack Development
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements in a dynamic work environment.
Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for your immediate team and across multiple teams
- Lead the development and implementation of software solutions
- Conduct code reviews and ensure adherence to coding standards
- Troubleshoot and resolve complex technical issues
- Stay updated with the latest technologies and trends in software development
Professional & Technical Skills:
- Must-have: proficiency in Scala
- Good-to-have: experience with Java Enterprise Edition, Java Full Stack Development, .NET Full Stack Development
- Strong understanding of functional programming concepts
- Experience in building scalable, high-performance applications
- Knowledge of software development best practices and design patterns
Additional Information:
- The candidate should have a minimum of 5 years of experience in Scala
- This position is based at our Mumbai office
- 15 years of full-time education is required
Posted 6 days ago
3.0 - 8.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Scala
Good-to-have skills: Java Enterprise Edition, Java Full Stack Development
Minimum 3 years of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in developing innovative solutions to enhance business operations and user experience.
Roles & Responsibilities:
- Expected to perform independently and become an SME
- Active participation and contribution in team discussions is required
- Contribute to providing solutions to work-related problems
- Collaborate with cross-functional teams to analyze business requirements and translate them into technical solutions
- Develop and maintain scalable, efficient application code
- Conduct code reviews and provide constructive feedback to team members
- Troubleshoot and debug applications to ensure optimal performance
- Stay updated on industry trends and best practices to enhance application development processes
Professional & Technical Skills:
- Must-have: proficiency in Scala
- Good-to-have: experience with Java Enterprise Edition, Java Full Stack Development
- Strong understanding of functional programming concepts in Scala
- Experience in building and optimizing high-performance applications using Scala
- Knowledge of the software development lifecycle and agile methodologies
Additional Information:
- The candidate should have a minimum of 3 years of experience in Scala
- This position is based at our Mumbai office
- 15 years of full-time education is required
Posted 6 days ago
8.0 - 13.0 years
15 - 19 Lacs
Coimbatore
Work from Office
Project Role: Technology Architect
Project Role Description: Review and integrate all application requirements, including functional, security, integration, performance, quality, and operations requirements. Review and integrate the technical architecture requirements. Provide input into final decisions regarding hardware, network products, system software, and security.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: Cloud Data Architecture
Minimum 7.5 years of experience is required.
Educational Qualification: BE or MCA
Summary: As a Technology Architect, you will be responsible for reviewing and integrating all application requirements, including functional, security, integration, performance, quality, and operations requirements. Your typical day will involve reviewing and integrating technical architecture requirements, providing input into final decisions regarding hardware, network products, system software, and security, and using the Databricks Unified Data Analytics Platform to deliver impactful data-driven solutions.
Roles & Responsibilities:
- A minimum of 8 years of experience with the Databricks Unified Data Analytics Platform
- A strong educational background in technology and information architectures, along with a proven track record of delivering impactful data-driven solutions
- Strong requirement-analysis and technical-solutioning skills in Data and Analytics
- Client-facing experience: running solution workshops and client visits, handling large RFP pursuits, and managing multiple stakeholders
Technical Experience:
- 10 or more years of experience implementing data ingestion pipelines from multiple sources and creating end-to-end data pipelines on the Databricks platform
- 3 or more years of experience using Python, PySpark, or Scala
- Experience with Databricks in the cloud, on any of AWS, Azure, or GCP: ETL, data engineering, data cleansing, and insertion into a data warehouse
- Must-have skills: Databricks, Cloud Data Architecture, Python Programming Language, Data Engineering
Professional Attributes:
- Excellent writing, communication, and presentation skills
- Eagerness to learn and develop oneself on an ongoing basis
- Excellent client-facing and interpersonal skills
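As a rough illustration of the "data ingestion pipelines from multiple sources" this posting describes, here is a hedged Spark/Scala sketch for Databricks that reads CSV and JSON sources, aligns their columns, and lands a Delta table. Paths, column names, and the table name are assumptions for illustration only.

```scala
// Hedged sketch of a multi-source ingestion step on Databricks: read raw CSV
// and JSON, normalize to a common schema, and land a Delta table. All paths
// and names are placeholders, not any employer's actual setup.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object IngestOrders {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("ingest-orders").getOrCreate()

    val csvOrders = spark.read
      .option("header", "true").option("inferSchema", "true")
      .csv("/mnt/raw/orders_csv/")

    val jsonOrders = spark.read.json("/mnt/raw/orders_json/")

    // Align both sources on a common set of columns before unioning.
    val cols = Seq("order_id", "customer_id", "amount")
    val unified = csvOrders.select(cols.map(col): _*)
      .unionByName(jsonOrders.select(cols.map(col): _*))
      .withColumn("ingested_at", current_timestamp())

    // Delta is the default table format on Databricks.
    unified.write.format("delta").mode("append").saveAsTable("bronze.orders")
  }
}
```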
Posted 6 days ago
7.0 - 11.0 years
9 - 13 Lacs
Hyderabad
Work from Office
High proficiency and 8-10 years of experience in designing and developing data analytics and data warehouse solutions with Python, Azure Data Factory (ADF), and Azure Databricks.
- Should use automated testing frameworks for Python and PySpark rather than relying on manual testing alone
- Should have a solid understanding of Spark architecture and the Spark execution model, with experience optimizing code based on Spark monitoring features
- Experience designing large-scale data distribution, integration with service-oriented architecture and/or data warehouse solutions, and Data Lake solutions using Azure Databricks with large, multi-format data
- Ability to translate a working solution into an implementable package on the Azure platform
- Good understanding of Azure Storage Gen2
- Hands-on experience (minimum 5 years) with the Azure stack, including Azure Databricks and Azure Data Factory
- Proficient coding experience using Spark (Scala/Python) and T-SQL
- Understanding of services related to Azure Analytics, Azure SQL, Azure Function Apps, and Logic Apps
- Should demonstrate constant, quick learning and the ability to handle pressure without compromising on quality
- Power BI report development and analysis of SSRS reports; PBI data modelling experience is an advantage. Work involves report development as well as migration of SSRS reports.
Must have: Power BI (cloud SaaS), Paginated Report Builder, Power Query, data modeling
- Strong SQL scripting is required
- Well organized and able to manage multiple projects in a fast-paced, demanding environment
- Attention to detail and quality; excellent problem-solving and communication skills
- Ability and willingness to learn new tools and applications
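To make the Spark-monitoring point concrete, here is an illustrative Spark/Scala sketch (not the employer's code) of the kind of tuning a developer might apply after reading the Spark UI: broadcasting a small dimension table to avoid a shuffle, and caching a reused DataFrame. All paths and names are placeholders.

```scala
// Illustrative sketch of optimization informed by Spark monitoring: if the UI
// shows a shuffle-heavy sort-merge join, broadcast the small side; cache only
// when the same stage is visibly recomputed.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.broadcast

object TuneJoin {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("tune-join").getOrCreate()

    val facts = spark.read.format("delta").load("/mnt/curated/facts") // large
    val dims  = spark.read.format("delta").load("/mnt/curated/dims")  // small

    // Broadcast hint: ships the small table to every executor instead of
    // shuffling the large side across the cluster.
    val joined = facts.join(broadcast(dims), Seq("dim_id"))

    joined.cache() // reused below; avoids recomputing the join
    joined.groupBy("dim_id").count().write.mode("overwrite")
      .format("delta").save("/mnt/serving/counts")
    joined.unpersist()
  }
}
```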
Posted 6 days ago
15.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Organizations everywhere struggle under the crushing costs and complexities of "solutions" that promise to simplify their lives. To create a better experience for their customers and employees. To help them grow. Software is a choice that can make or break a business. Create better or worse experiences. Propel or throttle growth. Business software has become a blocker instead of a way to get work done.

There's another option: Freshworks. With a fresh vision for how the world works.

At Freshworks, we build uncomplicated service software that delivers exceptional customer and employee experiences. Our enterprise-grade solutions are powerful, yet easy to use and quick to deliver results. Our people-first approach to AI eliminates friction, making employees more effective and organizations more productive. Over 72,000 companies, including Bridgestone, New Balance, Nucor, S&P Global, and Sony Music, trust Freshworks' customer experience (CX) and employee experience (EX) software to fuel customer loyalty and service efficiency. And over 4,500 Freshworks employees make this possible, all around the world. Fresh vision. Real impact. Come build it with us.

Job Description
As a member of the Data Platform team, you'll be at the forefront of transforming how the Freshworks Datalake can be harnessed to the fullest in making data-driven decisions.

Key job responsibilities:
- Drive the backbone of our data platform by building robust pipelines that turn complex data into actionable insights using AWS and the Databricks platform
- Be a data detective by ensuring our data is clean, accurate, and trustworthy
- Write clean, efficient code that handles massive amounts of structured and unstructured data

Qualifications
- Proficiency in core technologies such as Scala and Kafka (any variant)
- Ability to write elegant, maintainable code, and comfort with picking up new technologies
- Proficiency working with distributed systems, with experience in distributed processing frameworks that handle data in batch and near real time, e.g. Spark
- Experience working with various AWS services and Databricks to build end-to-end data solutions that bring different systems together
- 8-15 years of experience in a related field

Additional Information
At Freshworks, we are creating a global workplace that enables everyone to find their true potential, purpose, and passion, irrespective of their background, gender, race, sexual orientation, religion, and ethnicity. We are committed to providing equal opportunity for all and believe that diversity in the workplace creates a more vibrant, richer work environment that advances the goals of our employees, communities, and the business.
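A minimal sketch, assuming Spark Structured Streaming with a Kafka source, of the kind of pipeline this role describes. The broker, topic, and storage paths are hypothetical.

```scala
// Hedged sketch: ingest Kafka events into a data lake with Spark Structured
// Streaming. Requires the spark-sql-kafka connector on the classpath.
import org.apache.spark.sql.SparkSession

object EventsStream {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("events-stream").getOrCreate()

    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092") // placeholder broker
      .option("subscribe", "product-events")            // placeholder topic
      .load()
      .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)", "timestamp")

    // Land raw events into the lake; checkpointing makes the stream restartable.
    events.writeStream
      .format("delta")
      .option("checkpointLocation", "/mnt/checkpoints/product-events")
      .start("/mnt/datalake/raw/product_events")
      .awaitTermination()
  }
}
```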
Posted 6 days ago
10.0 years
0 Lacs
New Delhi, Delhi, India
On-site
About Agoda
Agoda is an online travel booking platform for accommodations, flights, and more. We build and deploy cutting-edge technology that connects travelers with a global network of 4.7M hotels and holiday properties worldwide, plus flights, activities, and more. Based in Asia and part of Booking Holdings, our 7,100+ employees representing 95+ nationalities in 27 markets foster a work environment rich in diversity, creativity, and collaboration. We innovate through a culture of experimentation and ownership, enhancing the ability for our customers to experience the world.

Our Purpose - Bridging the World Through Travel
We believe travel allows people to enjoy, learn, and experience more of the amazing world we live in. It brings individuals and cultures closer together, fostering empathy, understanding, and happiness. We are a skillful, driven, and diverse team from across the globe, united by a passion to make an impact. Harnessing our innovative technologies and strong partnerships, we aim to make travel easy and rewarding for everyone.

Get to Know our Team
In Agoda's Back End Engineering department, we build the scalable, fault-tolerant systems and APIs that host our core business logic. Our systems cover all major areas of our business: inventory and pricing, product information, customer data, communications, partner data, booking systems, payments, and more. These mission-critical systems change frequently, with dozens of releases per day, so we must employ state-of-the-art CI/CD and testing techniques to make sure everything works without any downtime. We also ensure that our systems are self-healing, responding gracefully to extreme loads or unexpected input. To accomplish this, we use state-of-the-art languages like Scala and Go, data technologies like Kafka and Aerospike, and agile development practices. Most importantly, we hire great people from all around the world and empower them to be successful. Whether it's building new projects like Flights and Packages or reimagining our existing business, you'll make a big impact as part of the Back End Engineering team.

The Opportunity
The Agoda Platform team is looking for developers to work on mission-critical systems that serve millions of users daily. You will have the chance to work on innovative projects using cutting-edge technologies and make a significant impact on our business and the travel industry.

In this Role, you'll get to:
- Architect and develop highly scalable, mission-critical back end systems
- Own a big chunk of Agoda's system, all the way from the north-star vision down to the bytecode level
- Enable impactful collaboration and cross-team work on big projects, making a dent in the quality of our services, code, and architecture
- Provide thoughtful feedback, nurture an inclusive engineering environment, and champion engineering fundamentals
- Bring out the best in your fellow engineers
- Identify and implement opportunities for optimization across the technology stack, focusing on cost, efficiency, velocity, and developer happiness
- Exhibit technical leadership throughout the broader organization, conveying complex technical trade-offs to non-technical stakeholders such as business owners and C-suite executives

What you'll Need to Succeed:
- Overall experience of 10+ years in software engineering roles
- Proven hands-on experience owning production services, with significant impact on design, development, deployment, monitoring, and evolution
- Curiosity, staying abreast of technological improvements and open-source advancements
- Strong programming skills in languages such as Kotlin, Scala, Java, or C#
- Ability to perform deep research and make decisions on complex projects
- Ability to toggle easily between running as a lone wolf and working as part of a pack
- Strong communication skills, with the ability to explain complex technical details to stakeholders at all levels
- Beyond Back End expertise, an understanding of the challenges and trade-offs across the entire engineering universe, from Front End/Mobile to Data & Analytics
- Bachelor's degree in Computer Science, Engineering, or a related field

It's Great if you have:
- A Master's or Ph.D. in a technical field
- Experience with Kubernetes for effective container orchestration and scaling
- A deep understanding of CI/CD pipelines, automation tools, and practices relevant to machine learning
- Experience programming in Rust, C, or another low-level language

This position is based in Bangkok, Thailand. (Relocation support is provided.)

Equal Opportunity Employer
At Agoda, we pride ourselves on being a company represented by people of all different backgrounds and orientations. We prioritize attracting diverse talent and cultivating an inclusive environment that encourages collaboration and innovation. Employment at Agoda is based solely on a person's merit and qualifications. We are committed to providing equal employment opportunity regardless of sex, age, race, color, national origin, religion, marital status, pregnancy, sexual orientation, gender identity, disability, citizenship, veteran or military status, and other legally protected characteristics.

We will keep your application on file so that we can consider you for future vacancies, and you can always ask to have your details removed from the file. For more details, please read our privacy policy.

Disclaimer
We do not accept any terms or conditions, nor do we recognize any agency's representation of a candidate, from unsolicited third-party or agency submissions. If we receive unsolicited or speculative CVs, we reserve the right to contact and hire the candidate directly without any obligation to pay a recruitment fee.
Posted 6 days ago
3.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About The Position
Utilizes software engineering principles to deploy and maintain fully automated data transformation pipelines that combine a large variety of storage and computation technologies to handle a distribution of data types and volumes in support of data architecture design. A Data Engineer designs data products and data pipelines that are resilient to change, modular, flexible, scalable, reusable, and cost-effective.

Key Responsibilities
- Design, develop, and maintain data pipelines and ETL processes using Microsoft Azure services (e.g., Azure Data Factory, Azure Synapse, Azure Databricks, Azure Fabric).
- Utilize Azure data storage accounts for organizing and maintaining data pipeline outputs (e.g., Azure Data Lake Storage Gen2 and Azure Blob Storage).
- Collaborate with data scientists, data analysts, data architects, and other stakeholders to understand data requirements and deliver high-quality data solutions.
- Optimize data pipelines in the Azure environment for performance, scalability, and reliability.
- Ensure data quality and integrity through data validation techniques and frameworks.
- Develop and maintain documentation for data processes, configurations, and best practices.
- Monitor and troubleshoot data pipeline issues to ensure timely resolution.
- Stay current with industry trends and emerging technologies to ensure our data solutions remain cutting-edge.
- Manage the CI/CD process for deploying and maintaining data solutions.

Required Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience), with demonstrably high proficiency in programming fundamentals.
- 3-5 years of experience, including at least 2 years of proven experience as a Data Engineer or in a similar role dealing with data and ETL processes.
- Strong knowledge of Microsoft Azure services, including Azure Data Factory, Azure Synapse, Azure Databricks, Azure Blob Storage, and Azure Data Lake Gen2.
- Experience using SQL DML to query modern RDBMSs efficiently (e.g., SQL Server, PostgreSQL).
- Strong understanding of software engineering principles and how they apply to data engineering (e.g., CI/CD, version control, testing).
- Experience with big data technologies (e.g., Spark).
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.

Preferred Qualifications
- Learning agility
- Technical leadership
- Consulting on and managing business needs
- Strong experience in Python is preferred, but experience in other languages such as Scala, Java, or C# is accepted.
- Experience building Spark applications using PySpark.
- Experience with file formats such as Parquet, Delta, and Avro.
- Experience efficiently querying API endpoints as a data source.
- Understanding of the Azure environment and related services, such as subscriptions and resource groups.
- Understanding of Git workflows in software development.
- Experience using Azure DevOps pipelines and repositories to deploy and maintain solutions.
- Understanding of Ansible and how to use it in Azure DevOps pipelines.

Chevron ENGINE supports global operations and business requirements across the world. Accordingly, employee work hours are aligned to business requirements: the standard work week is Monday to Friday, with working hours of 8:00am to 5:00pm or 1:30pm to 10:30pm. Chevron participates in E-Verify in certain locations as required by law.
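As a small, hedged example of the ADLS Gen2 plumbing listed above, the sketch below reads Parquet from one container and writes a partitioned Delta table to another. The storage account, container, and column names are placeholders; a real job would also configure authentication, e.g. via a key vault.

```scala
// Hedged sketch: Parquet in, Delta out, on ADLS Gen2. All names are invented.
import org.apache.spark.sql.SparkSession

object LakeCopy {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("lake-copy").getOrCreate()

    // abfss:// is the ADLS Gen2 scheme: container@account.dfs.core.windows.net
    val src = "abfss://raw@mystorageacct.dfs.core.windows.net/sales/"
    val dst = "abfss://curated@mystorageacct.dfs.core.windows.net/sales_delta/"

    spark.read.parquet(src)
      .dropDuplicates("sale_id")        // basic data-quality gate
      .write.format("delta")
      .partitionBy("sale_date")         // partition for downstream query pruning
      .mode("overwrite")
      .save(dst)
  }
}
```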
Posted 6 days ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Us:
Marriott International Inc., headquartered in Bethesda, Maryland, USA, was founded in May 1927 by J. Willard Marriott and Alice S. Marriott with a modest nine-seat A&W root beer stand. Guided by the family's leadership and core principles, Marriott International has grown into a global hospitality giant, operating approximately 9,000 properties and over 30 leading brands in more than 140 countries and territories. From such humble beginnings to becoming the world's largest hotel company, Marriott International has never stopped searching for inventive ways to serve its customers, provide opportunities for its associates, and grow its business.

At the Marriott Tech Accelerator center (MTA) in Hyderabad, India, Marriott is exploring the world we live in and all its possibilities. At Marriott Tech Accelerator, we are a team of passionate engineering minds dedicated to creating and building cutting-edge solutions that streamline operations and elevate guest experiences. The Marriott Tech Accelerator center is fully owned and operated by ANSR. All associates at Marriott Tech Accelerator will be ANSR employees, delivering services exclusively to ANSR's client, Marriott International.

Job Summary:
Marriott International is seeking a Software Engineering Manager to lead a high-performing team within our Revenue Management and Pricing Technology organization. In this role, you will be responsible for managing the delivery of scalable, cloud-native applications that support critical business functions. You'll guide the team in full-stack development using technologies such as Java, Spring Boot, React, AWS, Docker, PostgreSQL, and Couchbase, while also playing a key role in shaping our platform modernization strategy. This position is ideal for a seasoned engineering leader with 8-10 years of experience, including prior experience managing engineers, driving technical execution, and fostering a culture of growth, collaboration, and accountability.

Key Responsibilities:
- Lead, coach, and develop a team of software engineers, providing regular feedback, career guidance, and performance evaluations.
- Own the delivery of complex software projects from planning through execution, ensuring high-quality and timely outcomes.
- Collaborate with product managers, architects, and business stakeholders to define technical direction and align engineering efforts with strategic goals.
- Foster a culture of engineering excellence through mentorship, technical leadership, and continuous improvement.
- Drive adoption of best practices in software development, including test automation, CI/CD, agile methodologies, and secure coding.
- Support team members in resolving technical challenges and removing roadblocks to productivity.
- Champion diversity, inclusion, and psychological safety within the team.
- Manage resource planning, hiring, and onboarding to support team growth and project needs.
- Ensure operational excellence by partnering with DevOps and infrastructure teams on deployment, monitoring, and incident response.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
- 8-10 years of professional software development experience, including:
  - A strong background in Java/JEE, Spring Boot, and microservices architecture.
  - Experience with front-end frameworks such as ReactJS.
  - Hands-on experience with cloud platforms (preferably AWS).
  - Familiarity with SQL and NoSQL databases (e.g., PostgreSQL, Couchbase, DocumentDB).
  - Experience with containerization and orchestration (Docker, Kubernetes/OpenShift).
  - Working experience building distributed systems using pub/sub streaming platforms like Kafka.
  - A solid understanding of DevOps tools and practices (Git, Harness/Jenkins, JIRA, CI/CD pipelines).
- 2+ years of experience in a formal engineering management or team lead role.
- Proven ability to lead and grow engineering teams, manage performance, and support career development.
- Strong project management skills, with experience in agile delivery and cross-functional collaboration.
- Excellent communication skills, with the ability to influence and align diverse stakeholders.

Preferred Qualifications:
- Experience with enterprise integration patterns (e.g., Apache Camel).
- Exposure to distributed systems and concurrency frameworks (e.g., Akka).
- Familiarity with big data tools and batch processing (e.g., Spark, Scala, Oozie, EMR).
- Background in travel, hospitality, or pricing systems.
- Experience contributing to architectural decisions and long-term technical strategy.
Posted 6 days ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Welcome to Warner Bros. Discovery… the stuff dreams are made of.

Who We Are…
When we say, "the stuff dreams are made of," we're not just referring to the world of wizards, dragons and superheroes, or even to the wonders of Planet Earth. Behind WBD's vast portfolio of iconic content and beloved brands are the storytellers bringing our characters to life, the creators bringing them to your living rooms, and the dreamers creating what's next… From brilliant creatives to technology trailblazers across the globe, WBD offers career-defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best self. Here you are supported, here you are celebrated, here you can thrive.

Analytics Engineer II - Hyderabad, India

About Warner Bros. Discovery
Warner Bros. Discovery, a premier global media and entertainment company, offers audiences the world's most differentiated and complete portfolio of content, brands, and franchises across television, film, streaming, and gaming. The new company combines WarnerMedia's premium entertainment, sports, and news assets with Discovery's leading non-fiction and international entertainment and sports businesses. For more information, please visit www.wbd.com.

Roles & Responsibilities
As an Analytics Engineer II, you will perform data analytics and data visualization-related efforts for the Data & Analytics organization at WBD. You're an engineer who not only understands how to use big data to answer complex business questions, but also how to design semantic layers to best support self-service vehicles. You will manage projects from requirements gathering to planning to implementation of full-stack data solutions (pipelines to data tables to visualizations) with the support of the larger team. You will work closely with cross-functional partners to ensure that business logic is properly represented in the semantic layer and production environments, where it can be used by the wider Data & Analytics team to drive business insights and strategy.
- Design and implement data models that support flexible querying and data visualization
- Partner with stakeholders to understand business questions and build out advanced analytical solutions
- Advance automation efforts that help the team spend less time manipulating and validating data and more time analyzing it
- Build frameworks that multiply the productivity of the team and are intuitive for other data teams to leverage
- Participate in the creation and support of analytics development standards and best practices
- Create systematic solutions for solving data anomalies: identification, alerting, and root cause analysis
- Work proactively with stakeholders to understand business needs and build data analytics capabilities, especially in large enterprise use cases
- Identify and explore new opportunities through creative analytical and engineering methods

What To Bring
- Bachelor's degree, MS, or greater in a quantitative field of study (Computer/Data Science, Engineering, Mathematics, Statistics, etc.)
- 3+ years of relevant experience in business intelligence/data engineering
- Expertise in writing SQL (clean, fast code is a must) and in data-warehousing concepts such as star schemas, slowly changing dimensions, ELT/ETL, and MPP databases
- Experience in transforming flawed/changing data into consistent, trustworthy datasets
- Experience with general-purpose programming (e.g., Python, Scala, or other), dealing with a variety of data structures, algorithms, and serialization formats, is a plus
- Experience with big-data technologies (e.g., Spark, Hadoop, Snowflake, etc.)
- Advanced ability to build reports and dashboards with BI tools (such as Looker, Tableau, or Power BI)
- Experience with analytics tools such as Athena, Redshift, BigQuery, or Snowflake
- Proficiency with Git (or similar version control) and CI/CD best practices is a plus
- Ability to write clear, concise documentation and to communicate with a high degree of precision
- Ability to solve ambiguous problems independently
- Ability to manage multiple projects and time constraints simultaneously
- Care for the quality of the input data and how the processed data is ultimately interpreted and used
- Prior experience in large enterprise use cases such as Sales Analytics, Financial Analytics, or Marketing Analytics
- Strong written and verbal communication skills

Characteristics & Traits
- Naturally inquisitive, a critical thinker, a proactive problem-solver, and detail-oriented
- Positive attitude and an open mind
- Strong organizational skills, with the ability to act independently and responsibly
- Self-starter, comfortable initiating projects from design to execution with minimal supervision
- Ability to manage and balance multiple (and sometimes competing) priorities in a fast-paced, complex business environment, managing time effectively to consistently meet deadlines
- Team player and relationship builder

What We Offer
- A great place to work
- Equal opportunity employer
- Fast-track growth opportunities

How We Get Things Done…
This last bit is probably the most important! Here at WBD, our guiding principles are the core values by which we operate and are central to how we get things done. You can find them at www.wbd.com/guiding-principles/ along with some insights from the team on what they mean and how they show up in their day to day. We hope they resonate with you and look forward to discussing them during your interview.

Championing Inclusion at WBD
Warner Bros. Discovery embraces the opportunity to build a workforce that reflects a wide array of perspectives, backgrounds and experiences. Being an equal opportunity employer means that we take seriously our responsibility to consider qualified candidates on the basis of merit, regardless of sex, gender identity, ethnicity, age, sexual orientation, religion or belief, marital status, pregnancy, parenthood, disability or any other category protected by law. If you're a qualified candidate with a disability and you require adjustments or accommodations during the job application and/or recruitment process, please visit our accessibility page for instructions to submit your request.
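The data-warehousing concepts listed above (star schemas, slowly changing dimensions) can be illustrated with a hedged Spark SQL sketch on Delta tables: a two-step Type 2 dimension load that first expires changed rows, then inserts fresh current versions. Table and column names are hypothetical.

```scala
// Hedged sketch of a slowly-changing-dimension (Type 2) load via Spark SQL on
// Delta tables. Step 1 closes out current rows whose tracked attribute
// changed; Step 2 inserts a new "current" row for changed and new keys.
import org.apache.spark.sql.SparkSession

object Scd2Load {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("scd2-load").getOrCreate()

    // Step 1: expire current dimension rows whose tracked attribute changed.
    spark.sql("""
      MERGE INTO dim_customer d
      USING stg_customer s
      ON d.customer_id = s.customer_id AND d.is_current = true
      WHEN MATCHED AND d.email <> s.email THEN
        UPDATE SET d.is_current = false, d.end_date = current_date()
    """)

    // Step 2: insert a fresh current row for every customer lacking one
    // (both brand-new customers and those just expired above).
    spark.sql("""
      INSERT INTO dim_customer
      SELECT s.customer_id, s.email, current_date() AS start_date,
             CAST(NULL AS DATE) AS end_date, true AS is_current
      FROM stg_customer s
      LEFT JOIN dim_customer d
        ON d.customer_id = s.customer_id AND d.is_current = true
      WHERE d.customer_id IS NULL
    """)
  }
}
```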
Posted 6 days ago
8.0 years
0 Lacs
India
Remote
Who We Are
At Twilio, we're shaping the future of communications, all from the comfort of our homes. We deliver innovative solutions to hundreds of thousands of businesses and empower millions of developers worldwide to craft personalized customer experiences. Our dedication to remote-first work and strong culture of connection and global inclusion means that no matter your location, you're part of a vibrant team with diverse experiences making a global impact each day. As we continue to revolutionize how the world interacts, we're acquiring new skills and experiences that make work feel truly rewarding. Your career at Twilio is in your hands.

See yourself at Twilio. Join the team as our next Staff Software Engineer, Warehouse Activation.

About The Job
This position provides technical leadership for a team focused on Twilio Segment Warehouse Activation. As a Staff Software Engineer in the Warehouse Activation group, you'll build and scale systems that process 1B+ rows per month, helping our customers unlock the value of our Customer Data Platform (CDP). The team you will help lead has a deep understanding of large distributed systems and data processing at scale. In addition, the systems you manage connect directly to customer data warehouses such as Snowflake and Databricks, so you will be responsible for understanding data warehouse APIs and how to best integrate with these systems at high scale. We iterate quickly on these products and features and learn new things daily, all while writing quality code. We work closely with product and design and solve some of the toughest engineering problems to unlock new possibilities for our customers. If you get excited by building products with high customer impact, this is the place for you.

Responsibilities
In this role, you will:
- Design and build the next generation of the Warehouse Activation platform, processing billions of events and powering various use cases for Twilio Segment customers. This encompasses working on stream data processing, storage, and other mission-critical systems.
- Ship features that opt for high availability and throughput with eventual consistency
- Collaborate with engineering and product leads, as well as teams across Twilio Segment
- Support the reliability and security of the platform
- Build and optimize globally available and highly scalable distributed systems
- Act as a team Tech Lead as needed
- Mentor other engineers on the team in technical architecture and design
- Partner with application teams to deliver end-to-end customer success

Qualifications
Not all applicants will have skills that match a job description exactly. Twilio values diverse experiences in other industries, and we encourage everyone who meets the required qualifications to apply. While having "desired" qualifications makes for a strong candidate, we encourage applicants with alternative experiences to also apply. If your career is just starting or hasn't followed a traditional path, don't let that stop you from considering Twilio. We are always looking for people who will bring something new to the table!

Required
- 8+ years of experience writing production-grade code in a modern programming language
- Strong theoretical fundamentals and hands-on experience designing and implementing highly available, performant, fault-tolerant distributed systems
- Experience programming in one or more of the following: Go, Java, Scala, or similar languages
- Well-versed in concurrent programming, along with a solid grasp of Linux systems and networking concepts
- Experience operating large-scale, distributed systems on top of cloud infrastructure such as Amazon Web Services (AWS) or Google Cloud Platform (GCP)
- Experience with message passing systems (e.g., Kafka, AWS Kinesis) and/or modern stream processing systems (e.g., Spark, Flink)
- Hands-on experience with container orchestration frameworks (e.g., Kubernetes, EKS, ECS)
- Experience shipping services (products) following the CI/CD development paradigm
- Deep understanding of architectural patterns of high-scale web applications (e.g., well-designed APIs, high-volume data pipelines, efficient algorithms)
- Ideally, domain expertise in the modern data stack, with experience developing cloud-based data solution components and architecture covering data ingestion, data processing, and data storage
- A track record of successfully leading teams or large projects, or having owned and built an important, complex system end to end, delivered iteratively
- Excellent written and verbal technical communication skills to convey complex technical concepts effectively

Location: This role will be remote and based in India (Karnataka, Maharashtra, New Delhi, Tamil Nadu, and Telangana).

What We Offer
There are many benefits to working at Twilio, including things like competitive pay, generous time-off, ample parental and wellness leave, healthcare, a retirement savings program, and much more. Offerings vary by location.

Twilio thinks big. Do you? We like to solve problems, take initiative, pitch in when needed, and are always up for trying new things. That's why we seek out colleagues who embody our values — something we call Twilio Magic. Additionally, we empower employees to build positive change in their communities by supporting their volunteering and donation efforts. So, if you're ready to unleash your full potential, do your best work, and be the best version of yourself, apply now! If this role isn't what you're looking for, please consider other open positions.

Twilio is proud to be an equal opportunity employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, reproductive health decisions, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, political views or activity, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Qualified applicants with arrest or conviction records will be considered for employment in accordance with the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act. Additionally, Twilio participates in the E-Verify program in certain locations, as required by law.
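This is not Twilio's implementation, but a hedged sketch of one core idea in warehouse activation: writing rows to a customer warehouse in batches rather than one network round trip per row. It uses plain JDBC with a placeholder Snowflake-style URL; a production system at the scale described above would more likely bulk-load via staged files.

```scala
// Hedged sketch: batched inserts into a customer warehouse over JDBC.
// URL, credentials, table, and events are all placeholders.
import java.sql.DriverManager

object WarehouseWriter {
  def main(args: Array[String]): Unit = {
    val conn = DriverManager.getConnection(
      "jdbc:snowflake://acct.snowflakecomputing.com/?db=ACTIVATION", "user", "pass")
    val stmt = conn.prepareStatement(
      "INSERT INTO events (user_id, event_name, ts) VALUES (?, ?, ?)")

    val events = Seq(("u1", "page_view", 1718000000L), ("u2", "click", 1718000005L))
    try {
      events.foreach { case (user, name, ts) =>
        stmt.setString(1, user); stmt.setString(2, name); stmt.setLong(3, ts)
        stmt.addBatch() // accumulate locally; one round trip per batch
      }
      stmt.executeBatch() // flush the whole batch in a single call
    } finally { stmt.close(); conn.close() }
  }
}
```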
Posted 6 days ago
5.0 years
0 Lacs
India
Remote
Who We Are
At Twilio, we're shaping the future of communications, all from the comfort of our homes. We deliver innovative solutions to hundreds of thousands of businesses and empower millions of developers worldwide to craft personalized customer experiences. Our dedication to remote-first work and strong culture of connection and global inclusion means that no matter your location, you're part of a vibrant team with diverse experiences making a global impact each day. As we continue to revolutionize how the world interacts, we're acquiring new skills and experiences that make work feel truly rewarding. Your career at Twilio is in your hands.

See yourself at Twilio. Join the team as our next Software Engineer (L3), Warehouse Activation.

About The Job
This position is needed to grow a team focused on Twilio Segment Warehouse Activation. As a Software Engineer (L3) in the Warehouse Activation group, you'll build and scale systems that process 1B+ rows per month, helping our customers unlock the value of our Customer Data Platform (CDP). The team you will contribute to has a deep understanding of large distributed systems and data processing at scale. In addition, the systems you manage connect directly to customer data warehouses such as Snowflake and Databricks, so you will be responsible for understanding data warehouse APIs and how to best integrate with these systems at high scale. We iterate quickly on these products and features and learn new things daily, all while writing quality code. We work closely with product and design and solve some of the toughest engineering problems to unlock new possibilities for our customers. If you get excited by building products with high customer impact, this is the place for you.

Responsibilities
In this role, you will:
- Design and build the next generation of the Warehouse Activation platform, processing billions of events and powering various use cases for Twilio Segment customers. This encompasses working on stream data processing, storage, and other mission-critical systems.
- Ship features that opt for high availability and throughput with eventual consistency
- Collaborate with engineering and product leads, as well as teams across Twilio Segment
- Support the reliability and security of the platform
- Build and optimize globally available and highly scalable distributed systems
- Mentor other engineers on the team in technical architecture and design
- Partner with application teams to deliver end-to-end customer success

Qualifications
Twilio values diverse experiences from all kinds of industries, and we encourage everyone who meets the required qualifications to apply. If your career is just starting or hasn't followed a traditional path, don't let that stop you from considering Twilio. We are always looking for people who will bring something new to the table!

Required
- 5+ years of experience writing production-grade code in a modern programming language
- Strong theoretical fundamentals and hands-on experience designing and implementing highly available, performant, fault-tolerant distributed systems
- Experience programming in one or more of the following: Go, Java, Scala, or similar languages
- Well-versed in concurrent programming, along with a solid grasp of Linux systems and networking concepts
- Experience operating large-scale, distributed systems on top of cloud infrastructure such as Amazon Web Services (AWS) or Google Cloud Platform (GCP)
- Experience with message passing systems (e.g., Kafka, AWS Kinesis) and/or modern stream processing systems (e.g., Spark, Flink)
- Hands-on experience with container orchestration frameworks (e.g., Kubernetes, EKS, ECS)
- Experience shipping services (products) following the CI/CD development paradigm
- Strong understanding of architectural patterns of high-scale web applications (e.g., well-designed APIs, high-volume data pipelines, efficient algorithms)
- Ideally, domain expertise in the modern data stack, with experience developing cloud-based data solution components and architecture covering data ingestion, data processing, and data storage
- A track record of successfully delivering on large or complex projects, delivered iteratively
- Strong written and verbal technical communication skills to convey complex technical concepts effectively
- Comfortable asking questions and taking initiative to solve problems where it is often necessary to "draw the owl"

Location: This role will be remote and based in India (Karnataka, Maharashtra, New Delhi, Tamil Nadu, and Telangana).

What We Offer
There are many benefits to working at Twilio, including things like competitive pay, generous time-off, ample parental and wellness leave, healthcare, a retirement savings program, and much more. Offerings vary by location.

Twilio thinks big. Do you? We like to solve problems, take initiative, pitch in when needed, and are always up for trying new things. That's why we seek out colleagues who embody our values — something we call Twilio Magic. Additionally, we empower employees to build positive change in their communities by supporting their volunteering and donation efforts. So, if you're ready to unleash your full potential, do your best work, and be the best version of yourself, apply now! If this role isn't what you're looking for, please consider other open positions.

Twilio is proud to be an equal opportunity employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, reproductive health decisions, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, political views or activity, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Qualified applicants with arrest or conviction records will be considered for employment in accordance with the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act. Additionally, Twilio participates in the E-Verify program in certain locations, as required by law.
Posted 6 days ago
5.0 - 10.0 years
7 - 12 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Description:
ACCOUNTABILITIES:
- Designs, codes, tests, debugs, and documents software according to Dell's systems quality standards, policies, and procedures.
- Analyzes business needs and creates software solutions.
- Responsible for preparing design documentation.
- Prepares test data for unit, string, and parallel testing.
- Evaluates and recommends software and hardware solutions to meet user needs.
- Resolves customer issues with software solutions and responds to suggestions for improvements and enhancements.
- Works with business and development teams to clarify requirements to ensure testability.
- Drafts, revises, and maintains test plans, test cases, and automated test scripts.
- Executes test procedures according to software requirements specifications.
- Logs defects and makes recommendations to address them.
- Retests software corrections to ensure problems are resolved.
- Documents the evolution of testing procedures for future replication.
- May conduct performance and scalability testing.

RESPONSIBILITIES:
- Leads small- to moderate-budget projects; may perform in a project leadership role and/or supervise the activities of lower-level personnel.
- Provides resolutions to a diverse range of complex problems.
- Executes schedules, costs, and documentation to ensure assigned projects come to a successful conclusion.
- May assist in training, assigning, and checking the work of less experienced developers.
- Performs estimation efforts on projects and tracks progress.
- Drafts and revises test plans and scripts with consideration for end-to-end system flows.
- Executes test scripts according to application requirements documentation.
- Logs defects, identifies courses of action, and performs preliminary root cause analysis.
- Analyzes and communicates test results to the project team.

Skills: Python, PySpark, and SQL
- 5 years of experience in Spark, Scala, and PySpark for big data processing
- Proficiency in Python programming for data manipulation and analysis
- Experience with Python libraries such as Pandas and NumPy
- Knowledge of Spark architecture and components (RDDs, DataFrames, Spark SQL)
- Strong knowledge of SQL for querying databases
- Experience with database systems like Lakehouse, PostgreSQL, Teradata, and SQL Server
- Ability to write complex SQL queries for data extraction and transformation
- Strong analytical skills to interpret data and provide insights
- Ability to troubleshoot and resolve data-related issues
- Strong problem-solving skills to address data-related challenges
- Effective communication skills to collaborate with cross-functional teams

Role/Responsibilities:
- Work on development activities along with lead activities
- Coordinate with the Product Manager (PdM) and Development Architect (Dev Architect) and handle deliverables independently
- Collaborate with other teams to understand data requirements and deliver solutions
- Design, develop, and maintain scalable data pipelines using Python and PySpark
- Utilize PySpark and Spark scripting for data processing and analysis
- Implement ETL (Extract, Transform, Load) processes to ensure data is accurately processed and stored
- Develop and maintain Power BI reports and dashboards
- Optimize data pipelines for performance and reliability
- Integrate data from various sources into centralized data repositories
- Ensure data quality and consistency across different data sets
- Analyze large data sets to identify trends, patterns, and insights
- Optimize PySpark applications for better performance and scalability
- Continuously improve data processing workflows and infrastructure
Posted 6 days ago
4.0 - 8.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Position Overview
The Evernorth Core Platform Engineering team is looking for a Software Engineering Senior Advisor to design, develop, and implement robust data pipelines for Packaged Business Capabilities (PBCs). In this role, you will act as a bridge between the software development team and the business, translating complex business requirements into actionable software solutions. You'll leverage your technical expertise and analytical skills to ensure that the software we build meets the needs of our users and the business.

Job Description & Responsibilities:
- Analyze business needs and translate them into technical requirements.
- Design, develop, and implement software solutions using microservices architecture (AWS Lambda, Kubernetes).
- Write clean, efficient, and maintainable code using TypeScript and Golang.
- Collaborate with cross-functional teams to ensure successful project delivery.
- Stay up to date on the latest software development trends and technologies.

Qualifications
Required Skills:
- 11-13 years of experience in software engineering or a related field
- Experience working with cloud platforms, preferably AWS
- Proficiency in programming languages and frameworks such as Scala, Spark, and PySpark
- Hands-on experience in building big data pipelines
- Familiarity with OpenSearch/Elasticsearch
- Expertise with Databricks
- Excellent analytical and problem-solving skills
- Strong communication and collaboration skills

Required Experience & Education:
- A minimum of 4 years of experience in backend engineering
- Excellent communication and collaboration skills

Desired Experience:
- Exposure to AWS and Databricks

Location & Hours of Work
Full-time position, working 40 hours per week, with expected overlap with US hours as appropriate. Primarily based in the Innovation Hub in Hyderabad, India, with flexibility to work remotely as required.
Posted 6 days ago
5.0 - 8.0 years
7 - 10 Lacs
Mumbai, Nagpur, Thane
Work from Office
First HM needs strong Java candidates (5-8 years' experience) who can pick up Scala on the job. Need profiles ASAP.
Requisition ID: YTR | P tracker ID: YTR | HM: Feng Chen | Location: Bangalore | Skill: Strong Java / Strong Java + Scala | Level: 3
Second HM needs SQL/Python resources as described below. This requirement has been filled.
Requisition ID: YTR | P tracker ID: YTR | HM: Akshay Deodhar | Location: Bangalore | Skill: RDBMS, Python | Level: 2
The HM has updated the job description. Please note you can submit two different skill-set profiles; finding all of these skills in one profile would be extraordinary. If not, look for one Java + Scala (or strong Java) developer profile and a separate SQL/Python developer profile.
Sharing the JD:
- Exposure to an RDBMS platform (writing SQL, stored procedures, data warehousing concepts, etc.)
- Hands-on Python
- Good to have: big data exposure (Spark and Hadoop concepts)
- Good to have: Azure cloud exposure (Databricks or Snowflake)
- Overall job experience of 3-6 years is fine
Posted 6 days ago
5.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
What You'll Do
- Design, develop, and operate high-scale applications across the full engineering stack.
- Design, develop, test, deploy, maintain, and improve software.
- Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
- Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset.
- Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
- Participate in a tight-knit, globally distributed engineering team.
- Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality.
- Research, create, and develop software applications to extend and improve Equifax solutions.
- Manage your own project priorities, deadlines, and deliverables.
- Collaborate on scalability issues involving access to data and information.
- Actively participate in Sprint planning, Sprint retrospectives, and other team activities.

What Experience You Need
- Bachelor's degree or equivalent experience
- 5+ years of software engineering experience
- 5+ years of experience writing, debugging, and troubleshooting code in mainstream Java, Spring Boot, TypeScript/JavaScript, HTML, and CSS
- 5+ years of experience with cloud technology: GCP, AWS, or Azure
- 5+ years of experience designing and developing cloud-native solutions
- 5+ years of experience designing and developing microservices using Java, the Spring Framework, GCP SDKs, and GKE/Kubernetes
- 5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs
- Big data technologies: Spark/Scala/Hadoop

What could set you apart
- Experience designing and developing big data processing solutions using Dataproc, Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others
- Cloud certification, especially in GCP
- A self-starter who identifies and responds to priority shifts with minimal supervision
- Excellent leadership and motivational skills
- An inquisitive and innovative mindset, with a proven ability to recognize opportunities to create distinctive value
- The ability to evaluate workload to drive efficiency
Posted 6 days ago
4.0 - 9.0 years
10 - 15 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
About Us:
KPI Partners is a leading provider of data analytics and performance management solutions, dedicated to helping organizations harness the power of their data to drive business success. Our team of experts is at the forefront of the data revolution, delivering innovative solutions to our clients. We are currently seeking a talented and experienced Senior Developer / Lead Data Engineer with expertise in Incorta to join our dynamic team.

Job Description:
As a Senior Developer / Lead Data Engineer at KPI Partners, you will play a critical role in designing, developing, and implementing data solutions using Incorta. You will work closely with cross-functional teams to understand data requirements, build and optimize data pipelines, and ensure that our data integration processes are efficient and effective. This position requires strong analytical skills, proficiency in Incorta, and a passion for leveraging data to drive business insights.

Key Responsibilities:
- Design and develop scalable data integration solutions using Incorta.
- Collaborate with business stakeholders to gather data requirements and translate them into technical specifications.
- Create and optimize data pipelines to ensure high data quality and availability.
- Perform data modeling, ETL processes, and data engineering activities to support analytics initiatives.
- Troubleshoot and resolve data-related issues across various systems and environments.
- Mentor and guide junior developers and data engineers, fostering a culture of learning and collaboration.
- Stay updated on industry trends, best practices, and emerging technologies related to data engineering and analytics.
- Work with the implementation team to ensure smooth deployment of solutions and provide ongoing support.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field.
- 5+ years of experience in data engineering or related roles with a strong focus on Incorta.
- Expertise in Incorta and its features, along with experience in data modeling and ETL processes.
- Proficiency in SQL and experience with relational databases (e.g., MySQL, Oracle, SQL Server).
- Strong analytical and problem-solving skills, with the ability to work with complex data sets.
- Excellent communication and collaboration skills to work effectively in a team-oriented environment.
- Familiarity with cloud platforms (e.g., AWS, Azure) and data visualization tools is a plus.
- Experience with programming languages such as Python, Java, or Scala is advantageous.

Why Join KPI Partners?
- Opportunity to work with a talented and passionate team in a fast-paced environment.
- Competitive salary and benefits package.
- Continuous learning and professional development opportunities.
- A collaborative and inclusive workplace culture that values diversity and innovation.

KPI Partners is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. Join us at KPI Partners and help us unlock the power of data for our clients!
Posted 6 days ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Description
Amazon Music is awash in data! To help make sense of it all, the DISCO (Data, Insights, Science & Optimization) team:
(i) enables the Consumer Product Tech org to make data-driven decisions that improve customer retention, engagement, and experience on Amazon Music. We build and maintain automated self-service data solutions and data science models, and deep-dive difficult questions that provide actionable insights. We also enable measurement, personalization, and experimentation by operating key data programs ranging from attribution pipelines and northstar weblab metrics to causal frameworks.
(ii) delivers exceptional Analytics & Science infrastructure for DISCO teams, fostering a data-driven approach to insights and decision making. As platform builders, we are committed to constructing flexible, reliable, and scalable solutions to empower our customers.
(iii) accelerates and facilitates content analytics and provides independence to generate valuable insights in a fast, agile, and accurate way. This domain provides analytical support for the following topics within Amazon Music: Programming / Label Relations / PR / Stations / Livesports / Originals / Case & CAM.

The DISCO team enables repeatable, easy, in-depth analysis of music customer behaviors. We reduce the cost in time and effort of analysis, data set building, model building, and user segmentation. Our goal is to empower all teams at Amazon Music to make data-driven decisions and effectively measure their results by providing high-quality, high-availability data and democratized data access through self-service tools.

If you love the challenges that come with big data, then this role is for you. We collect billions of events a day, manage petabyte-scale data on Redshift and S3, and develop data pipelines using Spark/Scala on EMR, SQL-based ETL, Airflow, and Java services.

We are looking for a talented, enthusiastic, and detail-oriented Data Engineer who knows how to take on big data challenges in an agile way. Duties include big data design and analysis, data modeling, and the development, deployment, and operation of big data pipelines. You'll help build Amazon Music's most important data pipelines and data sets, and expand self-service data knowledge and capabilities through an Amazon Music data university.

The DISCO team develops data specifically for a set of key business domains like personalization and marketing, and provides and protects a robust self-service core data experience for all internal customers. We deal in AWS technologies like Redshift, S3, EMR, EC2, DynamoDB, Kinesis Firehose, and Lambda. Your team will manage the data exchange store (Data Lake) and the EMR/Spark processing layer, using Airflow as orchestrator. You'll build our data university and partner with Product, Marketing, BI, and ML teams to build new behavioural events, pipelines, datasets, models, and reporting to support their initiatives. You'll also continue to develop big data pipelines.

Key job responsibilities
- Bring a deep understanding of data, analytical techniques, and how to connect insights to the business, along with practical experience insisting on the highest standards for operations in ETL and big data pipelines. With our Amazon Music Unlimited and Prime Music services, and our top music provider spot on the Alexa platform, providing high-quality, high-availability data to our internal customers is critical to our customer experiences.
- Assist the DISCO team with management of our existing environment, which consists of Redshift and SQL-based pipelines. The activities around these systems are well defined via standard operating procedures (SOPs) and typically involve approving data access requests and subscribing or adding new data to the environment.
- Manage SQL data pipelines (creating or updating existing pipelines).
- Perform maintenance tasks on the Redshift cluster.
- Assist the team with the management of our next-generation AWS infrastructure. Tasks include infrastructure monitoring via CloudWatch alarms, infrastructure maintenance through code changes or enhancements, and troubleshooting/root-cause analysis of infrastructure issues that arise; in some cases this resource may also be asked to submit code changes based on infrastructure issues that arise.

About The Team
Amazon Music is an immersive audio entertainment service that deepens connections between fans, artists, and creators. From personalized music playlists to exclusive podcasts, concert livestreams to artist merch, we are innovating at some of the most exciting intersections of music and culture. We offer experiences that serve all listeners with our different tiers of service: Prime members get access to all music in shuffle mode and top ad-free podcasts, included with their membership; customers can upgrade to Music Unlimited for unlimited on-demand access to 100 million songs, including millions in HD, Ultra HD, and spatial audio; and anyone can listen for free by downloading the Amazon Music app or via Alexa-enabled devices. Join us for the opportunity to influence how Amazon Music engages fans, artists, and creators on a global scale.

Basic Qualifications
- 2+ years of data engineering experience
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with SQL
- Experience with one or more scripting languages (e.g., Python, KornShell)
- Experience in Unix
- Experience troubleshooting issues related to data and infrastructure

Preferred Qualifications
- Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
- Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, or DataStage
- Knowledge of distributed systems as it pertains to data storage and computing
- Experience building or administering reporting/analytics platforms

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI MAA 15 SEZ
Job ID: A2838395
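For candidates preparing for this kind of role, here is a minimal sketch of the style of Spark/Scala batch aggregation such a pipeline might run on EMR. The bucket paths, event schema, and column names are illustrative assumptions, not Amazon's actual data model:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object DailyPlayAggregation {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("daily-play-aggregation")
      .getOrCreate()

    // Read raw listening events from S3 (path and schema are hypothetical)
    val events = spark.read.parquet("s3://example-bucket/music-events/date=2024-01-01/")

    // Aggregate play counts per customer and content id for the day
    val dailyPlays = events
      .filter(col("event_type") === "PLAY")
      .groupBy(col("customer_id"), col("content_id"))
      .agg(count("*").as("play_count"))

    // Write the aggregate back to S3 as Parquet for downstream loads (e.g., Redshift)
    dailyPlays.write
      .mode("overwrite")
      .parquet("s3://example-bucket/aggregates/daily_plays/date=2024-01-01/")

    spark.stop()
  }
}
```

In a production setup the date partition would typically be parameterized and the job scheduled by an orchestrator such as Airflow, as the posting describes.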
Posted 6 days ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Description
The Digital and Emerging Markets Payments team is responsible for launching new payment experiences for digital businesses worldwide and for the retail business in emerging markets. We are a growing team adding new charters relevant to payment-related customer experience for our customers in emerging markets. As part of this growth, we are hiring a Software Development Engineer II to contribute to the implementation of new payment methods and services that support international business regulations.

In this position, you will contribute to the success of an international team that manages complex workflows, collaborates with internal and external partners, implements scalable large-scale solutions, uses all the flavors of the JVM (Kotlin, Scala, Java), and leverages NAWS components to delight customers in emerging marketplaces.

Key job responsibilities
- Solve complex architecture and business problems, innovating to solve unique problems in simple yet elegant ways with extensible solutions.
- Own the architecture of several components of the consumer payments tech stack.
- Continuously work on improving current limitations and compatibilities between subsystems, and on the development of major routines and utilities.
- Design and build features with a strong mindset toward performance.
- Prepare technical requirements and software design specifications.
- Instill best practices for software development and documentation, make sure designs meet requirements, and deliver high-quality software on tight schedules.
- Take ownership for ensuring sanity of architecture, operational excellence, and quality, insisting on the highest standards while working with other software teams.
- Own the delivery of an integral piece of a system or application.
- Write high-quality code that is modular, functional, and testable; establish the best coding practices.
- Communicate, collaborate, and work effectively in a global environment, unafraid to think out of the box.
- Assist directly and indirectly in the continual hiring and development of technical talent.

Basic Qualifications
- 3+ years of non-internship professional software development experience
- 2+ years of non-internship design or architecture (design patterns, reliability, and scaling) experience with new and existing systems
- 3+ years of Video Games Industry (supporting title Development, Release, or Live Ops) experience
- Experience programming with at least one software programming language

Preferred Qualifications
- 3+ years of full software development life cycle experience, including coding standards, code reviews, source control management, build processes, testing, and operations
- Bachelor's degree in computer science or equivalent

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI - Tamil Nadu - A83
Job ID: A2969989
Posted 6 days ago
10.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Cloud Architect

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Healthcare, Retail, Manufacturing and Auto, Supply Chain, and Finance.

The opportunity
We’re looking for Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven delivery capability in both delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
- Drive Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM, and Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data. [10-15 years]
- Work with clients to convert business problems/challenges into technical solutions, considering security, performance, scalability, etc.
- Understand current and future state enterprise architecture.
- Contribute to various technical streams during project implementation.
- Provide product- and design-level technical best practices.
- Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop, and deliver technology solutions.
- Define and develop client-specific best practices around data management within a Hadoop or cloud environment.
- Recommend design alternatives for data ingestion, processing, and provisioning layers.
- Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark.
- Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies (see the sketch after this posting for a concrete illustration).

Skills And Attributes For Success
- Experience architecting highly scalable solutions on Azure, AWS, and GCP.
- Strong understanding of and familiarity with the Azure/AWS/GCP and big data ecosystem components.
- Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms.
- Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming.
- Hands-on experience with major components such as cloud ETLs, Spark, and Databricks.
- Experience working with NoSQL in at least one of the data stores: HBase, Cassandra, MongoDB.
- Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions.
- Solid understanding of ETL methodologies in a multi-tiered stack, integrating with big data systems like Cloudera and Databricks.
- Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms.
- Good knowledge of Apache Kafka and Apache Flume.
- Experience with enterprise-grade solution implementations.
- Experience in performance benchmarking of enterprise applications.
- Experience in data security (in motion and at rest).
- Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have
- A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
- Excellent communication skills (written and verbal, formal and informal).
- The ability to multi-task under pressure and work independently with minimal supervision.
- A team-player attitude and enjoyment of a cooperative, collaborative team environment.
- Adaptability to new technologies and standards.
- Participation in all aspects of the big data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.
- Responsibility for evaluating technical risks and mapping out mitigation strategies.
- Working knowledge of at least one cloud platform: AWS, Azure, or GCP.
- Excellent business communication, consulting, and quality process skills.
- Excellence in leading solution architecture, design, build, and execution for leading clients in the Banking, Wealth & Asset Management, or Insurance domains.
- A minimum of 7 years of hands-on experience in one or more of the above areas.
- A minimum of 10 years of industry experience.

Ideally, you’ll also have
- Strong project management skills
- Client management skills
- Solutioning skills

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching, and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
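As a concrete illustration of the real-time ingestion responsibility above, here is a minimal Scala sketch using Spark Structured Streaming (the newer API that has largely superseded DStream-based Spark Streaming) to consume from Kafka and land raw records in a data lake. Broker addresses, the topic name, and the paths are placeholder assumptions, and the job requires the spark-sql-kafka connector on the classpath:

```scala
import org.apache.spark.sql.SparkSession

object KafkaIngestion {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("kafka-ingestion")
      .getOrCreate()

    // Subscribe to a Kafka topic; broker and topic names are placeholders
    val stream = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker-1:9092")
      .option("subscribe", "live-events")
      .load()

    // Kafka delivers key/value as binary; cast the payload to string for downstream parsing
    val payload = stream.selectExpr("CAST(value AS STRING) AS json_payload")

    // Land the raw records in the data lake; checkpointing gives fault tolerance on restart
    val query = payload.writeStream
      .format("parquet")
      .option("path", "/data/lake/raw/live_events")
      .option("checkpointLocation", "/data/lake/checkpoints/live_events")
      .start()

    query.awaitTermination()
  }
}
```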
Posted 6 days ago
4.0 - 9.0 years
15 - 30 Lacs
Bangalore/Bengaluru, Mumbai (All Areas)
Hybrid
Job Title: Java Backend & Java Full Stack Developer
Location: Mumbai, Bangalore
Experience Range: 3 to 15 Years
Notice Period: Immediate to 30 Days

About the Project:
We are looking for an experienced, innovative, and highly motivated Web Developer to help design and develop the next generation of Technology & Operations Management Systems applications. Our platform supports the Technology, Operations, and Finance divisions, enabling them to operate efficiently and manage over $4 billion in annual technology spend. Success in this position requires a solid foundation across the complete spectrum of web development and best practices, coupled with strong interpersonal skills.

Key Responsibilities:
- Contribute to large-scale strategic planning and development.
- Design, code, test, debug, and document projects related to the technology domain, including upgrades and deployments.
- Review and resolve moderately complex technical challenges requiring an in-depth evaluation of technologies and procedures.
- Lead a team to meet existing and potential client needs, leveraging a solid understanding of function, policies, procedures, and compliance requirements.
- Collaborate with peers, colleagues, and mid-level managers to resolve technical challenges and achieve business goals.
- Provide guidance and direction to less experienced staff, acting as an escalation point.
- Design robust and scalable solutions to support enterprise applications.
- Develop APIs using REST, MQ, Kafka, and other standard channels.

Must-Have Skills:
- Strong experience with the Java and J2EE platforms and frameworks: Java 8, Spring Boot, Spring Framework, REST, web services, Tomcat, JBoss.
- Experience in Java 8 or Java 11.
- For the full stack role: 2+ years of experience with ReactJS/Angular 8+.
- Experience with NoSQL or relational databases (e.g., MongoDB, PostgreSQL, DB2, Sybase).
- In-depth knowledge of design patterns.
- Strong understanding of Agile methodologies and Test-Driven Development (TDD) with a track record of high-quality deliverables.
- Solid understanding of testing technologies, both manual and automated.

Good-to-Have Skills:
- Familiarity with Unix scripting, performance monitoring, and load-testing tools.
- Knowledge of Kafka, Hadoop, and Scala.
- Experience with frontend technologies like Angular 8 and above.
Posted 6 days ago
10.0 years
0 Lacs
India
Remote
Staff Software Engineer - QE
Location - India, Remote

Sumo Logic is a cloud-native SaaS data analytics platform that solves complex observability and security problems. Customers choose our product because it allows them to easily monitor, optimize, and secure their applications, systems, and infrastructures. Our microservices architecture, hosted on AWS, ingests petabytes of data daily across many geographic regions. Millions of queries a day analyze hundreds of petabytes of data.

What can you expect to do?
You will own creating and executing test plans and developing test strategies for critical system components, and you will be responsible for analyzing test coverage, creating test cases, and coordinating review and feedback from the cross-functional team. You will help bridge the gap between development and quality assurance. You will address complex problems with innovative solutions, iterate on designs, and mentor team members to promote technical growth and excellence. Our system is a highly distributed, fault-tolerant, multi-tenant platform that includes bleeding-edge components related to storage, messaging, search, and analytics. This system ingests and analyzes terabytes of data a day while making petabytes of data available for search and forensic analysis, and it is expected to reach a substantially larger scale in the near future.

Role And Responsibilities
- Collaborate with cross-functional teams to understand project requirements and specifications.
- Develop and execute test cases, scripts, plans, and procedures (manual and automated) to ensure the highest-quality software delivery.
- Report project status, defects, and verification results, and escalate issues in a timely manner.
- Participate in design and specification reviews, providing valuable input from a testing perspective.
- Improve design specifications and write elegant code that meets the Sumo Logic standard.
- Solve complex problems by iterating, redesigning, and innovating systems.
- Guide and mentor junior team members, sharing knowledge and fostering technical growth within the team.
- Estimate and perform risk analysis on large features during sprint planning meetings.
- Continuously improve testing processes by staying updated on industry best practices and new technologies.
- Promote the adoption of innovative tools and techniques within the team.
- Communicate effectively with development and product teams to resolve issues and ensure timely delivery of high-quality software.

Requirements
- Bachelor's or Master's degree in Computer Science or a related field.
- 10+ years of testing experience.
- A robust grasp of the software development life cycle and testing methodologies.
- Hands-on experience with enterprise-grade SaaS products.
- Strong problem-solving skills and a proven track record of solving complex technical challenges.
- Familiarity with Continuous Integration/Continuous Deployment is a valuable addition.
- Proficiency in object-oriented languages such as Java, Python, Scala, or Go.
- Ability to work effectively with both Unix and Windows operating systems.
- A proactive "break it" mentality toward testing.
- Familiarity with popular testing tools like TestRail, Jira, Postman, JMeter, Selenium, etc.
- Enthusiasm for staying updated on cutting-edge technologies, solving complex problems, and embracing challenges.
- The ability to comprehend the Sumo Logic backend architecture and communicate with clarity and precision, both verbally and in writing.

Desirable
- Hands-on experience testing large-scale systems, ideally including big data and/or 24x7 commercial services.
- Proficiency and comfort working with Unix, including Linux and OS X.
- Experience in Agile software development, including test-driven development and iterative and incremental development methodologies, is a plus.

About Us
Sumo Logic, Inc. empowers the people who power modern, digital business. Sumo Logic enables customers to deliver reliable and secure cloud-native applications through its Sumo Logic SaaS Analytics Log Platform, which helps practitioners and developers ensure application reliability, secure and protect against modern security threats, and gain insights into their cloud infrastructures. Customers worldwide rely on Sumo Logic to get powerful real-time analytics and insights across observability and security solutions for their cloud-native applications. For more information, visit www.sumologic.com.

Sumo Logic Privacy Policy
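Since the role lists Scala among its preferred languages and emphasizes automated test development, here is a minimal ScalaTest sketch of the style of unit test such work involves. The component under test (a log-source name normalizer) is hypothetical, invented purely for illustration:

```scala
import org.scalatest.funsuite.AnyFunSuite

// Hypothetical component under test: a helper that normalizes log-source names
object SourceNameNormalizer {
  def normalize(raw: String): String =
    raw.trim.toLowerCase.replaceAll("\\s+", "-")
}

class SourceNameNormalizerSpec extends AnyFunSuite {
  test("collapses internal whitespace to single hyphens") {
    assert(SourceNameNormalizer.normalize("Prod  Web Logs") == "prod-web-logs")
  }

  test("trims surrounding whitespace before normalizing") {
    assert(SourceNameNormalizer.normalize("  Billing API ") == "billing-api")
  }
}
```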
Posted 6 days ago
5.0 years
0 Lacs
India
On-site
Currently we have an open position with our client, an IT consulting firm: Principal Databricks Engineer/Architect.

Key Responsibilities:
1. Databricks Solution Architecture: Design and implement scalable, secure, and efficient Databricks solutions that meet client requirements.
2. Data Engineering: Develop data pipelines, architect data lakes, and implement data warehousing solutions using Databricks.
3. Data Analytics: Collaborate with data scientists and analysts to develop and deploy machine learning models and analytics solutions on Databricks.
4. Performance Optimization: Optimize Databricks cluster performance, ensuring efficient resource utilization and cost-effectiveness.
5. Security and Governance: Implement Databricks security features, ensure data governance, and maintain compliance with industry regulations.
6. Client Engagement: Work closely with clients to understand their business requirements, provide technical guidance, and deliver high-quality Databricks solutions.
7. Thought Leadership: Stay up to date with the latest Databricks features, best practices, and industry trends, and share knowledge with the team.

Requirements:
1. Databricks Experience: 5+ years of experience working with Databricks, including platform architecture, data engineering, and data analytics.
2. Technical Skills: Proficiency in languages such as Python, Scala, or Java, and experience with Databricks APIs, Spark, and Delta Lake.
3. Data Engineering: Strong background in data engineering, including data warehousing, ETL, and data governance.
4. Leadership: Proven experience leading technical teams, mentoring junior engineers, and driving technical initiatives.
5. Communication: Excellent communication and interpersonal skills, with the ability to work effectively with clients and internal stakeholders.

Good to Have:
1. Certifications: Databricks Certified Professional or similar certifications.
2. Cloud Experience: Experience working with cloud platforms such as AWS, Azure, or GCP.
3. Machine Learning: Knowledge of machine learning concepts and experience with popular ML libraries.
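To make the Spark and Delta Lake requirement concrete, here is a minimal Scala sketch of an ingest job that writes a Delta table, as one might run on a Databricks cluster (where the Delta libraries are preinstalled). The mount paths, table name, and CSV layout are illustrative assumptions:

```scala
import org.apache.spark.sql.SparkSession

object DeltaIngest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("delta-ingest")
      .getOrCreate()

    // Read a raw CSV drop; the path and schema inference are illustrative
    val orders = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("/mnt/raw/orders/")

    // Write to a Delta table; Delta adds ACID transactions and time travel on top of Parquet
    orders.write
      .format("delta")
      .mode("append")
      .save("/mnt/curated/orders")

    // Registering the path as a table makes it queryable from SQL and BI tools
    spark.sql("CREATE TABLE IF NOT EXISTS orders USING DELTA LOCATION '/mnt/curated/orders'")
  }
}
```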
Posted 6 days ago
10.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Cloud Architect

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Healthcare, Retail, Manufacturing and Auto, Supply Chain, and Finance.

The opportunity
We’re looking for Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven delivery capability in both delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
- Drive Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM, and Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data. [10-15 years]
- Work with clients to convert business problems/challenges into technical solutions, considering security, performance, scalability, etc.
- Understand current and future state enterprise architecture.
- Contribute to various technical streams during project implementation.
- Provide product- and design-level technical best practices.
- Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop, and deliver technology solutions.
- Define and develop client-specific best practices around data management within a Hadoop or cloud environment.
- Recommend design alternatives for data ingestion, processing, and provisioning layers.
- Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark.
- Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies.

Skills And Attributes For Success
- Experience architecting highly scalable solutions on Azure, AWS, and GCP.
- Strong understanding of and familiarity with the Azure/AWS/GCP and big data ecosystem components.
- Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms.
- Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming.
- Hands-on experience with major components such as cloud ETLs, Spark, and Databricks.
- Experience working with NoSQL in at least one of the data stores: HBase, Cassandra, MongoDB.
- Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions.
- Solid understanding of ETL methodologies in a multi-tiered stack, integrating with big data systems like Cloudera and Databricks.
- Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms.
- Good knowledge of Apache Kafka and Apache Flume.
- Experience with enterprise-grade solution implementations.
- Experience in performance benchmarking of enterprise applications.
- Experience in data security (in motion and at rest).
- Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have
- A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
- Excellent communication skills (written and verbal, formal and informal).
- The ability to multi-task under pressure and work independently with minimal supervision.
- A team-player attitude and enjoyment of a cooperative, collaborative team environment.
- Adaptability to new technologies and standards.
- Participation in all aspects of the big data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.
- Responsibility for evaluating technical risks and mapping out mitigation strategies.
- Working knowledge of at least one cloud platform: AWS, Azure, or GCP.
- Excellent business communication, consulting, and quality process skills.
- Excellence in leading solution architecture, design, build, and execution for leading clients in the Banking, Wealth & Asset Management, or Insurance domains.
- A minimum of 7 years of hands-on experience in one or more of the above areas.
- A minimum of 10 years of industry experience.

Ideally, you’ll also have
- Strong project management skills
- Client management skills
- Solutioning skills

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching, and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 6 days ago
Scala is a popular programming language that is widely used in India, especially in the tech industry. Job seekers looking for opportunities in Scala can find a variety of roles across different cities in the country. In this article, we will dive into the Scala job market in India and provide valuable insights for job seekers.
Major tech hubs such as Bengaluru, Pune, Hyderabad, Mumbai, and Chennai are known for their thriving tech ecosystems and have a high demand for Scala professionals.
The salary range for Scala professionals in India varies based on experience levels. Entry-level Scala developers can expect to earn around INR 6-8 lakhs per annum, while experienced professionals with 5+ years of experience can earn upwards of INR 15 lakhs per annum.
In the Scala job market, a typical career path may look like:
- Junior Developer
- Scala Developer
- Senior Developer
- Tech Lead
As professionals gain more experience and expertise in Scala, they can progress to higher roles with increased responsibilities.
In addition to Scala expertise, employers often look for candidates with the following skills:
- Java
- Spark
- Akka
- Play Framework
- Functional programming concepts
Having a good understanding of these related skills can enhance a candidate's profile and increase their chances of landing a Scala job.
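As a quick illustration of the functional programming concepts interviewers tend to probe, here is a small, self-contained Scala example covering algebraic data types, pattern matching, higher-order functions, and Option. The payment domain types are invented for the example:

```scala
object FunctionalBasics {
  // Algebraic data type modeled with a sealed trait and case classes
  sealed trait Payment
  case class Card(last4: String) extends Payment
  case class Upi(vpa: String) extends Payment

  // Pattern matching gives exhaustive, type-safe branching
  def describe(p: Payment): String = p match {
    case Card(last4) => s"card ending $last4"
    case Upi(vpa)    => s"UPI handle $vpa"
  }

  // Higher-order functions and immutable collections instead of mutable loops
  def totalInPaise(amounts: List[Double]): Long =
    amounts.map(a => math.round(a * 100)).sum

  // Option makes absence explicit, avoiding null checks
  def firstLargePayment(amounts: List[Double]): Option[Double] =
    amounts.find(_ > 1000.0)

  def main(args: Array[String]): Unit = {
    println(describe(Card("4242")))                // card ending 4242
    println(totalInPaise(List(10.5, 20.25)))       // 3075
    println(firstLargePayment(List(50.0, 1500.0))) // Some(1500.0)
  }
}
```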
Here are 25 interview questions that you may encounter when applying for Scala roles:
As you explore Scala jobs in India, remember to showcase your expertise in Scala and related skills during interviews. Prepare well, stay confident, and you'll be on your way to a successful career in Scala. Good luck!