
3773 Scala Jobs - Page 40

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Role: Data Engineer (Scala)
Experience required: 5+ years overall, 3+ years relevant
Must-have skills: Spark, SQL, Scala, PySpark
Good to have: AWS, EMR, S3, Hadoop, Control-M

Key responsibilities:
1) Design strategies and programs to collect, store, analyse and visualize data from various sources.
2) Develop big data solution recommendations and ensure implementation of the chosen big data solution.
3) Program in several programming/scripting languages, such as Scala, Python, Java, Pig or SQL.
4) Bring proficient knowledge of big data frameworks such as Spark and MapReduce.
5) Bring an understanding of Hadoop, Hive, HBase, MongoDB and/or MapReduce.
6) Bring experience with a large cloud-computing infrastructure such as Amazon Web Services or Elastic MapReduce.
7) Tune the Spark engine for high-volume data processing (approximately a billion records) using BDM.
8) Troubleshoot data issues and perform deep root-cause analysis of any performance issue.
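For illustration only, a minimal Scala sketch of the kind of high-volume Spark job this listing describes: reading a large dataset, applying a business transformation, and tuning shuffle partitioning. The bucket paths, column names, and settings are hypothetical, not from the posting.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object HighVolumeJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("high-volume-etl")
      // Shuffle partition count is a common tuning knob for billion-row workloads.
      .config("spark.sql.shuffle.partitions", "800")
      .getOrCreate()

    // Hypothetical source; in practice this might be an S3/EMR-backed table.
    val events = spark.read.parquet("s3://example-bucket/events/")

    // Business transformation: daily aggregates per account.
    val daily = events
      .filter(col("status") === "COMPLETED")
      .groupBy(col("account_id"), to_date(col("event_ts")).as("event_date"))
      .agg(count("*").as("txn_count"), sum("amount").as("total_amount"))

    daily.write.mode("overwrite").partitionBy("event_date")
      .parquet("s3://example-bucket/daily_aggregates/")

    spark.stop()
  }
}
```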

Posted 1 week ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site


We are looking for energetic, high-performing and highly skilled Java + Big Data Engineers to help shape our technology and product roadmap. You will be part of the fast-paced, entrepreneurial Enterprise Personalization portfolio, focused on delivering the next generation of global marketing capabilities. This team builds the products that power Merchant Offers personalization for Amex card members.

Job Description:
- Demonstrated leadership in designing sustainable software products, setting development standards, automated code review processes, continuous builds and rigorous testing
- Ability to effectively lead and communicate across third parties, technical and business product managers on solution design
- Primary focus on writing code and API specs, conducting code reviews and testing in ongoing sprints, and building proofs of concept and automation tools
- Applies visualization and other techniques to fast-track concepts
- Functions as a core member of an Agile team, driving user-story analysis and elaboration, design and development of software applications, testing, and build-automation tools
- Works on a specific platform/product, or as part of a dynamic resource pool assigned to projects based on demand and business priority
- Identifies opportunities to adopt innovative technologies

Qualification:
- Bachelor's degree in computer science, computer engineering, another technical discipline, or equivalent work experience
- 5+ years of software development experience
- 3-5 years of experience leading teams of engineers
- Demonstrated experience with Agile or other rapid application development methods
- Demonstrated experience with object-oriented design and coding
- Demonstrated experience with these core technical skills (mandatory):
  - Core Java, Spring Framework, Java EE
  - Hadoop ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.)
  - Spark
  - Relational databases (Postgres / MySQL / DB2, etc.)
  - Data serialization techniques (Avro)
  - Cloud development (microservices)
  - Parallel and distributed (multi-tiered) systems
  - Application design, software development and automated testing
- Demonstrated experience with these additional technical skills (nice to have):
  - Unix / shell scripting
  - Python / Scala
  - Message queuing, stream processing (Kafka)
  - Elasticsearch
  - AJAX tools/frameworks
  - Web services, open API development, and REST concepts
  - Implementing integrated automated release management using tools/technologies/frameworks like Maven, Git, code/security review tools, Jenkins, automated testing and JUnit
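Illustrative only: a minimal Scala sketch of Avro data serialization, one of the core skills listed above, using the standard Avro Java API. The schema and field names are hypothetical, not from the posting.

```scala
import java.io.ByteArrayOutputStream
import org.apache.avro.Schema
import org.apache.avro.generic.{GenericData, GenericDatumWriter, GenericRecord}
import org.apache.avro.io.EncoderFactory

object AvroExample extends App {
  // Hypothetical schema for a personalization offer event.
  val schemaJson =
    """{"type":"record","name":"OfferEvent","fields":[
      |{"name":"memberId","type":"string"},
      |{"name":"offerId","type":"string"},
      |{"name":"amount","type":"double"}]}""".stripMargin
  val schema = new Schema.Parser().parse(schemaJson)

  // Build a record conforming to the schema.
  val record: GenericRecord = new GenericData.Record(schema)
  record.put("memberId", "m-123")
  record.put("offerId", "o-456")
  record.put("amount", 25.0)

  // Serialize the record to Avro binary bytes.
  val out = new ByteArrayOutputStream()
  val writer = new GenericDatumWriter[GenericRecord](schema)
  val encoder = EncoderFactory.get().binaryEncoder(out, null)
  writer.write(record, encoder)
  encoder.flush()
  println(s"Serialized ${out.toByteArray.length} bytes")
}
```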

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Role: Data Engineer (Scala)
Experience required: 5+ years overall, 3+ years relevant
Must-have skills: Spark, SQL, Scala, PySpark
Good to have: AWS, EMR, S3, Hadoop, Control-M

Key responsibilities:
1) Design strategies and programs to collect, store, analyse and visualize data from various sources.
2) Develop big data solution recommendations and ensure implementation of the chosen big data solution.
3) Program in several programming/scripting languages, such as Scala, Python, Java, Pig or SQL.
4) Bring proficient knowledge of big data frameworks such as Spark and MapReduce.
5) Bring an understanding of Hadoop, Hive, HBase, MongoDB and/or MapReduce.
6) Bring experience with a large cloud-computing infrastructure such as Amazon Web Services or Elastic MapReduce.
7) Tune the Spark engine for high-volume data processing (approximately a billion records) using BDM.
8) Troubleshoot data issues and perform deep root-cause analysis of any performance issue.

Posted 1 week ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Gurugram, Haryana, India; Bengaluru, Karnataka, India; Hyderabad, Telangana, India.

Minimum qualifications:
- Bachelor's degree in Computer Science, Mathematics, a related technical field, or equivalent practical experience.
- Experience building Machine Learning or Data Science solutions.
- Experience writing software in Python, Scala, R, or similar.
- Experience with data structures, algorithms, and software design.
- Ability to travel up to 30% of the time.

Preferred qualifications:
- Experience with recommendation engines, data pipelines, or distributed machine learning, as well as data analytics, data visualization techniques and software, and deep learning frameworks.
- Experience in software development, professional services, solution engineering, technical consulting, and architecting and rolling out new technology and solution initiatives.
- Experience with core Data Science techniques.
- Knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments.
- Knowledge of cloud computing, including virtualization, hosted services, multi-tenant cloud infrastructures, storage systems, and content delivery networks.
- Excellent customer-facing communication and listening skills.

About The Job
The Google Cloud Platform team helps customers transform and build what's next for their business, all with technology built in the cloud. Our products are developed for security, reliability and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers, developers, small and large businesses, educational institutions and government agencies see the benefits of our technology come to life. As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape the future of how businesses of all sizes use technology to connect with customers, employees and partners.

As a Cloud Engineer, you will play a key role in ensuring that customers have the best experience moving to the Google Cloud machine learning (ML) suite of products. You will design and implement machine learning solutions for customer use cases, leveraging core Google products. You will work with customers to identify opportunities to transform their business with machine learning, and will travel to customer sites to deploy solutions and deliver workshops designed to educate and empower customers to realize the full potential of Google Cloud. You will have access to Google's technology to monitor application performance, debug and troubleshoot product code, and address customer and partner needs. In this role, you will lead the timely execution of adapting Google Cloud Platform solutions to the customer's requirements.

Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities:
- Deliver effective big data and machine learning solutions and solve technical customer issues.
- Act as a technical advisor to Google's customers.
- Identify new product features and feature gaps, provide guidance on existing product issues, and collaborate with Product Managers and Engineers to influence the roadmap of Google Cloud Platform.
- Deliver best-practice recommendations, tutorials, blog articles, and technical presentations, adapting to different levels of key business and technical stakeholders.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 1 week ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Our Company
Changing the world through digital experiences is what Adobe's all about. We give everyone, from emerging artists to global brands, everything they need to design and deliver exceptional digital experiences. We're passionate about empowering people to create beautiful and powerful images, videos, and apps, and to transform how companies interact with customers across every screen. We're on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours!

What you will need to succeed:
- 4+ years in design and development of large-scale data-driven systems
- Work experience with open-source technologies such as Apache Spark, the Hadoop stack, Kafka, Druid, etc.
- Work experience with NoSQL (Cassandra/HBase/Aerospike) and RDBMS systems
- Great problem-solving, coding (in Java/Scala, etc.) and system design skills
- Proficiency in data structures and algorithms
- Cost consciousness around computation and memory requirements
- Strong verbal and written communication skills
- BTech/MTech/MS in Computer Science

What you'll do:
- Participate in technical design and implementation strategy for major systems and components of AdCloud
- Design, build and deploy products with very high quality
- Bring innovation to the current system for greater robustness, ease and convenience
- Articulate design and code choices to cross-functional teams
- Review and provide feedback on features, technology, architecture, design, time and budget estimates, and test strategies
- Collaborate with other teams to achieve common goals

Adobe is proud to be an Equal Employment Opportunity employer. We do not discriminate based on gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. Learn more. Adobe aims to make Adobe.com accessible to any and all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015.

Posted 1 week ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Who we are

About Stripe
Stripe is a financial infrastructure platform for businesses. Millions of companies, from the world's largest enterprises to the most ambitious startups, use Stripe to accept payments, grow their revenue, and accelerate new business opportunities. Our mission is to increase the GDP of the internet, and we have a staggering amount of work ahead. That means you have an unprecedented opportunity to put the global economy within everyone's reach while doing the most important work of your career.

About The Team
Our Revenue and Finance Automation (RFA) suite gives businesses power over the entire life cycle of their cash flow. By coordinating billing, tax, reporting, and data services in one modern stack, Stripe's revenue and finance automation suite eliminates the inefficiencies of legacy finance tools and supports revenue growth. The User Accounting Platform team is building an enterprise-grade accounting ledger for our users to facilitate recording of all financial activities on Stripe. The team focuses on building a high-volume, fully configurable ledger that enables core accounting functionality like revenue recognition and reconciliation, thereby enabling continuous accounting and timely book closure for Stripe merchants.

What you'll do
As a full stack engineer, you will design and build platforms and system solutions that are configurable and scalable around the globe. You will partner with many functions at Stripe, with the opportunity to work on financial platform systems as well as have direct user-facing business impact.

Responsibilities:
- Design, build, and maintain APIs, services, and systems across Stripe's engineering teams.
- Partner with the Revenue Recognition, Revenue Reporting and Reconciliation product teams to understand their unique requirements related to data and reporting, and provide tailored technical support accordingly.
- Work with engineers across the company to build new features at large scale.
- Maintain a collaborative environment, engaging in discussions and decision-making processes with stakeholders within various domains at Stripe.

Who you are
We're looking for someone who meets the minimum requirements to be considered for the role. If you meet these requirements, you are encouraged to apply. The preferred qualifications are a bonus, not a requirement.

Minimum requirements:
- 4+ years of experience in delivering, extending, and maintaining large-scale distributed systems.
- Love designing systems that are elegant abstractions over complex patterns/practices, especially in the financial industry.
- Hold yourself and others to a high bar when working with production systems.
- Take pride in working projects to successful completion involving a wide variety of technologies and systems.
- Think about systems, services, and platforms, and write high-quality code. We work mostly in Java, Scala, and Ruby. However, languages can be learned: we care much more about your general engineering skill than knowledge of a particular language or framework.
- Have great product taste and a track record of taking complex problems and solving them elegantly.
- Are capable of working in ambiguous, fast-moving environments and have the curiosity to learn the domain to a deep level.
- Enjoy working with a diverse group of people with different expertise.

Preferred qualifications:
- Familiarity with large-scale distributed systems.
- Experience working in high-growth teams similar to Stripe.

In-office expectations
Office-assigned Stripes in most of our locations are currently expected to spend at least 50% of the time in a given month in their local office or with users. This expectation may vary depending on role, team and location. For example, Stripes in Stripe Delivery Center roles in Mexico City, Mexico and Bengaluru, India work 100% from the office. Also, some teams have greater in-office attendance requirements, to appropriately support our users and workflows, which the hiring manager will discuss. This approach helps strike a balance between bringing people together for in-person collaboration and learning from each other, while supporting flexibility when possible.

Pay and benefits
Stripe does not yet include pay ranges in job postings in every country. Stripe strongly values pay transparency and is working toward pay transparency globally.

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Gurugram, Haryana, India; Bengaluru, Karnataka, India; Hyderabad, Telangana, India.

Minimum qualifications:
- Bachelor's degree in Computer Science, Mathematics, a related technical field, or equivalent practical experience.
- Experience building Machine Learning or Data Science solutions.
- Experience writing software in Python, Scala, R, or similar.
- Experience with data structures, algorithms, and software design.
- Ability to travel up to 30% of the time.

Preferred qualifications:
- Experience with recommendation engines, data pipelines, or distributed machine learning, as well as data analytics, data visualization techniques and software, and deep learning frameworks.
- Experience in software development, professional services, solution engineering, technical consulting, and architecting and rolling out new technology and solution initiatives.
- Experience with core Data Science techniques.
- Knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments.
- Knowledge of cloud computing, including virtualization, hosted services, multi-tenant cloud infrastructures, storage systems, and content delivery networks.
- Excellent customer-facing communication and listening skills.

About The Job
The Google Cloud Platform team helps customers transform and build what's next for their business, all with technology built in the cloud. Our products are developed for security, reliability and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers, developers, small and large businesses, educational institutions and government agencies see the benefits of our technology come to life. As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape the future of how businesses of all sizes use technology to connect with customers, employees and partners.

As a Cloud Engineer, you will play a key role in ensuring that customers have the best experience moving to the Google Cloud machine learning (ML) suite of products. You will design and implement machine learning solutions for customer use cases, leveraging core Google products. You will work with customers to identify opportunities to transform their business with machine learning, and will travel to customer sites to deploy solutions and deliver workshops designed to educate and empower customers to realize the full potential of Google Cloud. You will have access to Google's technology to monitor application performance, debug and troubleshoot product code, and address customer and partner needs. In this role, you will lead the timely execution of adapting Google Cloud Platform solutions to the customer's requirements.

Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities:
- Deliver effective big data and machine learning solutions and solve technical customer issues.
- Act as a technical advisor to Google's customers.
- Identify new product features and feature gaps, provide guidance on existing product issues, and collaborate with Product Managers and Engineers to influence the roadmap of Google Cloud Platform.
- Deliver best-practice recommendations, tutorials, blog articles, and technical presentations, adapting to different levels of key business and technical stakeholders.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 1 week ago

Apply

6.0 - 11.0 years

10 - 20 Lacs

Visakhapatnam, Hyderabad, Bengaluru

Work from Office


Should have working experience in Spark/Scala, AWS, and big data environments (Hadoop, Hive, Sqoop); Python scripting or Java programming is nice to have. Must be willing to relocate to Hyderabad.

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

Remote


Position: Scala Developer
Experience: 6-10 years
Location: Pune/Bangalore
Work mode: Hybrid (2 days WFO, 3 days WFH)

- Working experience with Akka or Pekko, and Python or Java; core Scala is mandatory
- Must have working experience on the backend: Scala
- Must have working experience with databases: NoSQL (Elasticsearch, Cosmos DB, Redis, MongoDB or any other NoSQL DB)
- Good to have experience with Microsoft Azure services (Azure Storage Account, Kubernetes, Bicep/ARM, Event Hubs, Event Grid, Service Bus), Terraform, Argo
- Must have experience with build and deployment: Kubernetes, containers/Docker
- Must have experience with tools: Azure DevOps, GitHub, Confluence
- Excellent verbal and written communication skills
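Illustrative only: a minimal typed-actor sketch in Scala using Apache Pekko (the open-source Akka fork named in this listing). The actor, messages, and values are hypothetical, not from the posting.

```scala
import org.apache.pekko.actor.typed.{ActorRef, ActorSystem, Behavior}
import org.apache.pekko.actor.typed.scaladsl.Behaviors

object OrderTracker {
  // Hypothetical message protocol for the actor.
  sealed trait Command
  final case class RecordOrder(id: String, amount: BigDecimal) extends Command
  final case class GetTotal(replyTo: ActorRef[BigDecimal]) extends Command

  def apply(total: BigDecimal = 0): Behavior[Command] =
    Behaviors.receiveMessage {
      case RecordOrder(_, amount) =>
        // Immutable state change: recurse with the new running total.
        apply(total + amount)
      case GetTotal(replyTo) =>
        replyTo ! total
        Behaviors.same
    }
}

object Main extends App {
  val system = ActorSystem(OrderTracker(), "order-tracker")
  system ! OrderTracker.RecordOrder("o-1", BigDecimal(49.99))
}
```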

Posted 1 week ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role and Responsibilities
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating source-to-target pipelines/workflows and implementing solutions that tackle clients' needs.

Your primary responsibilities include:
- Design, build, optimize and support new and existing data models and ETL processes based on our clients' business requirements.
- Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing, data-driven organization.
- Coordinate data access and security so that data scientists and analysts can easily access data whenever they need to.

Preferred Education
Master's Degree

Required Technical and Professional Expertise
- 5+ years of experience in big data: Hadoop, Spark, Scala, Python, HBase, Hive
- Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git
- Experience developing Python and PySpark programs for data analysis
- Good working experience using Python to develop custom frameworks for generating rules (like a rules engine)
- Experience developing Python code to gather data from HBase and designing solutions implemented with PySpark
- Experience using Apache Spark DataFrames/RDDs to apply business transformations and Hive context objects to perform read/write operations

Preferred Technical and Professional Experience
- Understanding of DevOps
- Experience building scalable end-to-end data ingestion and processing solutions
- Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala
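A minimal, illustrative Scala sketch of the Spark-with-Hive pattern the expertise list describes: DataFrame transformations with Hive read/write. The database, table, and column names are hypothetical, not from the posting.

```scala
import org.apache.spark.sql.SparkSession

object HiveTransform extends App {
  // Hive support lets Spark read and write tables in the Hive metastore.
  val spark = SparkSession.builder()
    .appName("hive-transform")
    .enableHiveSupport()
    .getOrCreate()

  import spark.implicits._
  import org.apache.spark.sql.functions._

  // Read a (hypothetical) Hive table and apply a business transformation...
  val customers = spark.table("analytics.customers")
  val active = customers
    .filter($"last_seen" >= date_sub(current_date(), 90))
    .withColumn("segment", when($"lifetime_value" > 1000, "premium").otherwise("standard"))

  // ...then write the result back as a managed Hive table.
  active.write.mode("overwrite").saveAsTable("analytics.active_customers")

  spark.stop()
}
```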

Posted 1 week ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site


Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Gurugram, Haryana, India; Bengaluru, Karnataka, India; Hyderabad, Telangana, India.

Minimum qualifications:
- Bachelor's degree in Computer Science, Mathematics, a related technical field, or equivalent practical experience.
- Experience building Machine Learning or Data Science solutions.
- Experience writing software in Python, Scala, R, or similar.
- Experience with data structures, algorithms, and software design.
- Ability to travel up to 30% of the time.

Preferred qualifications:
- Experience with recommendation engines, data pipelines, or distributed machine learning, as well as data analytics, data visualization techniques and software, and deep learning frameworks.
- Experience in software development, professional services, solution engineering, technical consulting, and architecting and rolling out new technology and solution initiatives.
- Experience with core Data Science techniques.
- Knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments.
- Knowledge of cloud computing, including virtualization, hosted services, multi-tenant cloud infrastructures, storage systems, and content delivery networks.
- Excellent customer-facing communication and listening skills.

About The Job
The Google Cloud Platform team helps customers transform and build what's next for their business, all with technology built in the cloud. Our products are developed for security, reliability and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers, developers, small and large businesses, educational institutions and government agencies see the benefits of our technology come to life. As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape the future of how businesses of all sizes use technology to connect with customers, employees and partners.

As a Cloud Engineer, you will play a key role in ensuring that customers have the best experience moving to the Google Cloud machine learning (ML) suite of products. You will design and implement machine learning solutions for customer use cases, leveraging core Google products. You will work with customers to identify opportunities to transform their business with machine learning, and will travel to customer sites to deploy solutions and deliver workshops designed to educate and empower customers to realize the full potential of Google Cloud. You will have access to Google's technology to monitor application performance, debug and troubleshoot product code, and address customer and partner needs. In this role, you will lead the timely execution of adapting Google Cloud Platform solutions to the customer's requirements.

Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities:
- Deliver effective big data and machine learning solutions and solve technical customer issues.
- Act as a technical advisor to Google's customers.
- Identify new product features and feature gaps, provide guidance on existing product issues, and collaborate with Product Managers and Engineers to influence the roadmap of Google Cloud Platform.
- Deliver best-practice recommendations, tutorials, blog articles, and technical presentations, adapting to different levels of key business and technical stakeholders.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


The Applications Development Senior Programmer Analyst is an intermediate-level position responsible for participation in the establishment and implementation of new or revised application systems and programs, in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.

Key Responsibilities:
- Design, develop, and optimize large-scale data pipelines and workflows using Big Data technologies such as Hadoop, Hive, Impala, Spark, and PySpark.
- Build and maintain data integration solutions to process structured and unstructured data from various sources.
- Implement and manage CI/CD pipelines to automate deployment and testing of data engineering solutions.
- Work with relational databases like Oracle to design and optimize data storage and retrieval.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Ensure data quality, security, and governance across all data engineering processes.
- Monitor and troubleshoot performance issues in data pipelines and systems.
- Stay updated on the latest trends and advancements in Big Data and data engineering technologies.

Required Skills and Qualifications:
- Proven experience in Big Data technologies: Hadoop, Hive, Impala, Spark, and PySpark.
- Strong programming skills in Python, Java, or Scala.
- Hands-on experience with CI/CD tools like Jenkins, Git, or similar.
- Proficiency in working with relational databases, especially Oracle.
- Solid understanding of data modeling, ETL processes, and data warehousing concepts.
- Experience with cloud platforms (e.g., AWS, Azure, or GCP) is a plus.
- Strong problem-solving skills and the ability to work in a fast-paced environment.
- Excellent communication and collaboration skills.

Education: Bachelor's degree/University degree or equivalent experience

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Applications Development
------------------------------------------------------
Time Type: Full time
------------------------------------------------------

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.

Posted 1 week ago

Apply

12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Description

Context: In the modern banking age, financial institutions need to bring classical data drivers and evolving business drivers together in a single platform. These drivers also need to communicate with each other and share data products for enterprise consumption. Traditional data platforms handle classical data drivers well but fail to communicate with evolving business drivers due to limitations of technologies and implementation approaches. A Modern Data Platform helps to fill this gap and take the business to the next level of growth and expansion using data-driven approaches. The technology transformation in recent years makes such implementations feasible.

Your Opportunity: You will be responsible for leading the Modern Data Platform Practice, which involves providing solutions to customers on traditional data warehouses and platforms, on-prem and in the cloud. This covers architecting data platforms, defining data engineering designs, and choosing appropriate technologies and tools across on-prem and cloud services. You will help the organization strengthen its Modern Data Platform capabilities, lead pre-sales discussions on data platforms, provide the technology architecture in RFP responses, and lead technology POCs/MVPs.

Your Qualifications: We expect you to have the following qualifications and experience to effectively perform the suggested role:
- A technology leader with an engineering academic background in Computer Science / Information Technology / Data Technologies [BE/BTech/MCA]
- Overall 12-16 years of data engineering and analytics experience, as an individual contributor as well as a technology/architecture lead
- A minimum of 5-7 years of hands-on experience in Big Data systems across on-prem and cloud environments
- Has led data platform architecture and design projects for mid-to-large-size firms
- Experience implementing batch data and streaming/online data integrations using third-party tools and custom programs
- Good hands-on experience with SQL and one of the programming languages: Core Java / Scala / Python
- Good hands-on experience with Kafka for enabling event-driven data pipelines and processing
- Knowledge of leading data services offered by AWS, Azure, Snowflake, Confluent
- Thorough understanding of distributed computing and related data structures
- Has implemented Data Governance and Quality capabilities for a data platform (on-prem and/or cloud)
- Good analytical and presentation skills
- Experience in building a team from the ground up
- Good exposure to leading RDBMS technologies and data visualization platforms
- Has demonstrated AI/ML models for data processing and generating insights for end users
- A great teammate with the ability to work on own initiative with minimal direction

Career Level - IC4

Responsibilities

Diversity and Inclusion: An Oracle career can span industries, roles, countries and cultures, giving you the opportunity to flourish in new roles and innovate, while blending work life in. Oracle has thrived through 40+ years of change by innovating and operating with integrity while delivering for the top companies in almost every industry. To nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, and a workforce that inspires thought leadership and innovation. Oracle offers a highly competitive suite of employee benefits designed on the principles of parity, consistency, and affordability. The overall package includes certain core elements such as medical and life insurance, access to retirement planning, and much more. We also encourage our employees to engage in the culture of giving back to the communities where we live and do business.

At Oracle, we believe that innovation starts with diversity and inclusion, and that to create the future we need talent from various backgrounds, perspectives, and abilities. We ensure that individuals with disabilities are provided reasonable accommodation to successfully participate in the job application and interview process, and to perform crucial job functions in potential roles. That's why we're committed to creating a workforce where all individuals can do their best work. It's when everyone's voice is heard and valued that we're inspired to go beyond what's been done before.

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote


When you join Verizon
You want more out of a career. A place to share your ideas freely, even if they're daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love: driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together, lifting our communities and building trust in how we show up, everywhere and always. Want in? Join the #VTeamLife.

What you'll be doing...
As a DMTS (AI Science) you will own and drive end-to-end solutions for Cognitive and Gen AI driven use cases:
- Designing and building scalable cognitive and generative AI solutions to meet the needs of a given business engagement.
- Providing technical thought leadership on model architecture, delivery, monitoring, measurement and model lifecycle best practices.
- Working in a collaborative environment with global teams to drive solutioning of business problems.
- Developing end-to-end analytical solutions and articulating insights to leadership.
- Providing data-driven recommendations to the business by clearly articulating complex modeling concepts through the generation and delivery of presentations.
- Analyzing and modeling both structured and unstructured data from a number of distributed client and publicly available sources.
- Assisting with the mentorship and development of junior members, and driving the team toward solutions.
- Assisting in growing the data science practice at Verizon by meeting business goals through client prospecting, responding to model POCs, identifying and closing opportunities within identified insights, writing white papers, exploring new tools and defining best practices.

What we're looking for...
You have strong ML/NLP/GenAI skills and are eager to work in a collaborative environment with global teams to apply NLP/GenAI to business problems. You work independently and are always willing to learn new technologies. You thrive in a dynamic environment and are able to interact with various stakeholders and cross-functional teams to implement data-science-driven business solutions. You take pride in your role as a data scientist and evangelist and enjoy adding to the systems, concepts and models that enrich the practice. You enjoy mentoring and empowering the team to expand their technical capabilities.

You'll need to have:
- Bachelor's degree or four or more years of work experience.
- Six or more years of work experience.
- Experience as a data scientist and thought leader implementing production use cases in Gen AI and Cognitive.
- Ten or more years of hands-on experience implementing large-scale NLP projects, and fine-tuning and evaluating LLMs for downstream tasks such as text generation, classification, summarization, question answering, entity extraction, etc.
- Working knowledge of agentic AI frameworks like LangChain, LangGraph, CrewAI, etc.
- Ability to guide the team to correctly analyze cognitive insights and leverage unstructured conversational data to create transformative, intelligent, context-aware and adaptive AI systems.
- Experience in machine learning and deep learning model development and deployment from scratch in Python.
- Working knowledge of NLP frameworks and libraries like NLTK, spaCy, Transformers, PyTorch, TensorFlow and Hugging Face APIs.
- Working knowledge of various supervised and unsupervised ML algorithms, including data preprocessing techniques and their impact on an algorithm's accuracy, precision and recall.
- Knowledge and implementation experience of deep learning: Convolutional Neural Nets (CNN), Recurrent Neural Nets (RNN), Long Short-Term Memory (LSTM), Generative Adversarial Networks (GAN) and Deep Reinforcement Learning.
- Experience with RESTful, JSON API services.
- Working knowledge of word embeddings, TF-IDF, tokenization, n-grams, stemmers, lemmatization, part-of-speech tagging, entity resolution, ontology, lexicology, phonetics, intents, entities, and context.
- Experience analyzing live chat/call conversations with agents.
- Expertise in Python, SQL, PySpark, Scala and/or other languages and tools.
- Understanding of validation frameworks for generative model output, and a perspective on future-ready systems to scale validation.
- Familiarity with GPU/CPU architecture, distributed computing, and the general infrastructure needed to scale Gen AI models.
- Ability to provide technical thought leadership on model architecture, delivery, monitoring, measurement and model lifecycle best practices.

Even better if you have one or more of the following:
- PhD or an advanced degree or specialization in Artificial Intelligence.

If Verizon and this role sound like a fit for you, we encourage you to apply even if you don't meet every "even better" qualification listed above.

Where you'll be working
In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

Scheduled Weekly Hours: 40

Equal Employment Opportunity
Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.

Posted 1 week ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

On-site


TomTom is currently seeking a Data Engineer to join our global Reporting, Analytics and Data Engineering team. In this role, you will contribute to delivering the most current, accurate, and detailed maps and location services for millions of drivers and users globally, supporting the advancement of autonomous driving. The Reporting, Analytics and Data Engineering team operates on a global scale, focusing on developing cutting-edge data products and services that provide comprehensive insights into map production efficiency, quality, and coverage.

The impact you'll make
The Data Engineer has a strong software engineering skillset and a good understanding of the product and the systems her team owns. She consistently delivers small to medium product improvements with little to no guidance and contributes to improving the operational aspects of her team's systems. The Data Engineer estimates effort and identifies risks with reasonable accuracy; furthermore, she actively participates in priority decisions and in solution and system designs beyond the scope of her own deliverables. The Data Engineer takes ownership of her growth and seeks opportunities to work outside of her comfort zone. She mentors more junior engineers selflessly and knows that success happens through the effectiveness of the whole team, not individual heroics. She contributes constructively to the community of software engineers, primarily inside TomTom and occasionally outside.

In summary, you have/can:
- T-shaped skills, where software engineering fundamentals are coupled with deep knowledge of specific technologies
- Show accountable behavior for successful delivery of your own work
- Show accountable behavior for fit-for-purpose implementation of processes, policies and governance in the team
- Show leadership by taking ownership of and improving medium team-internal processes and ways of working, or by leading medium team-internal changes

What you'll need
- 2+ years of experience in software development, most of it in data engineering
- Proficiency in Python and Scala
- Strong knowledge of modern big data architecture and technologies, and experience supporting technologies like Spark, Hive, HBase, Databricks, Kafka and Unity Catalog
- Strong working experience in a DevOps environment, and a passion for CI/CD tools, Azure cloud computing and Azure Data Factory
- Knowledge of SQL and NoSQL databases
- Strong understanding of industry technology (analytics, monitoring, code deployment, system scalability, load balancers, web servers)
- The ability to make meaningful contributions to data engineering projects
- Strong English written and verbal communication skills, and the ability to communicate effectively in a global work culture
- The ability to drive issues to resolution through communication and collaboration
- A drive to provide solutions of high quality (e.g., logical, testable, maintainable, efficient, documented)
- Responsibility and accountability for individual work or work done in pair programming
- The ability to identify risks and raise them in a common forum
- A commitment to continuously learning and sharing knowledge; being honest and transparent is key

What we offer
- A competitive compensation package, of course.
- Time and resources to grow and develop, including a personal development budget and paid leave for learning days, as well as paid access to e-learning resources such as O'Reilly and LinkedIn Learning.
- Time to support life outside of work, with enhanced parental leave plus paid leave to care for loved ones and volunteer in local communities.
- Work flexibility, where TomTom'ers, in agreement with their manager and team, use both the office and home to focus, collaborate, learn and socialize. It's all about getting the best out of both worlds; we ask TomTom'ers to come to the office two days a week, and the remaining three are free to be worked in either location.
- Improve your home office with a setup budget and get extra support with a monthly allowance.
- Enjoy options to work from your home country and abroad for a set number of days each year, to visit family and friends, or to simply explore the world we're mapping.
- Take the holidays you want with a competitive holiday plan, plus an extra day off to celebrate your birthday.
- Join annual events like our Hackathon and DevDays to bring your ideas to life with talented teammates from around the world.
- Become a part of our inclusive global culture and have the chance to collaborate with a diverse community; we have over 80 nationalities at TomTom!
- Find out more about our global benefits and enjoy additional local benefits tailored to your location.

Meet your team
We're Maps, a global team within TomTom's Location Technology Products technical unit. Our team is driven to deliver the most up-to-date, accurate and detailed maps for hundreds of millions of users around the world. Joining our team, you'll continuously innovate our mapmaking processes, directly contributing to our vision: engineering the world's most trusted and useful map.

At TomTom...
You'll help people find their way in the world. In 2004, TomTom revolutionized how the world moves with the introduction of the first portable navigation device. Now, we intend to do it again by engineering the first-ever real-time map, the smartest and most useful map on the planet. Work with a team of 3,700 unique, curious and passionate problem-solvers. Together, we'll open up a world of possibilities for car manufacturers, enterprises and developers to help people understand and get closer to the world around them.

After you apply
Our recruitment team will work hard to give you a meaningful experience throughout your journey with us, no matter the outcome. Your application will be screened closely and you can rest assured that all follow-up actions will be thorough, from assessments and interviews all the way through onboarding. To find out more about our application process, check out our hiring FAQs.

TomTom is an equal opportunity employer
TomTom is where you can find your place in the world. Every day we welcome, nurture and celebrate differences. Why? Because your uniqueness is what makes you, you. No matter your culture or background, you'll find your impact at TomTom. Research also shows that sometimes women and underrepresented communities can be hesitant to apply for positions unless they believe they meet 100% of the criteria. If you can relate to this, please know that we'd love to hear from you.

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience.

Your Role and Responsibilities
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results

Preferred Education
Master's Degree

Required Technical and Professional Expertise
- Good hands-on experience with DBT is required; ETL (DataStage) and Snowflake are preferred
- Ability to use programming languages like Java, Python, Scala, etc. to build pipelines to extract and transform data from a repository to a data consumer
- Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop and Java

Preferred Technical and Professional Experience
- You thrive on teamwork and have excellent verbal and written communication skills
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
- Ability to communicate results to technical and non-technical audiences

Posted 1 week ago

Apply

12.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems.

Years of Experience: Candidates with 12+ years of hands-on experience

Position: Senior Manager

Required Skills: Successful candidates will have demonstrated the following skills and characteristics:

Must Have
- Deep expertise in AI/ML solution design, including supervised and unsupervised learning, deep learning, NLP, and optimization.
- Strong hands-on experience with ML/DL frameworks like TensorFlow, PyTorch, scikit-learn, H2O, and XGBoost.
- Solid programming skills in Python, PySpark, and SQL, with a strong foundation in software engineering principles.
- Proven track record of building end-to-end AI pipelines, including data ingestion, model training, testing, and production deployment.
- Experience with MLOps tools such as MLflow, Airflow, DVC, and Kubeflow for model tracking, versioning, and monitoring.
- Understanding of big data technologies like Apache Spark, Hive, and Delta Lake for scalable model development.
- Expertise in AI solution deployment across cloud platforms like GCP, AWS, and Azure using services like Vertex AI, SageMaker, and Azure ML.
- Experience in REST API development, NoSQL database design, and RDBMS design and optimization.
- Familiarity with API-based AI integration and containerization technologies like Docker and Kubernetes.
- Proficiency in data storytelling and visualization tools such as Tableau, Power BI, Looker, and Streamlit.
- Programming skills in Python and either Scala or R, with experience using Flask and FastAPI.
- Experience with software engineering practices, including use of GitHub, CI/CD, code testing, and analysis.
- Proficient in using AI/ML frameworks such as TensorFlow, PyTorch, and scikit-learn.
- Skilled in using Apache Spark, including PySpark and Databricks, for big data processing.
- Strong understanding of foundational data science concepts, including statistics, linear algebra, and machine learning principles.
- Knowledgeable in integrating DevOps, MLOps, and DataOps practices to enhance operational efficiency and model deployment.
- Experience with cloud infrastructure services like Azure and GCP.
- Proficiency in containerization technologies such as Docker and Kubernetes.
- Familiarity with observability and monitoring tools like Prometheus and the ELK stack, adhering to SRE principles and techniques.
- Cloud or Data Engineering certifications or specialization certifications (e.g. Google Professional Machine Learning Engineer, Microsoft Certified: Azure AI Engineer Associate - Exam AI-102, AWS Certified Machine Learning - Specialty (MLS-C01), Databricks Certified Machine Learning).

Nice To Have
- Experience implementing generative AI, LLMs, or advanced NLP use cases
- Exposure to real-time AI systems, edge deployment, or federated learning
- Strong executive presence and experience communicating with senior leadership or CXO-level clients

Roles And Responsibilities
- Lead and oversee complex AI/ML programs, ensuring alignment with business strategy and delivering measurable outcomes.
- Serve as a strategic advisor to clients on AI adoption, architecture decisions, and responsible AI practices.
- Design and review scalable AI architectures, ensuring performance, security, and compliance.
- Supervise the development of machine learning pipelines, enabling model training, retraining, monitoring, and automation.
- Present technical solutions and business value to executive stakeholders through impactful storytelling and data visualization.
- Build, mentor, and lead high-performing teams of data scientists, ML engineers, and analysts.
- Drive innovation and capability development in areas such as generative AI, optimization, and real-time analytics.
- Contribute to business development efforts, including proposal creation, thought leadership, and client engagements.
- Partner effectively with cross-functional teams to develop, operationalize, integrate, and scale new algorithmic products.
- Develop code, CI/CD, and MLOps pipelines, including automated tests, and deploy models to cloud compute endpoints.
- Manage cloud resources and build accelerators to enable other engineers, with experience working across two hyperscale clouds.
- Demonstrate effective communication skills, coaching and leading junior engineers, with a successful track record of building production-grade AI products for large organizations.

Professional And Educational Background
BE / B.Tech / MCA / M.Sc / M.E / M.Tech / Master's Degree / MBA from a reputed institute
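For candidates preparing for a role like this, here is a minimal sketch of the kind of end-to-end training pipeline the posting describes, written in Scala with Spark MLlib. The input path, column names, and model location are hypothetical stand-ins, not anything specified by the posting:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.feature.VectorAssembler

object ChurnTrainer {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("churn-trainer").getOrCreate()

    // Hypothetical input: a table with numeric feature columns and a binary label.
    val df = spark.read.parquet("/data/churn_features") // illustrative path

    // Assemble raw columns into the single vector column MLlib expects.
    val assembler = new VectorAssembler()
      .setInputCols(Array("tenure", "monthly_spend", "support_calls"))
      .setOutputCol("features")

    val lr = new LogisticRegression()
      .setLabelCol("label")
      .setFeaturesCol("features")

    // Chain preprocessing and the estimator, then fit the whole pipeline.
    val model = new Pipeline().setStages(Array(assembler, lr)).fit(df)

    model.write.overwrite().save("/models/churn_lr") // versioned path in practice
    spark.stop()
  }
}
```

In a production setup of the kind the responsibilities describe, the fit and save steps would typically be wrapped with experiment tracking (e.g. MLflow) and scheduled retraining.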

Posted 1 week ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description

We are seeking a highly skilled Senior Kafka and Scala Developer to join our dynamic team. The ideal candidate will have extensive experience in developing and maintaining scalable, high-performance systems using Kafka and Scala. You will be responsible for designing, implementing, and optimizing data pipelines and distributed systems.

Key Responsibilities
- Design, develop, and maintain scalable data pipelines using Kafka and Scala.
- Collaborate with cross-functional teams to gather requirements and deliver high-quality solutions.
- Optimize and tune Kafka clusters for performance and reliability.
- Implement best practices for data processing, storage, and retrieval.
- Troubleshoot and resolve issues related to Kafka and Scala applications.
- Mentor and guide junior developers in the team.

Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 6+ years of experience in software development with a strong focus on Kafka and Scala.
- Proficiency in designing and implementing distributed systems.
- Strong understanding of data structures, algorithms, and software design principles.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.

Preferred Skills
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with other big data technologies such as Hadoop, Spark, and Cassandra.
- Knowledge of containerization technologies like Docker and Kubernetes.
- Familiarity with CI/CD pipelines and DevOps practices.
- Understanding of machine learning and data analytics.
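As a flavour of the day-to-day work such a role involves, here is a minimal Kafka producer written in Scala against the standard Kafka Java client API. The broker address, topic name, and payload are placeholders:

```scala
import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

object EventPublisher {
  def main(args: Array[String]): Unit = {
    val props = new Properties()
    props.put("bootstrap.servers", "localhost:9092") // placeholder broker address
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    props.put("acks", "all") // wait for full ISR acknowledgement for durability

    val producer = new KafkaProducer[String, String](props)
    try {
      val record = new ProducerRecord[String, String]("events", "user-42", """{"action":"login"}""")
      producer.send(record).get() // synchronous send for the example; use callbacks in production
    } finally {
      producer.close()
    }
  }
}
```

Settings such as `acks`, batching, and retries are exactly the kind of producer/cluster tuning the responsibilities above refer to.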

Posted 1 week ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Teamwork makes the stream work.

Roku is changing how the world watches TV. Roku is the #1 TV streaming platform in the U.S., Canada, and Mexico, and we've set our sights on powering every television in the world. Roku pioneered streaming to the TV. Our mission is to be the TV streaming platform that connects the entire TV ecosystem. We connect consumers to the content they love, enable content publishers to build and monetize large audiences, and provide advertisers unique capabilities to engage consumers.

From your first day at Roku, you'll make a valuable - and valued - contribution. We're a fast-growing public company where no one is a bystander. We offer you the opportunity to delight millions of TV streamers around the world while gaining meaningful experience across a variety of disciplines.

About the team

Roku runs one of the largest data lakes in the world. We store over 70 PB of data, run 10+ million queries per month, and scan over 100 PB of data per month. The Big Data team is responsible for building, running, and supporting the platform that makes this possible. We provide all the tools needed to acquire, generate, process, monitor, validate, and access the data in the lake, for both streaming and batch data. We are also responsible for generating the foundational data. The systems we provide include Scribe, Kafka, Hive, Presto, Spark, Flink, Pinot, and others. The team is actively involved in Open Source, and we are planning to increase our engagement over time.

About the Role

Roku is in the process of modernizing its Big Data Platform. We are working on defining the new architecture to improve user experience, minimize cost, and increase efficiency. Are you interested in helping us build this state-of-the-art big data platform? Are you an expert in Big Data technologies? Have you looked under the hood of these systems? Are you interested in Open Source? If you answered "Yes" to these questions, this role is for you!

What you will be doing
- You will be responsible for streamlining and tuning existing Big Data systems and pipelines and building new ones. Making sure the systems run efficiently and at minimal cost is a top priority.
- You will be making changes to the underlying systems, and if an opportunity arises, you can contribute your work back into open source.
- You will also be responsible for supporting internal customers and providing on-call service for the systems we host. Providing a stable environment and a great user experience is another top priority for the team.

We are excited if you have
- 7+ years of production experience building big data platforms based upon Spark, Trino, or equivalent
- Strong programming expertise in Java, Scala, Kotlin, or another JVM language
- A robust grasp of distributed systems concepts, algorithms, and data structures
- Strong familiarity with the Apache Hadoop ecosystem: Spark, Kafka, Hive/Iceberg/Delta Lake, Presto/Trino, Pinot, etc.
- Experience working with at least 3 of these technologies/tools: Big Data / Hadoop, Kafka, Spark, Trino, Flink, Airflow, Druid, Hive, Iceberg, Delta Lake, Pinot, Storm, etc.
- Extensive hands-on experience with a public cloud, AWS or GCP
- BS/MS degree in CS or equivalent

Benefits

Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive benefits include global access to mental health and financial wellness support and resources. Local benefits include statutory and voluntary benefits which may include healthcare (medical, dental, and vision), life, accident, disability, commuter, and retirement options (401(k)/pension). Our employees can take time off work for vacation and other personal reasons to balance their evolving work and life needs. It's important to note that not every benefit is available in all locations or for every role. For details specific to your location, please consult with your recruiter.

The Roku Culture

Roku is a great place for people who want to work in a fast-paced environment where everyone is focused on the company's success rather than their own. We try to surround ourselves with people who are great at their jobs, who are easy to work with, and who keep their egos in check. We appreciate a sense of humor. We believe a fewer number of very talented folks can do more for less cost than a larger number of less talented teams. We're independent thinkers with big ideas who act boldly, move fast, and accomplish extraordinary things through collaboration and trust. In short, at Roku you'll be part of a company that's changing how the world watches TV.

We have a unique culture that we are proud of. We think of ourselves primarily as problem-solvers, which itself is a two-part idea. We come up with the solution, but the solution isn't real until it is built and delivered to the customer. That penchant for action gives us a pragmatic approach to innovation, one that has served us well since 2002.

To learn more about Roku, our global footprint, and how we've grown, visit https://www.weareroku.com/factsheet.

By providing your information, you acknowledge that you have read our Applicant Privacy Notice and authorize Roku to process your data subject to those terms.
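To illustrate the kind of batch workload a data-lake platform team like this supports, here is a small Spark aggregation job in Scala. The table and column names are invented for the example and do not reflect Roku's actual schemas:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object DailyQueryStats {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("daily-query-stats").getOrCreate()
    import spark.implicits._

    // Hypothetical query-log table maintained by the platform.
    val logs = spark.read.table("lake.query_logs")

    // Per-engine usage for today: query counts and bytes scanned.
    val stats = logs
      .where($"event_date" === current_date())
      .groupBy($"engine") // e.g. presto, spark, flink
      .agg(count("*").as("queries"), sum($"bytes_scanned").as("bytes_scanned"))

    stats.write.mode("overwrite").saveAsTable("lake.daily_query_stats")
    spark.stop()
  }
}
```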

Posted 1 week ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


The HiLabs Story

HiLabs is a leading provider of AI-powered solutions to clean dirty data, unlocking its hidden potential for healthcare transformation. HiLabs is committed to transforming the healthcare industry through innovation, collaboration, and a relentless focus on improving patient outcomes.

HiLabs Team

Multidisciplinary industry leaders, healthcare domain experts, and AI/ML and data science experts: professionals hailing from the world's best universities, business schools, and engineering institutes, including Harvard, Yale, Carnegie Mellon, Duke, Georgia Tech, Indian Institute of Management (IIM), and Indian Institute of Technology (IIT).

As a Data Scientist at HiLabs, you will partner closely with a team of stakeholders, product managers, and data engineers to solve real problems in the healthcare domain. HiLabs is looking for great problem solvers who can tap into the potential of data and deliver scalable and robust data products. You would work on developing and implementing machine learning algorithms for various HiLabs products and solutions.

Responsibilities
- Leverage AI/ML techniques and solutions to identify and mathematically interpret complex healthcare problems.
- Deploy and optimize machine learning solutions on massive datasets using big data technologies.
- Develop and prototype AI algorithms and software tools.
- Implement enhancements to existing algorithmic/deep learning solutions.
- Conduct quantitative data analysis using a variety of datasets.
- Increase the efficiency and improve the quality of solutions offered.
- Understand business use cases and pull the necessary data from various sources to provide key insights to stakeholders.
- Build and deploy additional functionalities as per client requirements.
- Leverage recent advances in machine learning technologies and oversee the team's training.
- For Lead/Sr. Data Scientists: lead a team of data scientists, developers, and clinicians to strategize, design, and evaluate AI-based solutions to healthcare problems.

Desired Profile
- Bachelor's/Master's degree in Computer Science, Mathematics, Statistics, Physics, Electrical Engineering, Computer Engineering, or related fields from a tier 1 college.
- Hands-on software development skills (Scala preferred).
- Experience or educational courses/projects in machine learning and/or text mining algorithms.
- Knowledge of statistical techniques, linear algebra, and numerical optimization.
- Knowledge of ML theory such as the bias-variance tradeoff, regularization, and loss functions, and experience with A/B testing.
- Knowledge of the general best practices of machine learning.
- Ability to work closely with domain experts to develop the tools/algorithms needed to answer research questions in their studies.
- Excellent communication skills (with the ability to explain developed tools and ML algorithms to a non-technical audience).
- Ability to formulate operational problems in the healthcare domain as technical problems in a way that allows reuse of leading research in the area.
- Proven ability to work independently to learn new technologies, techniques, processes, languages, platforms, and systems.
- Strong analytical, inferential, critical thinking, and creative problem-solving skills.
- Self-starter with the ability to work both independently and with a team.

Preferred Qualifications
- Experience with deep learning frameworks such as Keras, TensorFlow, PyTorch, MXNet, etc.
- Experience with interpretability of deep learning models.
- Big data skills (Hadoop, Spark, recent deep learning platforms).
- Experience with text mining tools and techniques, including summarization, search (e.g. ELK Stack), entity extraction, training set generation (e.g. Snorkel), and anomaly detection.
- Expert software development skills, including developing and maintaining production-quality code.

HiLabs Total Rewards

Competitive salary, accelerated incentive policies, H1B sponsorship, and a comprehensive benefits package that includes ESOPs, financial contribution for your ongoing professional and personal development, medical coverage for you and your loved ones, 401k, PTOs, a collaborative working environment, smart mentorship, and highly qualified, multidisciplinary, incredibly talented professionals from highly renowned and accredited medical schools, business schools, and engineering institutes.

CCPA disclosure notice - https://www.hilabs.com/privacy

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


A career in our Advisory Acceleration Centre is the natural extension of PwC's leading-class global delivery capabilities. We provide premium, cost-effective, high-quality services that support process quality and delivery capability for client engagements.

Years of Experience: Candidates with 4-8 years of hands-on experience

Position Requirements

Must Have:
- Experience in architecting and delivering highly scalable, distributed, cloud-based enterprise data solutions
- Strong expertise in end-to-end implementation of cloud data engineering solutions such as enterprise data lakes and data hubs in AWS
- Proficient in Lambda or Kappa architectures
- Awareness of data management concepts and data modelling
- Strong AWS hands-on expertise with a programming background, preferably Python/Scala
- Good knowledge of Big Data frameworks and related technologies - experience in Hadoop and Spark is mandatory
- Strong experience in AWS compute services like AWS EMR and Glue, and storage services like S3, Redshift, and DynamoDB
- Good experience with any one of the AWS streaming services: AWS Kinesis, AWS SQS, or AWS MSK
- Troubleshooting and performance tuning experience in the Spark framework - Spark Core, Spark SQL, and Spark Streaming
- Strong understanding of the DBT ELT tool and usage of DBT macros
- Good knowledge of application DevOps tools (Git, CI/CD frameworks) - experience in Jenkins or GitLab, with rich experience in source code management like CodePipeline, CodeBuild, and CodeCommit
- Experience with AWS CloudWatch, AWS CloudTrail, AWS Account Config, and AWS Config Rules
- Good knowledge of AWS security and AWS key management
- Strong understanding of cloud data migration processes, methods, and the project lifecycle
- Good analytical and problem-solving skills
- Good communication and presentation skills

Education: Any Graduate.
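A minimal sketch of the kind of EMR/Spark curation job this posting describes, in Scala. The bucket names and columns are hypothetical; on EMR, S3 credentials would come from the cluster's instance role rather than the code:

```scala
import org.apache.spark.sql.SparkSession

object S3OrdersCuration {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("s3-orders-curation").getOrCreate()

    // Hypothetical raw-zone bucket and prefix.
    val raw = spark.read.json("s3://example-raw-zone/orders/")

    // Basic cleansing: drop duplicate orders and invalid totals.
    val curated = raw
      .dropDuplicates("order_id")
      .filter("order_total > 0")

    // Partitioned parquet in the curated zone, ready for Redshift COPY or Spectrum.
    curated.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://example-curated-zone/orders/")

    spark.stop()
  }
}
```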

Posted 1 week ago

Apply

12.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Description

Context: In the modern banking age, financial institutions need to bring classical data drivers and evolving business drivers together in a single platform. These drivers also need to communicate with each other and share data products for enterprise consumption. Traditional data platforms handle classical data drivers well but fail to communicate with evolving business drivers due to limitations of technologies and implementation approaches. A Modern Data Platform helps to fill this gap and takes the business to the next level of growth and expansion using data-driven approaches. The technology transformation of recent years makes such implementations feasible.

Your Opportunity

You will be responsible for leading the Modern Data Platform Practice, which involves providing solutions to customers on traditional datawarehouses and on platforms on-prem and on cloud. It covers architecting the data platforms, defining data engineering design, and choosing appropriate technology and tools across on-prem and cloud services. You will help the organization strengthen its Modern Data Platform capabilities, lead pre-sales discussions on data platforms, provide the technology architecture in RFP responses, and lead technology POCs/MVPs.

Your Qualifications

We expect you to have the following qualifications and experience to effectively perform the suggested role:
- A technology leader with an engineering academic background in Computer Science / Information Technology / Data Technologies [BE/BTech/MCA]
- Overall 12-16 years of data engineering and analytics experience, as an individual contributor as well as a technology/architecture lead
- A minimum of 5-7 years of hands-on experience in Big Data systems across on-prem and cloud environments
- Led data platform architecture and design projects for mid- to large-size firms
- Experience implementing batch data and streaming/online data integrations using 3rd-party tools and custom programs
- Good hands-on experience with SQL and one of the programming languages: Core Java / Scala / Python
- Good hands-on experience with Kafka for enabling event-driven data pipes/processing
- Knowledge of leading data services offered by AWS, Azure, Snowflake, and Confluent
- Thorough understanding of distributed computing and related data structures
- Implemented data governance and quality capabilities for a data platform (on-prem and/or cloud)
- Good analytical and presentation skills
- Experience in building a team from the ground up
- Good exposure to leading RDBMS technologies and data visualization platforms
- Demonstrated AI/ML models for data processing and generating insights for end users
- A great teammate, with the ability to work on their own initiative with minimal direction

Career Level - IC4

Responsibilities

Diversity and Inclusion: An Oracle career can span industries, roles, countries, and cultures, giving you the opportunity to flourish in new roles and innovate while blending work and life. Oracle has thrived through 40+ years of change by innovating and operating with integrity while delivering for the top companies in almost every industry. To nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, and a workforce that inspires thought leadership and innovation.

Oracle offers a highly competitive suite of employee benefits designed on the principles of parity, consistency, and affordability. The overall package includes certain core elements such as medical coverage, life insurance, access to retirement planning, and much more. We also encourage our employees to engage in the culture of giving back to the communities where we live and do business.

At Oracle, we believe that innovation starts with diversity and inclusion, and to create the future we need talent from various backgrounds, perspectives, and abilities. We ensure that individuals with disabilities are provided reasonable accommodation to successfully participate in the job application and interview process, and to perform crucial job functions. That's why we're committed to creating a workforce where all individuals can do their best work. It's when everyone's voice is heard and valued that we're inspired to go beyond what's been done before.

About Us

As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector - and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
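As an illustration of the event-driven ingestion called out in the qualifications, here is a minimal Spark Structured Streaming job in Scala that lands Kafka events in a data lake. The broker, topic, and paths are placeholders:

```scala
import org.apache.spark.sql.SparkSession

object KafkaToLake {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("kafka-to-lake").getOrCreate()

    // Placeholder broker and topic names.
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker-1:9092")
      .option("subscribe", "account-events")
      .option("startingOffsets", "latest")
      .load()

    // Kafka delivers binary key/value; cast to strings for downstream parsing.
    val parsed = events.selectExpr(
      "CAST(key AS STRING) AS key",
      "CAST(value AS STRING) AS payload",
      "timestamp")

    // Checkpointing gives exactly-once file output across restarts.
    val query = parsed.writeStream
      .format("parquet")
      .option("path", "/lake/raw/account_events")
      .option("checkpointLocation", "/lake/_chk/account_events")
      .start()

    query.awaitTermination()
  }
}
```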

Posted 1 week ago

Apply

9.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Location: Pune

About Team & About Role

As a Senior Software Engineer (SSE) in the Continuous Product Development (CPD) team, you will play a key role in leading team(s) towards owning the roadmap, providing long-term stability, and delivering delight to our enterprise customers. You will work closely with leadership and multiple stakeholders from other engineering teams and the Product and Support organizations. You will be working across Rubrik releases on our on-premise data backup and SaaS offerings.

You are expected to develop a strong understanding of our product and engineering architecture, such as our distributed job framework, data lifecycle management, filesystem, and metadata store. We are seeking a highly skilled senior engineer to join our team. You will be responsible for developing and maintaining high-performance software applications. You should have strong programming and troubleshooting skills, excellent design skills, and an understanding of distributed systems. You should be able to work independently and as part of a team. An understanding of the storage domain is preferred, but not necessary.

Rubrik SSEs are self-starters, driven, and can manage themselves. We believe in giving engineers responsibility, not tasks. Our goal is to motivate and challenge you to do your best work by empowering you to make your own decisions. To do that, we have a very transparent structure and give people freedom to exercise their judgment, even in critical scenarios. This develops more capable engineers and keeps everyone engaged and happy, ultimately leading to customer delight.

Key Responsibilities
- Design, develop, and maintain high-quality software applications and libraries using the C++, Scala, and Go programming languages.
- Troubleshoot complex software problems in a timely and accurate manner.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Write and maintain technical documentation for software systems and applications.
- Participate in code reviews and ensure adherence to coding standards.
- Continuously improve software quality through process improvement initiatives.
- Keep up to date with emerging trends in software development.

Requirements
- B.Tech/M.Tech with 9-13 years of experience.
- Strong programming, problem-solving, and troubleshooting skills.
- Language skills: C++, Scala/Java, or C/Go, with an understanding of OOP.
- Excellent design skills.
- Understanding of distributed systems and multi-threading/concurrency concepts.
- Preferably, a good understanding of the storage domain.
- Preferably, a strong background in the object-oriented paradigm.
- Good knowledge of data structures, algorithms, and design patterns.
- Good understanding of networking protocols and security concepts.
- Good knowledge of software development methodologies, tools, and processes.
- Strong communication skills and the ability to work in a team environment.

Join Us in Securing the World's Data

Rubrik (NYSE: RBRK) is on a mission to secure the world's data. With Zero Trust Data Security™, we help organizations achieve business resilience against cyberattacks, malicious insiders, and operational disruptions. Rubrik Security Cloud, powered by machine learning, secures data across enterprise, cloud, and SaaS applications. We help organizations uphold data integrity, deliver data availability that withstands adverse conditions, continuously monitor data risks and threats, and restore businesses with their data when infrastructure is attacked.

Linkedin | X (formerly Twitter) | Instagram | Rubrik.com

Inclusion @ Rubrik

At Rubrik, we are dedicated to fostering a culture where people from all backgrounds are valued, feel they belong, and believe they can succeed. Our commitment to inclusion is at the heart of our mission to secure the world's data. Our goal is to hire and promote the best talent, regardless of background. We continually review our hiring practices to ensure fairness and strive to create an environment where every employee has equal access to opportunities for growth and excellence. We believe in empowering everyone to bring their authentic selves to work and achieve their fullest potential.

Our inclusion strategy focuses on three core areas of our business and culture:
- Our Company: We are committed to building a merit-based organization that offers equal access to growth and success for all employees globally. Your potential is limitless here.
- Our Culture: We strive to create an inclusive atmosphere where individuals from all backgrounds feel a strong sense of belonging, can thrive, and do their best work. Your contributions help us innovate and break boundaries.
- Our Communities: We are dedicated to expanding our engagement with the communities we operate in, creating opportunities for underrepresented talent and driving greater innovation for our clients. Your impact extends beyond Rubrik, contributing to safer and stronger communities.

Equal Opportunity Employer/Veterans/Disabled

Rubrik is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or protected veteran status and will not be discriminated against on the basis of disability. Rubrik provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, national origin, age, disability, or genetics. In addition to federal law requirements, Rubrik complies with applicable state and local laws governing nondiscrimination in employment in every location in which the company has facilities. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.

Federal law requires employers to provide reasonable accommodation to qualified individuals with disabilities. Please contact us at hr@rubrik.com if you require a reasonable accommodation to apply for a job or to perform your job. Examples of reasonable accommodation include making a change to the application process or work procedures, providing documents in an alternate format, using a sign language interpreter, or using specialized equipment.

EEO IS THE LAW
NOTIFICATION OF EMPLOYEE RIGHTS UNDER FEDERAL LABOR LAWS
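As a taste of the multi-threading/concurrency concepts the requirements call out, here is a small sketch using Scala Futures to fan out independent work and gather the results. The scanChunk function is a hypothetical stand-in for real work such as reading and verifying a metadata chunk:

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

object ParallelChunkScan {
  // Hypothetical stand-in: each chunk is scanned independently.
  def scanChunk(id: Int): Future[Int] = Future {
    id * 2 // real work would happen here
  }

  def main(args: Array[String]): Unit = {
    // Fan out the scans concurrently, then gather into one Future of all results.
    val results: Future[List[Int]] = Future.traverse((1 to 8).toList)(scanChunk)

    // Block only at the program edge; composition stays non-blocking.
    println(Await.result(results, 10.seconds))
  }
}
```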

Posted 1 week ago

Apply

3.0 - 10.0 years

0 Lacs

Greater Kolkata Area

On-site


Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation, and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients run their business more effectively and understand what business questions can be answered and how to unlock the answers.

Why PwC

At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

Responsibilities:
- Analyse current business practices, processes, and procedures, and identify future business opportunities for leveraging Microsoft Azure Data & Analytics Services.
- Provide technical leadership and thought leadership as a senior member of the Analytics Practice in areas such as data access & ingestion, data processing, data integration, data modeling, database design & implementation, data visualization, and advanced analytics.
- Engage and collaborate with customers to understand business requirements/use cases and translate them into detailed technical specifications.
- Develop best practices including reusable code, libraries, patterns, and consumable frameworks for cloud-based data warehousing and ETL.
- Maintain best-practice standards for the development of cloud-based data warehouse solutions, including naming standards.
- Design and implement highly performant data pipelines from multiple sources using Apache Spark and/or Azure Databricks.
- Integrate the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is always maintained.
- Work with other members of the project team to support delivery of additional project components (API interfaces).
- Evaluate the performance and applicability of multiple tools against customer requirements.
- Work within an Agile delivery / DevOps methodology to deliver proof of concept and production implementation in iterative sprints.
- Integrate Databricks with other technologies (ingestion tools, visualization tools).

Candidate profile:
- Proven experience working as a data engineer.
- Highly proficient in using the Spark framework (Python and/or Scala).
- Extensive knowledge of data warehousing concepts, strategies, and methodologies.
- Direct experience of building data pipelines using Azure Data Factory and Apache Spark (preferably in Databricks).
- Hands-on experience designing and delivering solutions using Azure, including Azure Storage, Azure SQL Data Warehouse, Azure Data Lake, Azure Cosmos DB, and Azure Stream Analytics.
- Experience in designing and hands-on development of cloud-based analytics solutions.
- Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required.
- Designing and building data pipelines using API ingestion and streaming ingestion methods.
- Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential.
- Thorough understanding of Azure cloud infrastructure offerings.
- Strong experience in common data warehouse modelling principles, including Kimball.
- Working knowledge of Python is desirable.
- Experience developing security models.
- Databricks & Azure Big Data Architecture Certification would be a plus.
- Must be team oriented with strong collaboration, prioritization, and adaptability skills.

Mandatory skill sets: Azure Databricks
Preferred skill sets: Azure Databricks
Years of experience required: 3-10 years
Education qualification: BE, B.Tech, MCA, M.Tech
Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration
Required skills: Microsoft Azure Databricks
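A minimal sketch of the kind of Databricks pipeline this role describes: reading raw files from ADLS and appending to a Delta table, in Scala. The storage account, container, and table names are invented for the example, and authentication is assumed to come from the workspace configuration:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object AdlsToDelta {
  def main(args: Array[String]): Unit = {
    // On Databricks a SparkSession is provided; the builder is shown for completeness.
    val spark = SparkSession.builder.appName("adls-to-delta").getOrCreate()

    // Placeholder container/account names in the abfss path.
    val raw = spark.read
      .option("header", "true")
      .csv("abfss://raw@examplelake.dfs.core.windows.net/sales/")

    // Light cleansing plus an ingestion timestamp for lineage.
    val cleaned = raw
      .withColumn("ingest_ts", current_timestamp())
      .na.drop(Seq("order_id"))

    // Delta gives ACID appends and time travel on the curated table.
    cleaned.write
      .format("delta")
      .mode("append")
      .saveAsTable("curated.sales")

    spark.stop()
  }
}
```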

Posted 1 week ago

Apply

3.0 - 10.0 years

0 Lacs

Greater Kolkata Area

On-site


Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation, and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients run their business more effectively and understand what business questions can be answered and how to unlock the answers.

Why PwC

At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:
- Analyse current business practices, processes, and procedures, and identify future business opportunities for leveraging Microsoft Azure Data & Analytics Services.
- Provide technical leadership and thought leadership as a senior member of the Analytics Practice in areas such as data access & ingestion, data processing, data integration, data modeling, database design & implementation, data visualization, and advanced analytics.
- Engage and collaborate with customers to understand business requirements/use cases and translate them into detailed technical specifications.
- Develop best practices including reusable code, libraries, patterns, and consumable frameworks for cloud-based data warehousing and ETL.
- Maintain best-practice standards for the development of cloud-based data warehouse solutions, including naming standards.
- Design and implement highly performant data pipelines from multiple sources using Apache Spark and/or Azure Databricks.
- Integrate the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is always maintained.
- Work with other members of the project team to support delivery of additional project components (API interfaces).
- Evaluate the performance and applicability of multiple tools against customer requirements.
- Work within an Agile delivery / DevOps methodology to deliver proof of concept and production implementation in iterative sprints.
- Integrate Databricks with other technologies (ingestion tools, visualization tools).

Candidate profile:
- Proven experience working as a data engineer.
- Highly proficient in using the Spark framework (Python and/or Scala).
- Extensive knowledge of data warehousing concepts, strategies, and methodologies.
- Direct experience of building data pipelines using Azure Data Factory and Apache Spark (preferably in Databricks).
- Hands-on experience designing and delivering solutions using Azure, including Azure Storage, Azure SQL Data Warehouse, Azure Data Lake, Azure Cosmos DB, and Azure Stream Analytics.
- Experience in designing and hands-on development of cloud-based analytics solutions.
- Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required.
- Designing and building data pipelines using API ingestion and streaming ingestion methods.
- Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential.
- Thorough understanding of Azure cloud infrastructure offerings.
- Strong experience in common data warehouse modeling principles, including Kimball.
- Working knowledge of Python is desirable.
- Experience developing security models.
- Databricks & Azure Big Data Architecture Certification would be a plus.
- Must be team oriented with strong collaboration, prioritization, and adaptability skills.

Mandatory skill sets: ADE, ADB, ADF
Preferred skill sets: ADE, ADB, ADF
Years of experience required: 3-10 years
Education qualification: BE, B.Tech, MCA, M.Tech
Degrees/Field of Study required: Bachelor of Engineering, Bachelor of Technology, Master of Engineering
Required skills: Azure Data Factory, Data Engineering, Microsoft Azure Databricks

Posted 1 week ago

Apply

Exploring Scala Jobs in India

Scala is a popular programming language that is widely used in India, especially in the tech industry. Job seekers looking for opportunities in Scala can find a variety of roles across different cities in the country. In this article, we will dive into the Scala job market in India and provide valuable insights for job seekers.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

These cities are known for their thriving tech ecosystem and have a high demand for Scala professionals.

Average Salary Range

The salary range for Scala professionals in India varies based on experience levels. Entry-level Scala developers can expect to earn around INR 6-8 lakhs per annum, while experienced professionals with 5+ years of experience can earn upwards of INR 15 lakhs per annum.

Career Path

In the Scala job market, a typical career path may look like:

  1. Junior Developer
  2. Scala Developer
  3. Senior Developer
  4. Tech Lead

As professionals gain more experience and expertise in Scala, they can progress to higher roles with increased responsibilities.

Related Skills

In addition to Scala expertise, employers often look for candidates with the following skills:

  • Java
  • Spark
  • Akka
  • Play Framework
  • Functional programming concepts

Having a good understanding of these related skills can enhance a candidate's profile and increase their chances of landing a Scala job.

Interview Questions

Here are 25 interview questions that you may encounter when applying for Scala roles; a few of the recurring topics are illustrated with short code sketches after the list:

  • What is Scala and why is it used? (basic)
  • Explain the difference between val and var in Scala. (basic)
  • What is pattern matching in Scala? (medium)
  • What are higher-order functions in Scala? (medium)
  • How does Scala support functional programming? (medium)
  • What is a case class in Scala? (basic)
  • Explain the concept of currying in Scala. (advanced)
  • What is the difference between map and flatMap in Scala? (medium)
  • How does Scala handle null values? (medium)
  • What is a trait in Scala and how is it different from an abstract class? (medium)
  • Explain the concept of implicits in Scala. (advanced)
  • What is the Akka toolkit and how is it used in Scala? (medium)
  • How does Scala handle concurrency? (advanced)
  • Explain the concept of lazy evaluation in Scala. (advanced)
  • What is the difference between List and Seq in Scala? (medium)
  • How does Scala handle exceptions? (medium)
  • What are Futures in Scala and how are they used for asynchronous programming? (advanced)
  • Explain the concept of type inference in Scala. (medium)
  • What is the difference between object and class in Scala? (basic)
  • How can you create a Singleton object in Scala? (basic)
  • What is a higher-kinded type in Scala? (advanced)
  • Explain the concept of for-comprehensions in Scala. (medium)
  • How does Scala support immutability? (medium)
  • What are the advantages of using Scala over Java? (basic)
  • How do you implement pattern matching in Scala? (medium)
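To make the preparation concrete, here are two short, self-contained sketches covering several of the recurring topics above (case classes, pattern matching, val vs var, map vs flatMap, and for-comprehensions). They are illustrative examples rather than canonical interview answers, and all names in them are made up.

```scala
// Case classes, a sealed trait, and pattern matching.
sealed trait Shape
case class Circle(radius: Double) extends Shape
case class Rect(width: Double, height: Double) extends Shape

object PatternMatchDemo {
  def area(s: Shape): Double = s match {
    case Circle(r)       => math.Pi * r * r // the extractor binds the radius
    case Rect(w, h)      => w * h
  }

  def main(args: Array[String]): Unit = {
    val shapes = List(Circle(1.0), Rect(2.0, 3.0)) // val: the reference is immutable
    println(shapes.map(area))  // map: exactly one output element per input
    println(shapes.flatMap {   // flatMap: each input may yield many outputs,
      case Circle(r)  => List(r)          // and the nested lists are flattened
      case Rect(w, h) => List(w, h)
    })
  }
}
```

A for-comprehension is syntactic sugar for flatMap/map, which is why it works uniformly over Lists, Options, and Futures. With Option it short-circuits on the first None:

```scala
object ForComprehensionDemo extends App {
  def parse(s: String): Option[Int] = s.toIntOption // Scala 2.13+

  val sum = for {
    a <- parse("2")
    b <- parse("40")
  } yield a + b

  println(sum) // Some(42)
  println(for { a <- parse("2"); b <- parse("oops") } yield a + b) // None
}
```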

Closing Remark

As you explore Scala jobs in India, remember to showcase your expertise in Scala and related skills during interviews. Prepare well, stay confident, and you'll be on your way to a successful career in Scala. Good luck!


