5.0 - 8.0 years
7 - 10 Lacs
Mumbai, Nagpur, Thane
Work from Office
First HM needs strong Java candidates (5-8 years' experience) who can pick up Scala on the job. Profiles are needed ASAP.
Requisition ID: YTR | P tracker ID: YTR | HM: Feng Chen | Location: Bangalore | Skill: Strong Java / Strong Java with Scala | Level: Level 3
Second HM needs SQL/Python resources as described below. Note: this requirement has been filled.
Requisition ID: YTR | P tracker ID: YTR | HM: Akshay Deodhar | Location: Bangalore | Skill: RDBMS, Python | Level: Level 2
The HM has updated the job description. Please note you can submit profiles for two different skill sets. Finding all of these skills in one profile would be extraordinary; if not, please look for one Java/Scala or strong Java developer profile, and another SQL/Python developer profile.
First HM needs strong Java candidates (5-8 years' experience) who can pick up Scala on the job.
Second HM needs SQL/Python resources as described below. Sharing the JD:
Exposure to an RDBMS platform (writing SQL, stored procedures, data warehousing concepts, etc.)
Hands-on experience in Python.
Good to have: big data exposure (Spark and Hadoop concepts).
Good to have: Azure cloud exposure (Databricks or Snowflake).
Overall job experience of 3-6 years is fine.
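Since the second requisition combines RDBMS SQL skills with optional Spark exposure, here is a minimal sketch of that skill mix in Scala (the first requisition's target language): a Spark batch job that reads an RDBMS table over JDBC and reproduces a stored-procedure-style rollup. All connection details, table names, and column names are hypothetical.

```scala
import org.apache.spark.sql.SparkSession

object OrdersDailyRollup {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("orders-daily-rollup")
      .getOrCreate()

    // Read a table from the RDBMS over JDBC; host, database, and table are placeholders.
    val orders = spark.read
      .format("jdbc")
      .option("url", "jdbc:postgresql://db-host:5432/sales") // hypothetical host/db
      .option("dbtable", "public.orders")                    // hypothetical table
      .option("user", sys.env("DB_USER"))
      .option("password", sys.env("DB_PASSWORD"))
      .load()

    // The same aggregation a stored procedure might do, expressed in Spark SQL.
    orders.createOrReplaceTempView("orders")
    val daily = spark.sql(
      """SELECT order_date, COUNT(*) AS order_count, SUM(amount) AS revenue
        |FROM orders
        |GROUP BY order_date""".stripMargin)

    // Land the result as Parquet for downstream warehouse loads.
    daily.write.mode("overwrite").parquet("/data/rollups/orders_daily")
    spark.stop()
  }
}
```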
Posted 1 week ago
5.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
What You’ll Do
Design, develop, and operate high-scale applications across the full engineering stack.
Design, develop, test, deploy, maintain, and improve software.
Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset.
Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
Participate in a tight-knit, globally distributed engineering team.
Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality.
Research, create, and develop software applications to extend and improve Equifax solutions.
Manage sole project priorities, deadlines, and deliverables.
Collaborate on scalability issues involving access to data and information.
Actively participate in Sprint planning, Sprint retrospectives, and other team activities.

What Experience You Need
Bachelor's degree or equivalent experience
5+ years of software engineering experience
5+ years of experience writing, debugging, and troubleshooting code in mainstream Java, Spring Boot, TypeScript/JavaScript, HTML, and CSS
5+ years of experience with cloud technology: GCP, AWS, or Azure
5+ years of experience designing and developing cloud-native solutions
5+ years of experience designing and developing microservices using Java, Spring Framework, GCP SDKs, and GKE/Kubernetes
5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs
Big data technologies: Spark/Scala/Hadoop

What Could Set You Apart
Experience designing and developing big data processing solutions using Dataproc, Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others
Cloud certification, especially in GCP
A self-starter who identifies and responds to priority shifts with minimal supervision
Excellent leadership and motivational skills
An inquisitive and innovative mindset with a proven ability to recognize opportunities to create distinctive value
The ability to evaluate workload to drive efficiency
Posted 1 week ago
4.0 - 9.0 years
10 - 15 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
About Us: KPI Partners is a leading provider of data analytics and performance management solutions, dedicated to helping organizations harness the power of their data to drive business success. Our team of experts is at the forefront of the data revolution, delivering innovative solutions to our clients. We are currently seeking a talented and experienced Senior Developer / Lead Data Engineer with expertise in Incorta to join our dynamic team. Job Description: As a Senior Developer / Lead Data Engineer at KPI Partners, you will play a critical role in designing, developing, and implementing data solutions using Incorta. You will work closely with cross-functional teams to understand data requirements, build and optimize data pipelines, and ensure that our data integration processes are efficient and effective. This position requires strong analytical skills, proficiency in Incorta, and a passion for leveraging data to drive business insights. Key Responsibilities: - Design and develop scalable data integration solutions using Incorta. - Collaborate with business stakeholders to gather data requirements and translate them into technical specifications. - Create and optimize data pipelines to ensure high data quality and availability. - Perform data modeling, ETL processes, and data engineering activities to support analytics initiatives. - Troubleshoot and resolve data-related issues across various systems and environments. - Mentor and guide junior developers and data engineers, fostering a culture of learning and collaboration. - Stay updated on industry trends, best practices, and emerging technologies related to data engineering and analytics. - Work with the implementation team to ensure smooth deployment of solutions and provide ongoing support. Qualifications: - Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field. - 5+ years of experience in data engineering or related roles with a strong focus on Incorta. - Expertise in Incorta and its features, along with experience in data modeling and ETL processes. - Proficiency in SQL and experience with relational databases (e.g., MySQL, Oracle, SQL Server). - Strong analytical and problem-solving skills, with the ability to work with complex data sets. - Excellent communication and collaboration skills to work effectively in a team-oriented environment. - Familiarity with cloud platforms (e.g., AWS, Azure) and data visualization tools is a plus. - Experience with programming languages such as Python, Java, or Scala is advantageous. Why Join KPI Partners? - Opportunity to work with a talented and passionate team in a fast-paced environment. - Competitive salary and benefits package. - Continuous learning and professional development opportunities. - A collaborative and inclusive workplace culture that values diversity and innovation. KPI Partners is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. Join us at KPI Partners and help us unlock the power of data for our clients!
Posted 1 week ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Description
Amazon Music is awash in data! To help make sense of it all, the DISCO (Data, Insights, Science & Optimization) team:
(i) enables the Consumer Product Tech org to make data-driven decisions that improve customer retention, engagement, and experience on Amazon Music. We build and maintain automated self-service data solutions and data science models, and deep-dive difficult questions to provide actionable insights. We also enable measurement, personalization, and experimentation by operating key data programs ranging from attribution pipelines and north-star weblab metrics to causal frameworks.
(ii) delivers exceptional analytics and science infrastructure for DISCO teams, fostering a data-driven approach to insights and decision making. As platform builders, we are committed to constructing flexible, reliable, and scalable solutions to empower our customers.
(iii) accelerates and facilitates content analytics and provides independence to generate valuable insights in a fast, agile, and accurate way. This domain provides analytical support for the following topics within Amazon Music: Programming, Label Relations, PR, Stations, Live Sports, Originals, and Case & CAM.

The DISCO team enables repeatable, easy, in-depth analysis of music customer behaviors. We reduce the cost in time and effort of analysis, data set building, model building, and user segmentation. Our goal is to empower all teams at Amazon Music to make data-driven decisions and effectively measure their results by providing high-quality, high-availability data and democratized data access through self-service tools.

If you love the challenges that come with big data, then this role is for you. We collect billions of events a day, manage petabyte-scale data on Redshift and S3, and develop data pipelines using Spark/Scala on EMR, SQL-based ETL, Airflow, and Java services. We are looking for a talented, enthusiastic, and detail-oriented Data Engineer who knows how to take on big data challenges in an agile way. Duties include big data design and analysis, data modeling, and the development, deployment, and operations of big data pipelines. You'll help build Amazon Music's most important data pipelines and data sets, and expand self-service data knowledge and capabilities through an Amazon Music data university.

The DISCO team develops data specifically for a set of key business domains like personalization and marketing, and provides and protects a robust self-service core data experience for all internal customers. We deal in AWS technologies like Redshift, S3, EMR, EC2, DynamoDB, Kinesis Firehose, and Lambda. Your team will manage the data exchange store (Data Lake) and the EMR/Spark processing layer, using Airflow as orchestrator. You'll build our data university and partner with Product, Marketing, BI, and ML teams to build new behavioural events, pipelines, datasets, models, and reporting to support their initiatives. You'll also continue to develop big data pipelines.

Key job responsibilities
A deep understanding of data, analytical techniques, and how to connect insights to the business, with practical experience insisting on the highest standards for operations in ETL and big data pipelines. With our Amazon Music Unlimited and Prime Music services, and our top music provider spot on the Alexa platform, providing high-quality, high-availability data to our internal customers is critical to our customer experience.
Assist the DISCO team with the management of our existing environment, which consists of Redshift and SQL-based pipelines. The activities around these systems are well defined via standard operating procedures (SOPs) and typically involve approving data access requests and subscribing or adding new data to the environment.
SQL data pipeline management (creating or updating existing pipelines).
Perform maintenance tasks on the Redshift cluster.
Assist the team with the management of our next-generation AWS infrastructure. Tasks include infrastructure monitoring via CloudWatch alarms, infrastructure maintenance through code changes or enhancements, and troubleshooting/root-cause analysis of infrastructure issues that arise; in some cases this resource may also be asked to submit code changes based on infrastructure issues that arise.

About The Team
Amazon Music is an immersive audio entertainment service that deepens connections between fans, artists, and creators. From personalized music playlists to exclusive podcasts, concert livestreams to artist merch, we are innovating at some of the most exciting intersections of music and culture. We offer experiences that serve all listeners with our different tiers of service: Prime members get access to all music in shuffle mode, and top ad-free podcasts, included with their membership; customers can upgrade to Music Unlimited for unlimited on-demand access to 100 million songs, including millions in HD, Ultra HD, and spatial audio; and anyone can listen for free by downloading the Amazon Music app or via Alexa-enabled devices. Join us for the opportunity to influence how Amazon Music engages fans, artists, and creators on a global scale.

Basic Qualifications
2+ years of data engineering experience
Experience with data modeling, warehousing, and building ETL pipelines
Experience with SQL
Experience with one or more scripting languages (e.g., Python, KornShell)
Experience in Unix
Experience troubleshooting data and infrastructure issues

Preferred Qualifications
Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
Experience with an ETL tool like Informatica, ODI, SSIS, BODI, Datastage, etc.
Knowledge of distributed systems as they pertain to data storage and computing
Experience building or administering reporting/analytics platforms

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI MAA 15 SEZ
Job ID: A2838395
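To illustrate the Spark/Scala-on-EMR work this posting describes, here is a minimal sketch of a batch metric job: it reads Parquet events from S3 and writes a partitioned daily aggregate. The bucket names, schema, and the 30-second play threshold are all invented for illustration; this is not Amazon's actual pipeline.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, countDistinct}

object ListenerEngagement {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("listener-engagement").getOrCreate()

    // Listening events landed on S3 as Parquet; bucket and schema are hypothetical.
    val events = spark.read.parquet("s3://music-events/plays/")

    // Daily distinct listeners per tier -- the kind of metric a self-service
    // dataset in a pipeline like this might expose.
    val daily = events
      .filter(col("play_ms") > 30000) // count only meaningful plays (assumed rule)
      .groupBy(col("event_date"), col("tier"))
      .agg(countDistinct(col("customer_id")).as("listeners"))

    // Partitioned output, ready for a Redshift Spectrum table or COPY load.
    daily.write.mode("overwrite").partitionBy("event_date")
      .parquet("s3://music-metrics/daily_listeners/")
    spark.stop()
  }
}
```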
Posted 1 week ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Description
The Digital and Emerging Markets Payments team is responsible for launching new payment experiences for digital businesses worldwide and for the retail business in emerging markets. We are a growing team adding new charters relevant to payment-related customer experience for our customers in emerging markets. As part of this growth, we are hiring a Software Development Engineer II to contribute to the implementation of new payment methods and services that support international business regulations. In this position, you will contribute to the success of an international team that manages complex workflows, collaborates with internal and external partners, implements scalable large-scale solutions, uses all the flavors of the JVM (Kotlin, Scala, Java), and leverages NAWS components to delight customers in emerging marketplaces.

Key job responsibilities
Solve complex architecture and business problems. Innovate to solve unique problems in simple yet elegant ways. Keep solutions extensible.
Own the architecture of several components of the consumer payments tech stack.
Continuously work on improving the current limitations and compatibilities between subsystems, and on the development of major routines and utilities.
Design and build features with a strong mindset toward performance.
Prepare technical requirements and software design specifications.
Instill best practices for software development and documentation, make sure designs meet requirements, and deliver high-quality software on tight schedules.
Take ownership of ensuring sanity of architecture, operational excellence, and quality, insisting on the highest standards while working with other software teams.
Own the delivery of an integral piece of a system or application.
Write high-quality code that is modular, functional, and testable; establish the best coding practices.
Communicate, collaborate, and work effectively in a global environment, unafraid to think out of the box.
Assist directly and indirectly in the continual hiring and development of technical talent.

Basic Qualifications
3+ years of non-internship professional software development experience
2+ years of non-internship design or architecture (design patterns, reliability and scaling) experience with new and existing systems
3+ years of Video Games Industry (supporting title Development, Release, or Live Ops) experience
Experience programming with at least one software programming language

Preferred Qualifications
3+ years of full software development life cycle experience, including coding standards, code reviews, source control management, build processes, testing, and operations
Bachelor's degree in computer science or equivalent

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI - Tamil Nadu - A83
Job ID: A2969989
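As an illustration of the "modular, functional and testable" style the responsibilities call for, here is a small Scala sketch: a sealed payment-method ADT and a pure eligibility function that is trivial to unit-test. The domain model is entirely hypothetical and does not reflect Amazon's actual payments stack.

```scala
// Hypothetical domain model: nothing here reflects a real payments system.
sealed trait PaymentMethod
case object Upi extends PaymentMethod
case object Card extends PaymentMethod
case object NetBanking extends PaymentMethod

final case class Marketplace(country: String, upiSupported: Boolean)

object PaymentEligibility {
  // Pure function: no hidden state, easy to unit-test -- the "modular,
  // functional and testable" style the posting asks for.
  def eligibleMethods(m: Marketplace): List[PaymentMethod] = {
    val base: List[PaymentMethod] = List(Card, NetBanking)
    if (m.upiSupported) Upi :: base else base
  }
}

object Demo extends App {
  println(PaymentEligibility.eligibleMethods(Marketplace("IN", upiSupported = true)))
  // => List(Upi, Card, NetBanking)
}
```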
Posted 1 week ago
10.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Cloud Architect
As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions such as Banking, Insurance, Manufacturing and Auto, Healthcare, Retail, Supply Chain, and Finance.

The opportunity
We’re looking for Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in both delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
Proven experience driving Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM, and Insurance. Activities include pipeline building, RFP responses, creating new solutions and offerings, and conducting workshops, as well as managing in-flight projects focused on cloud and big data.
Work with clients to convert business problems/challenges into technical solutions, considering security, performance, scalability, etc. (10-15 years).
Understand current and future state enterprise architecture.
Contribute to various technical streams during project implementation.
Provide product- and design-level technical best practices.
Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop, and deliver technology solutions.
Define and develop client-specific best practices around data management within a Hadoop or cloud environment.
Recommend design alternatives for data ingestion, processing, and provisioning layers.
Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark.
Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies.

Skills And Attributes For Success
Experience architecting highly scalable solutions on Azure, AWS, and GCP.
Strong understanding of and familiarity with Azure/AWS/GCP and big data ecosystem components.
Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms.
Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming.
Hands-on experience with major components like cloud ETLs, Spark, and Databricks.
Experience working with NoSQL in at least one of the data stores: HBase, Cassandra, MongoDB.
Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions.
Solid understanding of ETL methodologies in a multi-tiered stack, integrating with big data systems like Cloudera and Databricks.
Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms.
Good knowledge of Apache Kafka and Apache Flume.
Experience in enterprise-grade solution implementations.
Experience in performance benchmarking enterprise applications.
Experience in data security (in motion and at rest).
Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have
A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
Excellent communication skills (written and verbal, formal and informal).
The ability to multi-task under pressure and work independently with minimal supervision.
A team-player attitude and enjoyment of working in a cooperative and collaborative team environment.
Adaptability to new technologies and standards.
Participation in all aspects of the big data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.
Responsibility for evaluating technical risks and mapping out mitigation strategies.
Working knowledge of at least one cloud platform: AWS, Azure, or GCP.
Excellent business communication, consulting, and quality process skills.
Excellence in leading solution architecture, design, build, and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domain.
Minimum 7 years' hands-on experience in one or more of the above areas.
Minimum 10 years' industry experience.

Ideally, you’ll also have
Strong project management skills
Client management skills
Solutioning skills

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
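For the Kafka plus Spark Streaming ingestion this role names, here is a minimal Spark Structured Streaming sketch in Scala: it subscribes to a Kafka topic and lands the raw messages in a data lake path with checkpointing. Broker addresses, the topic name, and the paths are placeholders.

```scala
import org.apache.spark.sql.SparkSession

object KafkaIngest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("kafka-ingest").getOrCreate()

    // Subscribe to a Kafka topic; broker and topic names are placeholders.
    val stream = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1:9092,broker2:9092")
      .option("subscribe", "transactions")
      .option("startingOffsets", "latest")
      .load()

    // Kafka rows expose key/value as binary; cast to strings for downstream parsing.
    val messages = stream.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")

    // Land raw messages in the data lake; checkpointing enables fault-tolerant restarts.
    val query = messages.writeStream
      .format("parquet")
      .option("path", "/lake/raw/transactions")
      .option("checkpointLocation", "/lake/checkpoints/transactions")
      .start()

    query.awaitTermination()
  }
}
```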
Posted 1 week ago
4.0 - 9.0 years
15 - 30 Lacs
Bangalore/Bengaluru, Mumbai (All Areas)
Hybrid
Job Title: Java Backend & Java Full Stack Developer
Location: Mumbai, Bangalore
Experience Range: 3 to 15 Years
Notice Period: Immediate to 30 Days

About the Project:
We are looking for an experienced, innovative, and highly motivated Web Developer to help design and develop the next generation of Technology & Operations Management Systems applications. Our platform supports the Technology, Operations, and Finance divisions, enabling them to operate efficiently and manage over $4 billion in annual technology spend. Success in this position requires a solid foundation across the complete spectrum of web development and best practices, coupled with strong interpersonal skills.

Key Responsibilities:
Contribute to large-scale strategic planning and development.
Design, code, test, debug, and document projects related to the technology domain, including upgrades and deployments.
Review and resolve moderately complex technical challenges requiring an in-depth evaluation of technologies and procedures.
Lead a team to meet existing and potential client needs, leveraging a solid understanding of function, policies, procedures, and compliance requirements.
Collaborate with peers, colleagues, and mid-level managers to resolve technical challenges and achieve business goals.
Provide guidance and direction to less experienced staff, acting as an escalation point.
Design robust and scalable solutions to support enterprise applications.
Develop APIs using REST, MQ, Kafka, and other standard channels.

Must-Have Skills:
Strong experience with the Java and J2EE platforms and frameworks: Java 8, Spring Boot, Spring Framework, REST, Web Services, Tomcat, JBoss.
Experience in Java 8 or Java 11.
For the Full Stack role: 2+ years of experience with ReactJS/Angular 8+.
Experience with databases, NoSQL or relational (e.g., MongoDB, PostgreSQL, DB2, Sybase, MySQL).
In-depth knowledge of design patterns.
Strong understanding of Agile methodologies and Test-Driven Development (TDD), with a track record of high-quality deliverables.
Solid understanding of testing technologies, both manual and automated.

Good-to-Have Skills:
Familiarity with Unix scripting, performance monitoring, and load-testing tools.
Knowledge of Kafka, Hadoop, and Scala.
Experience with frontend technologies like Angular 8 and above.
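To illustrate the Kafka-based API channel this posting mentions, here is a minimal producer sketch using the official Kafka Java client (written in Scala for consistency with the other sketches on this page; the client API is identical from Java). The broker address, topic, and payload are placeholders.

```scala
import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

object OrderEventPublisher {
  def main(args: Array[String]): Unit = {
    val props = new Properties()
    props.put("bootstrap.servers", "localhost:9092") // placeholder broker
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")

    val producer = new KafkaProducer[String, String](props)
    try {
      // Publish a JSON payload keyed by order id; topic and schema are made up.
      val record = new ProducerRecord[String, String](
        "order-events", "order-123", """{"orderId":"order-123","status":"CREATED"}""")
      producer.send(record).get() // block for the broker's ack in this simple sketch
    } finally {
      producer.close()
    }
  }
}
```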
Posted 1 week ago
10.0 years
0 Lacs
India
Remote
Staff Software Engineer - QE
Location - India, Remote

Sumo Logic is a cloud-native SaaS data analytics platform that solves complex observability and security problems. Customers choose our product because it allows them to easily monitor, optimize, and secure their applications, systems, and infrastructure. Our microservices architecture hosted on AWS ingests petabytes of data daily across many geographic regions. Millions of queries a day analyze hundreds of petabytes of data.

What can you expect to do?
You will own creating and executing test plans and developing test strategies for critical system components, and be responsible for analyzing test coverage, creating test cases, and coordinating review and feedback from the cross-functional team. You will help bridge the gap between development and quality assurance. You will address complex problems with innovative solutions, iterate on designs, and mentor team members to promote technical growth and excellence. Our system is a highly distributed, fault-tolerant, multi-tenant platform that includes bleeding-edge components related to storage, messaging, search, and analytics. This system ingests and analyzes terabytes of data a day while making petabytes of data available for search and forensic analysis, and is expected to reach a substantially larger scale in the near future.

Role And Responsibilities
Collaborate with cross-functional teams to understand project requirements and specifications.
Develop and execute test cases, scripts, plans, and procedures (manual and automated) to ensure the highest quality software delivery.
Report project status, defect reports and verification, and issue escalations in a timely manner.
Participate in design and specification reviews, providing valuable input from a testing perspective.
Improve design specifications and write elegant code that meets the Sumo Logic standard. Solve complex problems by iterating, redesigning, and innovating systems.
Guide and mentor junior team members, sharing knowledge and fostering technical growth within the team.
Estimate and perform risk analysis on large features during sprint planning meetings.
Continuously improve testing processes by staying updated on industry best practices and new technologies. Promote the adoption of innovative tools and techniques within the team.
Communicate effectively with development and product teams to resolve issues and ensure timely delivery of high-quality software.

Requirements
Hold a Bachelor's or Master's degree in Computer Science or a related field.
Possess 10+ years of testing experience.
Demonstrate a robust grasp of the software development life cycle and testing methodologies.
Have hands-on experience with enterprise-grade SaaS products.
Strong problem-solving skills and a proven track record of solving complex technical challenges.
Familiarity with Continuous Integration or Continuous Deployment is a valuable addition.
Exhibit proficiency in object-oriented languages such as Java, Python, Scala, or Go.
Work effectively with both Unix and Windows operating systems.
Approach testing with a proactive "break it" mentality.
Familiarity with popular testing tools like TestRail, Jira, Postman, JMeter, Selenium, etc.
Display enthusiasm for staying updated on cutting-edge technologies, solving complex problems, and embracing challenges.
Possess the ability to comprehend the Sumo Logic backend architecture and communicate with clarity and precision, both verbally and in writing.

Desirable
Hands-on experience in testing large-scale systems; desirable experience includes working with big data and/or 24x7 commercial services.
Proficiency and comfort working with Unix, including Linux and OS X.
A plus if you bring experience in Agile software development, including test-driven development and iterative and incremental development methodologies.

About Us
Sumo Logic, Inc. empowers the people who power modern, digital business. Sumo Logic enables customers to deliver reliable and secure cloud-native applications through its Sumo Logic SaaS Analytics Log Platform, which helps practitioners and developers ensure application reliability, secure and protect against modern security threats, and gain insights into their cloud infrastructures. Customers worldwide rely on Sumo Logic to get powerful real-time analytics and insights across observability and security solutions for their cloud-native applications. For more information, visit www.sumologic.com.

Sumo Logic Privacy Policy
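As a small example of the automated-testing proficiency this role expects, here is a minimal ScalaTest sketch (Scala being one of the languages the posting lists). The unit under test, a tiny log-line sanitizer, is invented for illustration.

```scala
import org.scalatest.funsuite.AnyFunSuite

// Hypothetical unit under test: a tiny log-line sanitizer.
object Sanitizer {
  def stripControlChars(s: String): String = s.filterNot(_.isControl)
}

class SanitizerSpec extends AnyFunSuite {
  test("control characters are removed") {
    assert(Sanitizer.stripControlChars("log\u0000line\n") == "logline")
  }

  test("clean input passes through unchanged") {
    assert(Sanitizer.stripControlChars("status=200") == "status=200")
  }
}
```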
Posted 1 week ago
5.0 years
0 Lacs
India
On-site
Currently we have an open position with our client, an IT consulting firm: Principal Databricks Engineer/Architect.

Key Responsibilities:
1. Databricks Solution Architecture: Design and implement scalable, secure, and efficient Databricks solutions that meet client requirements.
2. Data Engineering: Develop data pipelines, architect data lakes, and implement data warehousing solutions using Databricks.
3. Data Analytics: Collaborate with data scientists and analysts to develop and deploy machine learning models and analytics solutions on Databricks.
4. Performance Optimization: Optimize Databricks cluster performance, ensuring efficient resource utilization and cost-effectiveness.
5. Security and Governance: Implement Databricks security features, ensure data governance, and maintain compliance with industry regulations.
6. Client Engagement: Work closely with clients to understand their business requirements, provide technical guidance, and deliver high-quality Databricks solutions.
7. Thought Leadership: Stay up to date with the latest Databricks features, best practices, and industry trends, and share knowledge with the team.

Requirements:
1. Databricks Experience: 5+ years of experience working with Databricks, including platform architecture, data engineering, and data analytics.
2. Technical Skills: Proficiency in languages such as Python, Scala, or Java, and experience with Databricks APIs, Spark, and Delta Lake.
3. Data Engineering: Strong background in data engineering, including data warehousing, ETL, and data governance.
4. Leadership: Proven experience leading technical teams, mentoring junior engineers, and driving technical initiatives.
5. Communication: Excellent communication and interpersonal skills, with the ability to work effectively with clients and internal stakeholders.

Good to Have:
1. Certifications: Databricks Certified Professional or similar certifications.
2. Cloud Experience: Experience working with cloud platforms such as AWS, Azure, or GCP.
3. Machine Learning: Knowledge of machine learning concepts and experience with popular ML libraries.
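To illustrate the Databricks/Delta Lake pipeline work described above, here is a minimal Scala sketch using the open-source Delta Lake format (it assumes the delta-spark library is on the classpath, as it is on Databricks clusters). The paths and schema are placeholders.

```scala
import org.apache.spark.sql.SparkSession

object DeltaPipeline {
  def main(args: Array[String]): Unit = {
    // On Databricks, `spark` already exists; the builder is shown for completeness.
    val spark = SparkSession.builder().appName("delta-pipeline").getOrCreate()

    // Ingest raw CSV drops into a Delta table; paths and schema are placeholders.
    val raw = spark.read.option("header", "true").csv("/mnt/raw/customers/")
    raw.write.format("delta").mode("append").save("/mnt/bronze/customers")

    // Delta gives ACID reads over the same path for the next pipeline stage.
    val bronze = spark.read.format("delta").load("/mnt/bronze/customers")
    bronze.createOrReplaceTempView("customers_bronze")
    spark.sql("SELECT country, COUNT(*) FROM customers_bronze GROUP BY country").show()
  }
}
```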
Posted 1 week ago
10.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Cloud Architect
As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions such as Banking, Insurance, Manufacturing and Auto, Healthcare, Retail, Supply Chain, and Finance.

The opportunity
We’re looking for Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in both delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
Proven experience driving Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM, and Insurance. Activities include pipeline building, RFP responses, creating new solutions and offerings, and conducting workshops, as well as managing in-flight projects focused on cloud and big data.
Work with clients to convert business problems/challenges into technical solutions, considering security, performance, scalability, etc. (10-15 years).
Understand current and future state enterprise architecture.
Contribute to various technical streams during project implementation.
Provide product- and design-level technical best practices.
Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop, and deliver technology solutions.
Define and develop client-specific best practices around data management within a Hadoop or cloud environment.
Recommend design alternatives for data ingestion, processing, and provisioning layers.
Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark.
Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies.

Skills And Attributes For Success
Experience architecting highly scalable solutions on Azure, AWS, and GCP.
Strong understanding of and familiarity with Azure/AWS/GCP and big data ecosystem components.
Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms.
Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming.
Hands-on experience with major components like cloud ETLs, Spark, and Databricks.
Experience working with NoSQL in at least one of the data stores: HBase, Cassandra, MongoDB.
Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions.
Solid understanding of ETL methodologies in a multi-tiered stack, integrating with big data systems like Cloudera and Databricks.
Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms.
Good knowledge of Apache Kafka and Apache Flume.
Experience in enterprise-grade solution implementations.
Experience in performance benchmarking enterprise applications.
Experience in data security (in motion and at rest).
Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have
A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
Excellent communication skills (written and verbal, formal and informal).
The ability to multi-task under pressure and work independently with minimal supervision.
A team-player attitude and enjoyment of working in a cooperative and collaborative team environment.
Adaptability to new technologies and standards.
Participation in all aspects of the big data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.
Responsibility for evaluating technical risks and mapping out mitigation strategies.
Working knowledge of at least one cloud platform: AWS, Azure, or GCP.
Excellent business communication, consulting, and quality process skills.
Excellence in leading solution architecture, design, build, and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domain.
Minimum 7 years' hands-on experience in one or more of the above areas.
Minimum 10 years' industry experience.

Ideally, you’ll also have
Strong project management skills
Client management skills
Solutioning skills

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
10.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Cloud Architect
As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions such as Banking, Insurance, Manufacturing and Auto, Healthcare, Retail, Supply Chain, and Finance.

The opportunity
We’re looking for Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in both delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
Proven experience driving Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM, and Insurance. Activities include pipeline building, RFP responses, creating new solutions and offerings, and conducting workshops, as well as managing in-flight projects focused on cloud and big data.
Work with clients to convert business problems/challenges into technical solutions, considering security, performance, scalability, etc. (10-15 years).
Understand current and future state enterprise architecture.
Contribute to various technical streams during project implementation.
Provide product- and design-level technical best practices.
Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop, and deliver technology solutions.
Define and develop client-specific best practices around data management within a Hadoop or cloud environment.
Recommend design alternatives for data ingestion, processing, and provisioning layers.
Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark.
Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies.

Skills And Attributes For Success
Experience architecting highly scalable solutions on Azure, AWS, and GCP.
Strong understanding of and familiarity with Azure/AWS/GCP and big data ecosystem components.
Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms.
Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming.
Hands-on experience with major components like cloud ETLs, Spark, and Databricks.
Experience working with NoSQL in at least one of the data stores: HBase, Cassandra, MongoDB.
Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions.
Solid understanding of ETL methodologies in a multi-tiered stack, integrating with big data systems like Cloudera and Databricks.
Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms.
Good knowledge of Apache Kafka and Apache Flume.
Experience in enterprise-grade solution implementations.
Experience in performance benchmarking enterprise applications.
Experience in data security (in motion and at rest).
Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have
A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
Excellent communication skills (written and verbal, formal and informal).
The ability to multi-task under pressure and work independently with minimal supervision.
A team-player attitude and enjoyment of working in a cooperative and collaborative team environment.
Adaptability to new technologies and standards.
Participation in all aspects of the big data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.
Responsibility for evaluating technical risks and mapping out mitigation strategies.
Working knowledge of at least one cloud platform: AWS, Azure, or GCP.
Excellent business communication, consulting, and quality process skills.
Excellence in leading solution architecture, design, build, and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domain.
Minimum 7 years' hands-on experience in one or more of the above areas.
Minimum 10 years' industry experience.

Ideally, you’ll also have
Strong project management skills
Client management skills
Solutioning skills

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
6.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Gracenote is the content business unit of Nielsen that powers the world of media entertainment. Our metadata solutions help media and entertainment companies around the world deliver personalized content search and discovery, connecting audiences with the content they love. We’re at the intersection of people and media entertainment. With our cutting-edge technology and solutions, we help audiences easily find TV shows, movies, music and sports across multiple platforms. As the world leader in entertainment data and services, we power the world’s top streaming platforms, cable and satellite TV providers, media companies, consumer electronics manufacturers, music services and automakers to navigate and succeed in the competitive streaming world. Our metadata entertainment solutions have a global footprint of 80+ countries, 100K+ channels and catalogs, 70+ sports and 100M+ music tracks, all across 35 languages.

Job Purpose
As a senior DBA, your role is to own the databases in our data pipeline and the data governance of our Data Strategy. Our Data Strategy underpins our suite of client-facing applications, data science activities, operational tools and business analytics.

Responsibilities
Architect and build scalable, resilient and cost-effective data storage solutions to support complex data pipelines. The architecture has two facets: storage and compute. The DBA is responsible for designing and maintaining the different tiers of the data storage, including (but not limited to) archival, long-term persistent storage, and transactional and reporting storage.
Design, implement and maintain various data pipelines such as self-service ingestion tools, exports to application-specific warehouses, and indexing activities.
Own data modeling, as well as designing, implementing and maintaining various data catalogs, to support data transformation and product requirements.
Configure and deploy databases on AWS cloud, ensuring optimal performance and scalability.
Monitor database activities for compliance and security purposes.
Set up and manage backup and recovery strategies for cloud databases, ensuring availability and quality.
Monitor database performance metrics and identify areas for optimization.
Create scripts for database configuration and provisioning.
Collaborate with Data Science to understand, translate, and integrate methodologies into engineering build pipelines.
Partner with product owners to translate complex business requirements into technical solutions, imparting design and architecture guidance.
Provide expert mentorship to project teams on technology strategy, cultivating advanced skill sets in software engineering and the modern SDLC.
Stay informed about the latest technologies and methodologies by participating in industry forums, maintaining an active peer network, and engaging actively with customers.
Cultivate a team environment focused on continuous learning, where innovative technologies are developed and refined through teamwork.

Must-have skills:
Experience with languages such as ANSI SQL, T-SQL, PL/pgSQL and PL/SQL, plus database design, normalization, server tuning, and query plan optimization.
6+ years of professional DBA experience with large datastores, including HA and DR planning and support.
Software engineering experience with programming languages such as Java, Scala, and Python.
Demonstrated understanding of and experience with big data tools such as Kafka, Spark and Trino/Presto.
Experience with orchestration tools such as Airflow.
Comfortable using Docker and Kubernetes for container management.
DevOps experience deploying and tuning the applications you’ve built.
Monitoring tools such as Datadog, Prometheus, Grafana, and CloudWatch.

Good to have:
Software engineering experience with Unix shell.
Understanding of file systems.
Experience configuring database replication (physical and/or logical).
ETL experience (third-party and proprietary).
A personal technical blog.
A personal (Git) repository of side projects.
Participation in an open-source community.

Qualifications
B.E / B.Tech / BCA / MCA in Computer Science, Engineering or a related subject.
Strong Computer Science fundamentals.
Comfortable with version control systems such as git.
A thirst for learning new tech and keeping up with industry advances.
Excellent communication and knowledge-sharing skills.
Comfortable working with technical and non-technical teams.
Strong debugging skills.
Comfortable providing and receiving code review feedback.
A positive attitude, adaptability, enthusiasm, and a growth mindset.

About Nielsen:
By connecting clients to audiences, we fuel the media industry with the most accurate understanding of what people listen to and watch. To discover what audiences love, we measure across all channels and platforms, from podcasts to streaming TV to social media. And when companies and advertisers are truly connected to their audiences, they can see the most important opportunities and accelerate growth. Do you want to move the industry forward with Nielsen? Our people are the driving force. Your thoughts, ideas, and expertise can propel us forward. Whether you have fresh thinking around maximizing a new technology or you see a gap in the market, we are here to listen and act. Our team is made strong by a diversity of thoughts, experiences, skills, and backgrounds. You’ll enjoy working with smart, fun, curious colleagues who are passionate about their work. Come be part of a team that motivates you to do your best work!
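As a small illustration of the query-plan optimization skill this posting lists, here is a sketch that runs PostgreSQL's EXPLAIN ANALYZE over plain JDBC from Scala. The connection string, table, and query are hypothetical.

```scala
import java.sql.DriverManager

// Minimal query-plan inspection over JDBC; connection details are placeholders.
object ExplainQuery {
  def main(args: Array[String]): Unit = {
    val conn = DriverManager.getConnection(
      "jdbc:postgresql://db-host:5432/catalog", // hypothetical host/database
      sys.env("DB_USER"), sys.env("DB_PASSWORD"))
    try {
      val stmt = conn.createStatement()
      // EXPLAIN ANALYZE executes the query and reports actual row counts and timings.
      val rs = stmt.executeQuery(
        "EXPLAIN ANALYZE SELECT title FROM programs WHERE channel_id = 42")
      while (rs.next()) println(rs.getString(1)) // print each plan line
    } finally {
      conn.close()
    }
  }
}
```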
Posted 1 week ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Description
The Digital and Emerging Markets Payments team is responsible for launching new payment experiences for digital businesses worldwide and for the retail business in emerging markets. We are a growing team adding new charters relevant to payment-related customer experience for our customers in emerging markets. As part of this growth, we are hiring a Software Development Engineer II to contribute to the implementation of new payment methods and services that support international business regulations. In this position, you will contribute to the success of an international team that manages complex workflows, collaborates with internal and external partners, implements scalable large-scale solutions, uses all the flavors of the JVM (Kotlin, Scala, Java), and leverages NAWS components to delight customers in emerging marketplaces.

Key job responsibilities
Solve complex architecture and business problems. Innovate to solve unique problems in simple yet elegant ways. Keep solutions extensible.
Own the architecture of several components of the consumer payments tech stack.
Continuously work on improving the current limitations and compatibilities between subsystems, and on the development of major routines and utilities.
Design and build features with a strong mindset toward performance.
Prepare technical requirements and software design specifications.
Instill best practices for software development and documentation, make sure designs meet requirements, and deliver high-quality software on tight schedules.
Take ownership of ensuring sanity of architecture, operational excellence, and quality, insisting on the highest standards while working with other software teams.
Own the delivery of an integral piece of a system or application.
Write high-quality code that is modular, functional, and testable; establish the best coding practices.
Communicate, collaborate, and work effectively in a global environment, unafraid to think out of the box.
Assist directly and indirectly in the continual hiring and development of technical talent.

Basic Qualifications
3+ years of non-internship professional software development experience
2+ years of non-internship design or architecture (design patterns, reliability and scaling) experience with new and existing systems
3+ years of Video Games Industry (supporting title Development, Release, or Live Ops) experience
Experience programming with at least one software programming language

Preferred Qualifications
3+ years of full software development life cycle experience, including coding standards, code reviews, source control management, build processes, testing, and operations
Bachelor's degree in computer science or equivalent

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI - Tamil Nadu - A83
Job ID: A2969696
Posted 1 week ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Description
The Digital and Emerging Markets Payments team is responsible for launching new payment experiences for digital businesses worldwide and for the retail business in emerging markets. We are a growing team adding new charters relevant to payment-related customer experience for our customers in emerging markets. As part of this growth, we are hiring a Software Development Engineer II to contribute to the implementation of new payment methods and services that support international business regulations. In this position, you will contribute to the success of an international team that manages complex workflows, collaborates with internal and external partners, implements scalable large-scale solutions, uses all the flavors of the JVM (Kotlin, Scala, Java), and leverages NAWS components to delight customers in emerging marketplaces.

Key job responsibilities
Solve complex architecture and business problems. Innovate to solve unique problems in simple yet elegant ways. Keep solutions extensible.
Own the architecture of several components of the consumer payments tech stack.
Continuously work on improving the current limitations and compatibilities between subsystems, and on the development of major routines and utilities.
Design and build features with a strong mindset toward performance.
Prepare technical requirements and software design specifications.
Instill best practices for software development and documentation, make sure designs meet requirements, and deliver high-quality software on tight schedules.
Take ownership of ensuring sanity of architecture, operational excellence, and quality, insisting on the highest standards while working with other software teams.
Own the delivery of an integral piece of a system or application.
Write high-quality code that is modular, functional, and testable; establish the best coding practices.
Communicate, collaborate, and work effectively in a global environment, unafraid to think out of the box.
Assist directly and indirectly in the continual hiring and development of technical talent.

Basic Qualifications
3+ years of non-internship professional software development experience
2+ years of non-internship design or architecture (design patterns, reliability and scaling) experience with new and existing systems
3+ years of Video Games Industry (supporting title Development, Release, or Live Ops) experience
Experience programming with at least one software programming language

Preferred Qualifications
3+ years of full software development life cycle experience, including coding standards, code reviews, source control management, build processes, testing, and operations
Bachelor's degree in computer science or equivalent

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI - Tamil Nadu - A83
Job ID: A2969977
Posted 1 week ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Description

The Digital and Emerging Markets Payments team is responsible for launching new payment experiences for digital businesses worldwide and for the retail business in emerging markets. We are a growing team adding new charters relevant to payment-related customer experience for our customers in emerging markets. As part of this growth, we are hiring a Software Development Engineer II to contribute to the implementation of new payment methods and services to support international business regulations. In this position, you will contribute to the success of an international team that manages complex workflows, collaborates with internal and external partners, implements scalable, large-scale solutions, uses all the flavors of the JVM (Kotlin, Scala, Java), and leverages NAWS components to delight customers in Emerging Marketplaces.

Key job responsibilities
- Solve complex architecture and business problems, innovating to solve unique problems in simple yet elegant ways and keeping solutions extensible.
- Own the architecture of several components of the consumer payments tech stack.
- Continuously work on improving the current limitations and compatibilities between subsystems, and on the development of major routines and utilities.
- Design and build features with a strong mindset toward performance.
- Prepare technical requirements and software design specifications.
- Instill best practices for software development and documentation, make sure designs meet requirements, and deliver high-quality software on tight schedules.
- Take ownership of the sanity of the architecture, operational excellence, and quality, insisting on the highest standards while working with other software teams.
- Own the delivery of an integral piece of a system or application.
- Write high-quality code that is modular, functional, and testable; establish best coding practices.
- Communicate, collaborate, and work effectively in a global environment, unafraid to think out of the box.
- Assist directly and indirectly in the continual hiring and development of technical talent.

Basic Qualifications
- 3+ years of non-internship professional software development experience
- 2+ years of non-internship design or architecture (design patterns, reliability, and scaling) experience with new and existing systems
- 3+ years of Video Games Industry (supporting title Development, Release, or Live Ops) experience
- Experience programming with at least one software programming language

Preferred Qualifications
- 3+ years of full software development life cycle experience, including coding standards, code reviews, source control management, build processes, testing, and operations
- Bachelor's degree in computer science or equivalent

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ADCI - Tamil Nadu - A83
Job ID: A2969990
Posted 1 week ago
1.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description

Are you passionate about data? Does the prospect of dealing with massive volumes of data excite you? Do you want to build data engineering solutions that process billions of records a day in a scalable fashion using AWS technologies? Do you want to create the next generation of tools for intuitive data access? If so, Amazon Finance Technology (FinTech) is for you!

FinTech is seeking a Data Engineer to join the team that is shaping the future of the finance data platform. The team is committed to building the next-generation big data platform, one of the world's largest finance data warehouses, to support Amazon's rapidly growing and dynamic businesses, and to use it to deliver BI applications that have an immediate influence on day-to-day decision-making. Amazon has a culture of data-driven decision-making and demands data that is timely, accurate, and actionable. Our platform serves Amazon's finance, tax, and accounting functions across the globe.

As a Data Engineer, you should be an expert in data warehousing technical components (e.g. data modeling, ETL, and reporting), infrastructure (e.g. hardware and software), and their integration. You should have a deep understanding of the architecture of enterprise-level data warehouse solutions using multiple platforms (RDBMS, columnar, cloud). You should be an expert in the design, creation, management, and business use of large datasets. You should have excellent business and communication skills, so you can work with business owners to develop and define key business questions and build data sets that answer those questions. You are expected to build efficient, flexible, extensible, and scalable ETL and reporting solutions. You should be enthusiastic about learning new technologies and able to implement solutions using them to provide new functionality to users or to scale the existing platform. Excellent written and verbal communication skills are required, as you will work very closely with diverse teams. Strong analytical skills are a plus. Above all, you should be passionate about working with huge data sets and love bringing datasets together to answer business questions and drive change.

Our ideal candidate thrives in a fast-paced environment, relishes working with large transactional volumes and big data, enjoys the challenge of highly complex business contexts (that are typically being defined in real time), and, above all, is passionate about data and analytics. In this role you will be part of a team of engineers creating the world's largest financial data warehouses and BI tools for Amazon's expanding global footprint.

Key job responsibilities
- Design, implement, and support a platform providing secured access to large datasets.
- Interface with tax, finance, and accounting customers, gathering requirements and delivering complete BI solutions.
- Model data and metadata to support ad-hoc and pre-built reporting.
- Own the design, development, and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions.
- Recognize and adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation.
- Tune application and query performance using profiling tools and SQL.
- Analyze and solve problems at their root, stepping back to understand the broader context.
- Learn and understand a broad range of Amazon's data resources and know when, how, and which to use (and which not to).
- Keep up to date with advances in big data technologies and run pilots to design a data architecture that scales with increased data volume using AWS.
- Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for datasets.
- Triage many possible courses of action in a high-ambiguity environment, making use of both quantitative analysis and business judgment.

Basic Qualifications
- Experience with SQL
- 1+ years of data engineering experience
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)

Preferred Qualifications
- Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
- Experience with any ETL tool, such as Informatica, ODI, SSIS, BODI, or Datastage

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ADCI - Karnataka
Job ID: A2968106
Posted 1 week ago
4.0 - 6.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
At Nielsen, we are passionate about our work to power a better media future for all people by providing powerful insights that drive client decisions and deliver extraordinary results. Our talented, global workforce is dedicated to capturing audience engagement with content, wherever and whenever it's consumed. Together, we are proudly rooted in our deep legacy as we stand at the forefront of the media revolution. When you join Nielsen, you will join a dynamic team committed to excellence, perseverance, and the ambition to make an impact together. We champion you, because when you succeed, we do too. We enable your best to power our future.

About the role:
- The team this role supports is responsible for the critical function of managing lineups and metadata across various media channels, such as cable, broadcast, and video on demand, encompassing a wide scope of data from both local and national providers.
- This role requires flexibility to provide technical support across different time zones, covering both IST and US business hours on a rotational basis. The Support Engineer will serve as the primary point of contact for customer and stakeholder inquiries, responsible for troubleshooting issues, following Standard Operating Procedures (SOPs), and escalating to the development team when necessary.
- This role requires close collaboration with cross-functional teams to ensure timely and effective issue resolution, driving operational stability and enhancing customer satisfaction.
- In this role, you will debug and attempt to resolve issues independently using SOPs. If unable to resolve an issue, you will escalate it to the next level of support, involving the development team as needed. Your goal will be to ensure efficient handling of support requests and to continuously improve SOPs for recurring issues.

Responsibilities:
- Serve as the first point of contact for customer or stakeholder issues, providing prompt support during US/IST business hours on a rotational basis. Execute SOPs to troubleshoot and resolve recurring issues, ensuring adherence to documented procedures.
- Provide technical support and troubleshooting for cloud-based infrastructure and services, including compute, storage, networking, and security components.
- Collaborate with application, security, and other internal teams to resolve complex issues related to cloud-based services and infrastructure.
- Escalate unresolved issues to the development team and provide clear documentation of the troubleshooting steps taken. Document and maintain up-to-date SOPs, troubleshooting guides, and technical support documentation.
- Collaborate with cross-functional teams to ensure issues are tracked, escalated, and resolved efficiently.
- Proactively identify and suggest process improvements to enhance support quality and response times.

Key Skills:
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field
- Experience range: 4 to 6 years

Must-have skills:
- Proficiency in the Java programming language
- Excellent SQL skills for querying and analyzing data from various database systems
- Good understanding of database concepts and technologies
- Good problem-solving skills and the ability to work independently
- Good proficiency in the AWS cloud platform and its core services
- Good written and verbal communication skills with a strong emphasis on technical documentation
- Ability to follow and create detailed SOPs for various support tasks

Good-to-have skills:
- Knowledge of Scala/Python for scripting and automation
- Familiarity with big data technologies such as Spark and Hive

Please be aware that job-seekers may be at risk of targeting by scammers seeking personal data or money. Nielsen recruiters will only contact you through official job boards, LinkedIn, or email with a nielsen.com domain. Be cautious of any outreach claiming to be from Nielsen via other messaging platforms or personal email addresses. Always verify that email communications come from an @nielsen.com address. If you're unsure about the authenticity of a job offer or communication, please contact Nielsen directly through our official website or verified social media channels.

Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, protected veteran status, or other characteristics protected by law.
Posted 1 week ago
4.0 - 6.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
At Nielsen, we are passionate about our work to power a better media future for all people by providing powerful insights that drive client decisions and deliver extraordinary results. Our talented, global workforce is dedicated to capturing audience engagement with content, wherever and whenever it's consumed. Together, we are proudly rooted in our deep legacy as we stand at the forefront of the media revolution. When you join Nielsen, you will join a dynamic team committed to excellence, perseverance, and the ambition to make an impact together. We champion you, because when you succeed, we do too. We enable your best to power our future.

About the role:
- The team this role supports is responsible for the critical function of managing lineups and metadata across various media channels, such as cable, broadcast, and video on demand, encompassing a wide scope of data from both local and national providers.
- This role requires flexibility to provide technical support across different time zones, covering both IST and US business hours on a rotational basis. The Support Engineer will serve as the primary point of contact for customer and stakeholder inquiries, responsible for troubleshooting issues, following Standard Operating Procedures (SOPs), and escalating to the development team when necessary.
- This role requires close collaboration with cross-functional teams to ensure timely and effective issue resolution, driving operational stability and enhancing customer satisfaction.
- In this role, you will debug and attempt to resolve issues independently using SOPs. If unable to resolve an issue, you will escalate it to the next level of support, involving the development team as needed. Your goal will be to ensure efficient handling of support requests and to continuously improve SOPs for recurring issues.

Responsibilities:
- Serve as the first point of contact for customer or stakeholder issues, providing prompt support during US/IST business hours on a rotational basis. Execute SOPs to troubleshoot and resolve recurring issues, ensuring adherence to documented procedures.
- Provide technical support and troubleshooting for cloud-based infrastructure and services, including compute, storage, networking, and security components.
- Collaborate with application, security, and other internal teams to resolve complex issues related to cloud-based services and infrastructure.
- Escalate unresolved issues to the development team and provide clear documentation of the troubleshooting steps taken. Document and maintain up-to-date SOPs, troubleshooting guides, and technical support documentation.
- Collaborate with cross-functional teams to ensure issues are tracked, escalated, and resolved efficiently.
- Proactively identify and suggest process improvements to enhance support quality and response times.

Key Skills:
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field
- Experience range: 4 to 6 years

Must-have skills:
- Proficiency in the Java programming language
- Excellent SQL skills for querying and analyzing data from various database systems
- Good understanding of database concepts and technologies
- Good problem-solving skills and the ability to work independently
- Good proficiency in the AWS cloud platform and its core services
- Good written and verbal communication skills with a strong emphasis on technical documentation
- Ability to follow and create detailed SOPs for various support tasks

Good-to-have skills:
- Knowledge of Scala/Python for scripting and automation
- Familiarity with big data technologies such as Spark and Hive

Please be aware that job-seekers may be at risk of targeting by scammers seeking personal data or money. Nielsen recruiters will only contact you through official job boards, LinkedIn, or email with a nielsen.com domain. Be cautious of any outreach claiming to be from Nielsen via other messaging platforms or personal email addresses. Always verify that email communications come from an @nielsen.com address. If you're unsure about the authenticity of a job offer or communication, please contact Nielsen directly through our official website or verified social media channels.

Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, protected veteran status, or other characteristics protected by law.
Posted 1 week ago
1.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description

Do you have the technical skill to build BI solutions that process billions of rows a day using AWS technologies? Do you want to create next-generation tools for intuitive data access? Do you wake up in the middle of the night with new ideas that will benefit your customers? Are you persistent in bringing your ideas to fruition?

First things first: you know SQL and data modelling like the back of your hand. You also need to know big data and MPP systems. You have a history of coming up with innovative solutions to complex technical problems. You are a quick and willing learner of new technologies and have examples to prove your aptitude. You are not tool-centric; you determine what technology works best for the problem at hand and apply it accordingly. You can explain complex concepts to your non-technical customers in simple terms.

Key job responsibilities
- Work with SDE teams and business stakeholders to understand data requirements and design the data ingress flow for the team.
- Lead the design, modeling, and implementation of large, evolving, structured, semi-structured, and unstructured datasets.
- Evaluate and implement efficient distributed storage and query techniques.
- Interact and integrate with internal and external teams and systems to extract, transform, and load data from a wide variety of sources.
- Implement robust and maintainable code with clear and maintained documentation.
- Implement test automation on code through unit testing and integration testing.
- Work in a tech stack that is a mix of NAWS services and legacy ETL tools within Amazon.

About The Team
The Data Insights, Metrics & Reporting (DIMR) team is the central data engineering team in the Amazon Warehousing & Distribution org, mainly responsible for four things:
- Building and maintaining data engineering and reporting infrastructure using NAWS to support internal and external data use-cases.
- Building data ingestion pipelines from any kind of upstream data source, including (but not limited to) real-time event streaming services, data lakes, and manual file uploads.
- Building mechanisms to vend data to internal team members or external sellers with the right data-handling techniques in place.
- Building a robust data mart to support diverse use-cases powered by GenAI tools.

Basic Qualifications
- 1+ years of data engineering experience
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)

Preferred Qualifications
- Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
- Experience with any ETL tool, such as Informatica, ODI, SSIS, BODI, or Datastage

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ADCI - Karnataka
Job ID: A2970459
Posted 1 week ago
3.0 - 6.0 years
6 - 10 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
About KPI Partners

KPI Partners is a leading provider of data analytics solutions, dedicated to helping organizations transform data into actionable insights. Our innovative approach combines advanced technology with expert consulting, allowing businesses to leverage their data for improved performance and decision-making.

Job Description

We are seeking a skilled and motivated Data Engineer with experience in Databricks to join our dynamic team. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and data processing solutions that support our analytics initiatives. You will collaborate closely with data scientists, analysts, and other engineers to ensure the consistent flow of high-quality data across our platforms.

Key skills: Python, PySpark, Databricks, ETL, Cloud (AWS, Azure, or GCP)

Key Responsibilities
- Develop, construct, test, and maintain data architectures (e.g., large-scale data processing systems) in Databricks.
- Design and implement ETL (Extract, Transform, Load) processes to move and transform data from various sources to target systems.
- Collaborate with data scientists and analysts to understand data requirements and design appropriate data models and structures.
- Optimize data storage and retrieval for performance and efficiency.
- Monitor and troubleshoot data pipelines to ensure reliability and performance.
- Engage in data quality assessments, validation, and troubleshooting of data issues.
- Stay current with emerging technologies and best practices in data engineering and analytics.

Qualifications
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- Proven experience as a Data Engineer or in a similar role, with hands-on experience in Databricks.
- Strong proficiency in SQL and programming languages such as Python or Scala.
- Experience with cloud platforms (AWS, Azure, or GCP) and related technologies.
- Familiarity with data warehousing concepts and data modeling techniques.
- Knowledge of data integration tools and ETL frameworks.
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.

Why Join KPI Partners?
- Be part of a forward-thinking team that values innovation and collaboration.
- Opportunity to work on exciting projects across diverse industries.
- Continuous learning and professional development opportunities.
- Competitive salary and benefits package.
- Flexible work environment with hybrid work options.

If you are passionate about data engineering and excited about using Databricks to drive impactful insights, we would love to hear from you! KPI Partners is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
Posted 1 week ago
4.0 - 6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At Nielsen, we are passionate about our work to power a better media future for all people by providing powerful insights that drive client decisions and deliver extraordinary results. Our talented, global workforce is dedicated to capturing audience engagement with content, wherever and whenever it's consumed. Together, we are proudly rooted in our deep legacy as we stand at the forefront of the media revolution. When you join Nielsen, you will join a dynamic team committed to excellence, perseverance, and the ambition to make an impact together. We champion you, because when you succeed, we do too. We enable your best to power our future.

About the role:
- The team this role supports is responsible for the critical function of managing lineups and metadata across various media channels, such as cable, broadcast, and video on demand, encompassing a wide scope of data from both local and national providers.
- This role requires flexibility to provide technical support across different time zones, covering both IST and US business hours on a rotational basis. The Support Engineer will serve as the primary point of contact for customer and stakeholder inquiries, responsible for troubleshooting issues, following Standard Operating Procedures (SOPs), and escalating to the development team when necessary.
- This role requires close collaboration with cross-functional teams to ensure timely and effective issue resolution, driving operational stability and enhancing customer satisfaction.
- In this role, you will debug and attempt to resolve issues independently using SOPs. If unable to resolve an issue, you will escalate it to the next level of support, involving the development team as needed. Your goal will be to ensure efficient handling of support requests and to continuously improve SOPs for recurring issues.

Responsibilities:
- Serve as the first point of contact for customer or stakeholder issues, providing prompt support during US/IST business hours on a rotational basis. Execute SOPs to troubleshoot and resolve recurring issues, ensuring adherence to documented procedures.
- Provide technical support and troubleshooting for cloud-based infrastructure and services, including compute, storage, networking, and security components.
- Collaborate with application, security, and other internal teams to resolve complex issues related to cloud-based services and infrastructure.
- Escalate unresolved issues to the development team and provide clear documentation of the troubleshooting steps taken. Document and maintain up-to-date SOPs, troubleshooting guides, and technical support documentation.
- Collaborate with cross-functional teams to ensure issues are tracked, escalated, and resolved efficiently.
- Proactively identify and suggest process improvements to enhance support quality and response times.

Key Skills:
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field
- Experience range: 4 to 6 years

Must-have skills:
- Proficiency in the Java programming language
- Excellent SQL skills for querying and analyzing data from various database systems
- Good understanding of database concepts and technologies
- Good problem-solving skills and the ability to work independently
- Good proficiency in the AWS cloud platform and its core services
- Good written and verbal communication skills with a strong emphasis on technical documentation
- Ability to follow and create detailed SOPs for various support tasks

Good-to-have skills:
- Knowledge of Scala/Python for scripting and automation
- Familiarity with big data technologies such as Spark and Hive

Please be aware that job-seekers may be at risk of targeting by scammers seeking personal data or money. Nielsen recruiters will only contact you through official job boards, LinkedIn, or email with a nielsen.com domain. Be cautious of any outreach claiming to be from Nielsen via other messaging platforms or personal email addresses. Always verify that email communications come from an @nielsen.com address. If you're unsure about the authenticity of a job offer or communication, please contact Nielsen directly through our official website or verified social media channels.

Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, protected veteran status, or other characteristics protected by law.
Posted 1 week ago
6.0 - 10.0 years
12 - 20 Lacs
Hyderabad
Hybrid
- AWS (EMR, S3, Glue, Airflow, RDS, DynamoDB, or similar)
- CI/CD (Jenkins or another tool)
- Relational database experience (any)
- NoSQL database experience (any)
- Microservices, domain services, API gateways, or similar
- Containers (Docker, Kubernetes/K8s, or similar)
Posted 1 week ago
0 years
0 Lacs
India
On-site
Don't see exactly the role you're looking for? No problem! At Sumo Logic, we're always on the lookout for talented professionals to join our team. By submitting your application here, you are expressing interest in potential engineering roles that may become available in the future.

Why Apply Now?
At Sumo Logic, we believe the best teams are built before the hiring starts. If you're curious about what's next in your career, even if you're not actively job hunting, we invite you to join our talent network. Submitting your profile now means you'll be first in line when the right opportunity opens. You'll stay on our radar for roles that match your expertise in cloud-native engineering, distributed systems, data security, or observability, and we'll reach out when the timing aligns. Let's build what's next together.

Join the Minds Behind Modern Engineering
At Sumo Logic, our mission is simple: make the digital world faster, more reliable, and secure. Our AI-powered SaaS Log Analytics Platform helps organizations turn data into real-time, actionable insights, empowering Dev, Sec, and Ops teams to solve complex problems collaboratively. By unifying enterprise data on a single platform with flexible pricing and a seamless interface, we eliminate the economic and technical barriers to ingesting, storing, and analyzing logs. This single source of truth drives smarter decisions, stronger security, and greater reliability across cloud infrastructures. As we build the future, we remain driven by curiosity and innovation, pushing the boundaries of what's possible in security and observability. Sumo Logic continues to power DevSecOps with one of the most powerful tools in modern engineering.

Technologies We Use
- Languages: Scala, Java, TypeScript
- Frontend: React, Redux
- Streaming & Data: Kafka Streams, Elasticsearch
- Infrastructure: AWS, Kubernetes, Docker

Areas of Engineering Focus
We regularly hire across a variety of engineering domains:
- Backend Software Engineering: resilient APIs, streaming pipelines, distributed services
- Frontend Engineering: intuitive and dynamic UIs built with React/Redux
- ML Engineering: MLOps, LLM models, Python
- SRE / Infrastructure Engineering: platform scalability, reliability, automation
- Product Management: driving product vision, roadmap, and execution in collaboration with cross-functional teams
- Product Design: crafting thoughtful, user-centric experiences that balance function, form, and usability at scale
- SDET: ownership of creating and executing test plans and developing test strategies for critical system components

What We Value
- Proficiency in Java, Scala, TypeScript, or similar
- Experience with cloud-native platforms and microservices architecture
- Knowledge of distributed systems, containerization, and DevOps best practices
- A strong sense of ownership and an eagerness to collaborate across teams

About Us
Sumo Logic, Inc., empowers the people who power modern, digital business. Sumo Logic enables customers to deliver reliable and secure cloud-native applications through its SaaS analytics platform. The Sumo Logic Continuous Intelligence Platform™ helps practitioners and developers ensure application reliability, secure and protect against modern security threats, and gain insights into their cloud infrastructures. Customers worldwide rely on Sumo Logic to get powerful real-time analytics and insights across observability and security solutions for their cloud-native applications. For more information, visit www.sumologic.com. Sumo Logic Privacy Policy.

Employees will be responsible for complying with applicable federal privacy laws and regulations, as well as organizational policies related to data protection. The expected annual base salary range is unavailable for this posting, as your application will be considered for several types and levels of positions. Compensation varies based on a variety of factors, including (but not limited to) role level, skills and competencies, qualifications, knowledge, location, and experience. In addition to base pay, certain roles are eligible to participate in our bonus or commission plans, as well as our benefits offerings and equity awards.
Posted 1 week ago
1.0 years
4 - 6 Lacs
Hyderābād
On-site
Basic Qualifications
- 1+ years of data engineering experience
- Experience with SQL
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)

Business Data Technologies (BDT) makes it easier for teams across Amazon to produce, store, catalog, secure, move, and analyze data at massive scale. Our managed solutions combine standard AWS tooling, open-source products, and custom services to free teams from worrying about the complexities of operating at Amazon scale. This lets BDT customers move beyond the engineering and operational burden associated with managing and scaling platforms, and instead focus on scaling the value they can glean from their data, both for their customers and their teams.

We own one of the largest data lakes at Amazon, where thousands of Amazon teams can search, share, and store exabytes (EB) of data in a secure and seamless way; using our solutions, teams around the world can schedule and process millions of workloads on a daily basis. We provide enterprise solutions that focus on compliance, security, integrity, and cost efficiency of operating and managing EBs of Amazon data.

Key job responsibilities
- Be hands-on with ETL to build data pipelines to support automated reporting.
- Interface with other technology teams to extract, transform, and load data from a wide variety of data sources.
- Implement data structures using best practices in data modeling, ETL/ELT processes, SQL, and Redshift.
- Model data and metadata for ad-hoc and pre-built reporting.
- Interface with business customers, gathering requirements and delivering complete reporting solutions.
- Build robust and scalable data integration (ETL) pipelines using SQL, Python, and Spark (see the sketch after this listing).
- Build and deliver high-quality data sets to support business analysts, data scientists, and customer reporting needs.
- Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Participate in strategic and tactical planning discussions.

A day in the life
As a Data Engineer, you will be working with cross-functional partners from Science, Product, SDEs, Operations, and leadership to translate raw data into actionable insights for stakeholders, empowering them to make data-driven decisions. Some of the key activities include:
- Crafting the data flow: design and build data pipelines, the backbone of our data ecosystem. Ensure the integrity of the data journey by implementing robust data quality checks and monitoring processes.
- Architecting for insights: translate complex business requirements into efficient data models that optimize data analysis and reporting. Automate data processing tasks to streamline workflows and improve efficiency.
- Becoming a data detective, ensuring data availability and performance.

Preferred qualifications
- Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
- Experience with any ETL tool, such as Informatica, ODI, SSIS, BODI, or Datastage
- Knowledge of cloud services such as AWS or equivalent

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information.

If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
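
The listing above asks for robust, scalable ETL pipelines built with SQL, Python, and Spark. As a rough, minimal sketch of what such a pipeline can look like in Spark's Scala API (the bucket paths, column names, and dataset below are invented for illustration and are not part of the posting):

// Minimal Spark (Scala) ETL sketch: extract CSV, transform, load as Parquet.
// All paths and column names are hypothetical.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object OrdersEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("orders-etl")
      .getOrCreate()

    // Extract: read raw data (schema inferred here for brevity).
    val raw = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("s3://example-bucket/raw/orders/")

    // Transform: basic quality filter plus a daily revenue aggregate.
    val daily = raw
      .filter(col("order_total") > 0)
      .groupBy(to_date(col("order_ts")).as("order_date"))
      .agg(sum("order_total").as("revenue"), count("*").as("orders"))

    // Load: write partitioned Parquet for downstream reporting.
    daily.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://example-bucket/curated/daily_orders/")

    spark.stop()
  }
}

In practice, a pipeline like this would be scheduled (for example, by an orchestrator such as Airflow) and hardened with explicit schemas and data quality checks, in line with the responsibilities described above.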
Posted 1 week ago
4.0 years
2 - 6 Lacs
Hyderābād
On-site
About this role:
Wells Fargo is seeking a Senior Software Engineer.

In this role, you will:
- Lead or participate in complex initiatives on selected domains
- Assure quality, security, and compliance for supported systems and applications
- Serve as a technical resource in finding software solutions
- Review and evaluate user needs and determine requirements
- Provide technical support, advice, and consultation on issues relating to supported applications
- Create test data and conduct interface and unit tests
- Design, code, test, debug, and document programs using Agile development practices
- Understand and participate in meeting compliance and risk management requirements for the supported area, and work with other stakeholders to implement key risk initiatives
- Conduct research, resolve problems relating to processes, and recommend solutions and process improvements
- Assist other individuals in advanced software development
- Collaborate and consult with peers, colleagues, and managers to resolve issues and achieve goals

Required Qualifications:
- 4+ years of Specialty Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Desired Qualifications:
- Python, Scala, SQL, PL/SQL, C#, shell scripting
- Proficiency in technologies such as Kafka, real-time data processing (Spark/Splunk/Cassandra), BDD, and business intelligence tools
- Experience with cloud platforms (GCP/Azure), OpenShift, etc.

Job Expectations:
- Requirements gathering, analysis, design, development, and implementation of end-to-end solutions using Agile methodologies

Posting End Date: 14 Jun 2025
*Job posting may come down early due to the volume of applicants.

We Value Equal Opportunity
Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic.

Employees support our focus on building strong customer relationships balanced with a strong risk-mitigating and compliance-driven culture, which firmly establishes those disciplines as critical to the success of our customers and company. They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements.

Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples, and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process.

Applicants with Disabilities
To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo.

Drug and Alcohol Policy
Wells Fargo maintains a drug free workplace. Please see our Drug and Alcohol Policy to learn more.

Wells Fargo Recruitment and Hiring Requirements:
a. Third-Party recordings are prohibited unless authorized by Wells Fargo.
b. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.
Posted 1 week ago
Scala is a popular programming language that is widely used in India, especially in the tech industry. Job seekers looking for opportunities in Scala can find a variety of roles across different cities in the country. In this article, we will dive into the Scala job market in India and provide valuable insights for job seekers.
Cities such as Bengaluru, Hyderabad, Chennai, Pune, and Mumbai, which account for most of the listings above, are known for their thriving tech ecosystems and have a high demand for Scala professionals.
The salary range for Scala professionals in India varies based on experience levels. Entry-level Scala developers can expect to earn around INR 6-8 lakhs per annum, while experienced professionals with 5+ years of experience can earn upwards of INR 15 lakhs per annum.
In the Scala job market, a typical career path may look like:
- Junior Developer
- Scala Developer
- Senior Developer
- Tech Lead
As professionals gain more experience and expertise in Scala, they can progress to higher roles with increased responsibilities.
In addition to Scala expertise, employers often look for candidates with the following skills:
- Java
- Spark
- Akka
- Play Framework
- Functional programming concepts
Having a good understanding of these related skills can enhance a candidate's profile and increase their chances of landing a Scala job.
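
To make the "functional programming concepts" point concrete, here is a minimal, self-contained Scala sketch (the Job case class and sample data are hypothetical, chosen only for illustration) showing the immutability, higher-order functions, and pattern matching that interviewers typically probe:

// Minimal sketch of common functional-programming idioms in Scala.
// The Job case class and sample data are hypothetical.
case class Job(title: String, city: String, minYears: Int)

object FunctionalBasics {
  def main(args: Array[String]): Unit = {
    val jobs = List(
      Job("Scala Developer", "Bengaluru", 3),
      Job("Data Engineer", "Hyderabad", 1),
      Job("Senior Developer", "Pune", 5)
    )

    // Higher-order functions: filter and map instead of mutable loops.
    val seniorTitles = jobs.filter(_.minYears >= 3).map(_.title)

    // Pattern matching to destructure values.
    val summaries = jobs.map {
      case Job(title, city, years) => s"$title ($city, $years+ yrs)"
    }

    // Folding to aggregate without mutation.
    val totalYears = jobs.foldLeft(0)((acc, j) => acc + j.minYears)

    println(seniorTitles.mkString(", "))
    summaries.foreach(println)
    println(s"Combined minimum experience: $totalYears years")
  }
}

Candidates comfortable reading and writing code in this immutable, expression-oriented style will find most Scala interview exercises approachable.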
Here are 25 interview questions that you may encounter when applying for Scala roles:
As you explore Scala jobs in India, remember to showcase your expertise in Scala and related skills during interviews. Prepare well, stay confident, and you'll be on your way to a successful career in Scala. Good luck!