
6074 Scala Jobs - Page 9

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

15.0 years

0 Lacs

India

On-site

Job Summary
As part of the data leadership team, the Capability Lead – Databricks will be responsible for building, scaling, and delivering Databricks-based data and AI capabilities across the organization. This leadership role spans technical vision, solution architecture, team building, partnership development, and delivery excellence on the Databricks Unified Analytics Platform across industries. The individual will collaborate with clients, alliance partners (Databricks, Azure, AWS), internal stakeholders, and sales teams to drive adoption of lakehouse architectures, data engineering best practices, and AI/ML modernization.

Areas of Responsibility
1. Offering and Capability Development: Develop and enhance Snowflake-based data platform offerings and accelerators. Define best practices, architectural standards, and reusable frameworks for Snowflake. Collaborate with alliance teams to strengthen the partnership with Snowflake.
2. Technical Leadership: Provide architectural guidance for Snowflake solution design and implementation. Lead solutioning efforts for proposals, RFIs, and RFPs involving Snowflake. Conduct technical reviews and ensure adherence to design standards. Act as a technical escalation point for complex project challenges.
3. Delivery Oversight: Support delivery teams with technical expertise across Snowflake projects. Drive quality assurance, performance optimization, and project risk mitigation. Review project artifacts and ensure alignment with Snowflake best practices. Foster a culture of continuous improvement and delivery excellence.
4. Talent Development: Build and grow a high-performing Snowflake capability team. Define skill development pathways and certification goals for team members. Mentor architects, developers, and consultants on Snowflake technologies. Drive community-of-practice initiatives to share knowledge and innovations.
5. Business Development Support: Engage with sales and pre-sales teams to position Snowflake capabilities. Contribute to account growth by identifying new Snowflake opportunities. Participate in client presentations, workshops, and technical discussions.
6. Thought Leadership and Innovation: Build thought leadership through whitepapers, blogs, and webinars. Stay updated on Snowflake product enhancements and industry trends.

This role is highly collaborative and works extremely closely with cross-functional teams to fulfill the above responsibilities.

Job Requirements:
12–15 years of experience in data engineering, analytics, and AI/ML.
3–5 years of strong hands-on experience with Databricks (on Azure, AWS, or GCP).
Expertise in Spark (PySpark/Scala), Delta Lake, Unity Catalog, MLflow, and Databricks notebooks.
Experience designing and implementing lakehouse architectures at scale.
Familiarity with data governance, security, and compliance frameworks (GDPR, HIPAA, etc.).
Experience with real-time and batch data pipelines (Structured Streaming, Auto Loader, Kafka, etc.); see the sketch after this listing.
Strong understanding of MLOps and AI/ML lifecycle management.
Certifications in Databricks (e.g., Databricks Certified Data Engineer Professional, ML Engineer Associate) are preferred.
Experience with hyperscaler ecosystems (Azure Data Lake, AWS S3, GCP GCS, ADF, Glue, etc.).
Experience managing large, distributed teams and working with CXO-level stakeholders.
Strong problem-solving, analytical, and decision-making skills.
Excellent verbal, written, and client-facing communication skills.
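The requirements mention Structured Streaming, Auto Loader, and Delta Lake. As a hypothetical illustration only (not part of the posting), here is a minimal Scala sketch of Auto Loader ingestion into a Delta table; it assumes a Databricks runtime (the "cloudFiles" source is Databricks-specific), and all paths and the table name are placeholders.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.streaming.Trigger

object BronzeIngest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("BronzeIngest").getOrCreate()

    // Incrementally pick up new files from cloud storage with Auto Loader
    // (the Databricks "cloudFiles" streaming source).
    val raw = spark.readStream
      .format("cloudFiles")
      .option("cloudFiles.format", "json")
      .option("cloudFiles.schemaLocation", "/mnt/schemas/bookings") // hypothetical path
      .load("/mnt/landing/bookings")                                // hypothetical path

    // Land the feed in a Delta table; AvailableNow processes the backlog and stops.
    raw.writeStream
      .option("checkpointLocation", "/mnt/checkpoints/bookings")    // hypothetical path
      .trigger(Trigger.AvailableNow())
      .toTable("bronze.bookings")                                   // hypothetical table
      .awaitTermination()
  }
}
```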

Posted 3 days ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Notice Period: 30 days

Interview and Relocation Details
Interview Process: Final interview rounds will be conducted in person in Chennai only.
Work Location: This position is based in Bangalore.
Relocation: While the final interviews are in Chennai, candidates, including those currently residing in Chennai, must be willing to relocate to Bangalore within 3-6 months of their start date, or as required by business needs.

Important Note for Applicants
Please apply only if you are currently based in or around Chennai and are able to attend an in-person interview in Chennai. Additionally, all selected candidates must be willing to relocate to Bangalore within 3-6 months of their start date, or as required.

What you will do
● Create beautiful software experiences for our clients using design thinking, lean, and agile methodology.
● Work on software products designed from scratch using the latest cutting-edge technologies, platforms, and languages such as Java, Python, JavaScript, GoLang, and Scala.
● Work in a dynamic, collaborative, transparent, non-hierarchical culture.
● Work in collaborative, fast-paced, and value-driven teams to build innovative customer experiences for our clients.
● Help to grow the next generation of developers and have a positive impact on the industry.

Basic Qualifications
● Experience: 4+ years.
● Hands-on development experience with a broad mix of languages such as Java, Python, JavaScript, etc.
● Server-side development experience, mainly in Java (Python and Node.js can be considered).
● UI development experience in ReactJS, AngularJS, PolymerJS, EmberJS, jQuery, etc., is good to have.
● Passion for software engineering and adherence to best coding practices.
● Strong problem-solving and communication skills.

Nice to have Qualifications
● Product and customer-centric mindset.
● Great OO skills, including design patterns.
● Experience with DevOps, continuous integration & deployment.
● Exposure to big data technologies, machine learning, and NLP will be a plus.

Benefits
● Competitive salary.
● Work from anywhere.
● Learning and gaining experience rapidly.
● Reimbursement for basic working setup at home.

Location
Bengaluru - Hybrid / WFO

Posted 3 days ago

Apply

0 years

0 Lacs

India

On-site

Job Summary: We are looking for a skilled Senior Data Engineer with strong expertise in Spark and Scala on the AWS platform. The ideal candidate should possess excellent problem-solving skills and hands-on experience in Spark-based data processing within a cloud-based ecosystem. This role offers the opportunity to independently execute diverse and complex engineering tasks, demonstrate a solid understanding of the end-to-end software development lifecycle, and collaborate effectively with stakeholders to deliver high-quality technical solutions.

Key Responsibilities:
Develop, analyze, debug, and enhance Spark-Scala programs (see the sketch after this listing).
Work on Spark batch processing jobs, with the ability to analyze and debug using the Spark UI and logs.
Optimize performance of Spark applications and ensure scalability and reliability.
Manage data processing tasks using AWS S3, AWS EMR clusters, and other AWS services.
Leverage Hadoop ecosystem tools including HDFS, HBase, Hive, and MapReduce.
Write efficient and optimized SQL queries; experience with PostgreSQL and Couchbase or similar databases is preferred.
Utilize orchestration tools such as Kafka, NiFi, and Oozie.
Work with monitoring tools like Dynatrace and CloudWatch.
Contribute to the creation of High-Level Design (HLD) and Low-Level Design (LLD) documents and participate in reviews with architects.
Support setup of development and lower environments, including local IDE configuration.
Follow defined coding standards, best practices, and quality processes.
Collaborate using Agile methodologies for development, review, and delivery.
Use supplementary programming languages like Python as needed.

Required Skills:
Mandatory:
Apache Spark
Scala
Big Data
Hadoop Ecosystem
Spark SQL

Additional Preferred Skills:
Spring Core Framework
Core Java, Hibernate, Multithreading
AWS EMR, S3, CloudWatch
HDFS, HBase, Hive, MapReduce
PostgreSQL, Couchbase
Kafka, NiFi, Oozie
Dynatrace or other monitoring tools
Python (as a supplementary language)
Agile Methodology
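As an illustration of the kind of Spark-Scala batch work this role describes (not taken from the posting), here is a minimal sketch of a batch job that reads a day's partition from S3 on EMR, runs a Spark SQL-style aggregation, and writes the result back; the bucket, paths, and column names are hypothetical.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object DailyOrderRollup {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("DailyOrderRollup").getOrCreate()

    // Read one date partition from S3 (EMR resolves s3:// paths natively).
    val orders = spark.read.parquet("s3://example-bucket/orders/dt=2024-06-01") // hypothetical bucket

    // A typical aggregation; skew and partition behavior would be checked in
    // the Spark UI, as the posting describes.
    val rollup = orders
      .groupBy(col("customer_id"))
      .agg(count(lit(1)).as("order_count"), sum(col("amount")).as("total_amount"))

    // Write the rollup back to S3 for downstream consumers.
    rollup.write.mode("overwrite")
      .parquet("s3://example-bucket/rollups/orders_daily/dt=2024-06-01")

    spark.stop()
  }
}
```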

Posted 3 days ago

Apply

5.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Expedia Group brands power global travel for everyone, everywhere. We design cutting-edge tech to make travel smoother and more memorable, and we create groundbreaking solutions for our partners. Our diverse, vibrant, and welcoming community is essential in driving our success.

Why Join Us?
To shape the future of travel, people must come first. Guided by our Values and Leadership Agreements, we foster an open culture where everyone belongs, differences are celebrated and know that when one of us wins, we all win. We provide a full benefits package, including exciting travel perks, generous time-off, parental leave, a flexible work model (with some pretty cool offices), and career development resources, all to fuel our employees' passion for travel and ensure a rewarding career journey. We’re building a more open world. Join us.

Introduction to team
Our Expedia Product & Technology division builds innovative products, services, and tools to deliver high-quality experiences for travellers, partners, and our employees. A unified, singular technology platform powered by data and machine learning provides secure, differentiated, and personalised experiences for the traveler and our partners that drive loyalty and customer satisfaction. The goal of the Media Solutions (MeSo) Tech team is to spearhead advertising products and services across many Expedia brands including BEX, Hotels.com, Portfolio Brands (COMET & Hotwire.com), Expedia Partner Sites (EPS), and Vrbo. We help our advertisers identify travellers on EG sites, target specific traveller criteria, and then deliver the most relevant products. As a Senior Software Development Engineer, you will propose, design and implement various initiatives. As a member of the team, you will work in alliance with global teams providing the technical expertise needed to overcome hard problems. We value rigor and innovative thinking in our development process and believe in the power of a motivated & agile development team.

What You’ll Do
Join a high-performing team and have a unique opportunity to make a highly visible impact.
Learn best practices and how to constantly raise the bar in terms of engineering excellence.
Identify inefficiencies in code or systems operation and offer suggestions for improvements.
Expand your skills in developing high-quality, distributed, and scalable software.
Share new skills and knowledge with the team to increase efficiency.
Write code that is clean, maintainable, and optimized, with good naming conventions.
Develop fast, scalable, and highly available services.
Participate in code reviews and pull requests.

Who You Are
5+ years of software development work experience using modern languages (e.g., Java/Kotlin).
Bachelor’s in computer science or related technical field, or equivalent related professional experience.
Experience in Java, Scala, Kotlin, AWS, Kafka, S3, Lambda, Docker, Datadog.
Problem solver with a good understanding of algorithms, data structures, and distributed applications.
Solid understanding of object-oriented programming concepts, data structures, algorithms, and test-driven development.
Solid understanding of load balancing, caching, and database partitioning to improve application scalability.
Demonstrated ability to develop and support large internet-scale software systems.
Experience with AWS services.
Knowledge of NoSQL databases and cloud computing concepts.
Sound understanding of client-side optimization best practices.
Ability to quickly pick up new technologies and languages with ease.
Working knowledge of Agile software development methodologies.
Strong verbal and written communication skills.
Passionate about quality of work.

Accommodation requests
If you need assistance with any part of the application or recruiting process due to a disability, or other physical or mental health conditions, please reach out to our Recruiting Accommodations Team through the Accommodation Request. We are proud to be named as a Best Place to Work on Glassdoor in 2024 and be recognized for award-winning culture by organizations like Forbes, TIME, Disability:IN, and others.

Expedia Group's family of brands includes: Brand Expedia®, Hotels.com®, Expedia® Partner Solutions, Vrbo®, trivago®, Orbitz®, Travelocity®, Hotwire®, Wotif®, ebookers®, CheapTickets®, Expedia Group™ Media Solutions, Expedia Local Expert®, CarRentals.com™, and Expedia Cruises™. © 2024 Expedia, Inc. All rights reserved. Trademarks and logos are the property of their respective owners. CST: 2029030-50

Employment opportunities and job offers at Expedia Group will always come from Expedia Group’s Talent Acquisition and hiring teams. Never provide sensitive, personal information to someone unless you’re confident who the recipient is. Expedia Group does not extend job offers via email or any other messaging tools to individuals with whom we have not made prior contact. Our email domain is @expediagroup.com. The official website to find and apply for job openings at Expedia Group is careers.expediagroup.com/jobs. Expedia is committed to creating an inclusive work environment with a diverse workforce. All qualified applicants will receive consideration for employment without regard to race, religion, gender, sexual orientation, national origin, disability or age.

Posted 3 days ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Position Summary: This role is accountable for running day-to-day operations of the Data Platform in Azure / AWS Databricks. The role involves designing and implementing data ingestion pipelines from multiple sources using Azure Databricks, ensuring seamless and efficient pipeline executions, and adhering to security, regulatory, and audit control guidelines.

Key Responsibilities:
● Design and implement data ingestion pipelines from multiple sources using Azure Databricks.
● Ensure data pipelines run smoothly and efficiently with minimal downtime.
● Develop scalable and reusable frameworks for ingesting large and complex datasets (see the ingestion sketch after this listing).
● Integrate end-to-end data pipelines, ensuring quality and consistency from source systems to target repositories.
● Work with event-based and streaming technologies to ingest and process data in real time.
● Collaborate with other project team members to deliver additional components such as API interfaces and search functionalities.
● Evaluate the performance and applicability of various tools against customer requirements and provide recommendations.
● Provide technical advice to the team and assist in issue resolution, leveraging strong Cloud and Databricks knowledge.
● Provide on-call, after-hours, and weekend support as needed to maintain platform stability.
● Fulfil service requests related to the Data Analytics platform efficiently.
● Lead and drive optimisation and continuous improvement initiatives within the team.
● Conduct technical reviews of changes as part of release management, acting as a gatekeeper for production deployments.
● Adhere to data security standards and implement required controls within the platform.
● Lead the design, development, and deployment of advanced data pipelines and analytical workflows on the Databricks Lakehouse platform.
● Collaborate with data scientists, engineers, and business stakeholders to build and scale end-to-end data solutions.
● Own architectural decisions to ensure alignment with data governance, security, and compliance requirements.
● Mentor and guide a team of data engineers, providing technical leadership and supporting career development.
● Implement CI/CD practices for data engineering pipelines using tools like Azure DevOps, GitHub Actions, or Jenkins.

Qualifications and Experience:
● Bachelor’s degree in IT, Computer Science, Software Engineering, Business Analytics, or equivalent.
● Minimum of 7+ years of experience in the data analytics field.
● Proven experience with Azure/AWS Databricks in building and optimising data pipelines, architectures, and datasets.
● Strong expertise in Scala or Python, PySpark, and SQL for data engineering tasks.
● Ability to troubleshoot and optimize complex queries on the Spark platform.
● Knowledge of structured and unstructured data design, modelling, access, and storage techniques.
● Experience designing and deploying data applications on cloud platforms such as Azure or AWS.
● Hands-on experience in performance tuning and optimising code running in Databricks environments.
● Strong analytical and problem-solving skills, particularly within Big Data environments.
● Experience with Big Data management tools and technologies including Cloudera, Python, Hive, Scala, Data Warehouse, Data Lake, AWS, Azure.

Technical and Professional Skills:
Must Have:
● Excellent communication skills with the ability to interact directly with customers.
● Azure/AWS Databricks.
● Python / Scala / Spark / PySpark.
● Strong SQL and RDBMS expertise.
● HIVE / HBase / Impala / Parquet.
● Sqoop, Kafka, Flume.
● Airflow.
● Jenkins or Bamboo.
● Github or Bitbucket.
● Nexus.
Good to Have:
● Relevant accredited certifications for Azure, AWS, Cloud Engineering, and/or Databricks.
● Knowledge of Delta Live Tables (DLT).
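One of the responsibilities above is building scalable, reusable ingestion frameworks on Databricks. As a hypothetical sketch only (not from the posting), the helper below parameterizes source path, format, and target Delta table so that new feeds reuse one tested code path; all names and paths are placeholders.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}

object IngestFramework {
  // Reusable batch-ingestion entry point: one tested path for many feeds.
  def ingest(spark: SparkSession,
             sourcePath: String,
             sourceFormat: String,
             targetTable: String): Unit = {
    val df: DataFrame = spark.read.format(sourceFormat).load(sourcePath)
    // Append into a Delta table; Delta is the default table format on Databricks.
    df.write.format("delta").mode("append").saveAsTable(targetTable)
  }

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("IngestFramework").getOrCreate()
    // e.g. ingest ADLS-landed CSVs into a Delta table (hypothetical path and table)
    ingest(spark, "abfss://landing@account.dfs.core.windows.net/sales", "csv", "raw.sales")
  }
}
```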

Posted 3 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

We are seeking a highly skilled and motivated Python, AWS, and Big Data Engineer to join our data engineering team. The ideal candidate will have hands-on experience with the Hadoop ecosystem and Apache Spark, and programming expertise in Python (PySpark), Scala, and Java. You will be responsible for designing, developing, and optimizing scalable data pipelines and big data solutions to support analytics and business intelligence initiatives.

Posted 4 days ago

Apply

10.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

EY GDS – Data and Analytics (D&A) – Cloud Architect - Manager

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing, Healthcare, Retail, Auto, Supply Chain, and Finance.

The opportunity
We’re looking for Senior Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven delivery and pre-sales capability. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
Have proven experience in driving Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM, and Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, and conducting workshops, as well as managing in-flight projects focused on cloud and big data.
Work with clients to convert business problems/challenges into technical solutions considering security, performance, scalability, etc. [10–15 years]
Understand current and future state enterprise architecture.
Contribute to various technical streams during project implementation.
Provide product- and design-level technical best practices.
Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop, and deliver technology solutions.
Define and develop client-specific best practices around data management within a Hadoop or cloud environment.
Recommend design alternatives for data ingestion, processing, and provisioning layers.
Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark.
Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies (see the sketch after this listing).

Skills And Attributes For Success
Architect experience designing highly scalable solutions on Azure, AWS, and GCP.
Strong understanding of and familiarity with Azure/AWS/GCP and Big Data ecosystem components.
Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms.
Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming.
Hands-on experience with major components like cloud ETLs, Spark, and Databricks.
Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB.
Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions.
Solid understanding of ETL methodologies in a multi-tiered stack, integrating with Big Data systems like Cloudera and Databricks.
Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms.
Good knowledge of Apache Kafka and Apache Flume.
Experience in enterprise-grade solution implementations.
Experience in performance benchmarking enterprise applications.
Experience in data security [in motion, at rest].
Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have
Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
Excellent communicator (written and verbal, formal and informal).
Ability to multi-task under pressure and work independently with minimal supervision.
Must be a team player who enjoys working in a cooperative and collaborative team environment.
Adaptable to new technologies and standards.
Participation in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.
Responsibility for the evaluation of technical risks and mapping out mitigation strategies.
Working knowledge of at least one cloud platform: AWS, Azure, or GCP.
Excellent business communication, consulting, and quality process skills.
Excellence in leading solution architecture, design, build, and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domains.
Minimum 7 years of hands-on experience in one or more of the above areas.
Minimum 10 years of industry experience.

Ideally, you’ll also have
Strong project management skills.
Client management skills.
Solutioning skills.

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start–ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
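For the real-time ingestion responsibility above, here is a minimal hypothetical sketch (not from the posting) of a Spark Structured Streaming job in Scala that reads from a Kafka topic and lands the raw feed; the broker, topic, and paths are placeholders, and the job assumes the spark-sql-kafka-0-10 package is on the classpath.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object LiveEventsStream {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("LiveEventsStream").getOrCreate()

    // Subscribe to a Kafka topic and decode the binary key/value columns.
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1:9092") // hypothetical broker
      .option("subscribe", "live-events")                // hypothetical topic
      .load()
      .select(col("key").cast("string"), col("value").cast("string"), col("timestamp"))

    // Land the raw feed for downstream batch processing; the checkpoint
    // directory gives the stream exactly-once file output.
    events.writeStream
      .format("parquet")
      .option("path", "/data/raw/live_events")           // hypothetical path
      .option("checkpointLocation", "/chk/live_events")  // hypothetical path
      .start()
      .awaitTermination()
  }
}
```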

Posted 4 days ago

Apply

0 years

0 - 1 Lacs

Thiruvananthapuram

On-site

Data Science and AI Developer

Job Description:
We are seeking a highly skilled and motivated Data Science and AI Developer to join our dynamic team. As a Data Science and AI Developer, you will be responsible for leveraging cutting-edge technologies to develop innovative solutions that drive business insights and enhance decision-making processes.

Key Responsibilities:
1. Develop and deploy machine learning models for predictive analytics, classification, clustering, and anomaly detection.
2. Design and implement algorithms for data mining, pattern recognition, and natural language processing.
3. Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
4. Utilize advanced statistical techniques to analyze complex datasets and extract actionable insights.
5. Implement scalable data pipelines for data ingestion, preprocessing, feature engineering, and model training.
6. Stay updated with the latest advancements in data science, machine learning, and artificial intelligence research.
7. Optimize model performance and scalability through experimentation and iteration.
8. Communicate findings and results to stakeholders through reports, presentations, and visualizations.
9. Ensure compliance with data privacy regulations and best practices in data handling and security.
10. Mentor junior team members and provide technical guidance and support.

Requirements:
1. Bachelor’s or Master’s degree in Computer Science, Data Science, Statistics, or a related field.
2. Proven experience in developing and deploying machine learning models in production environments.
3. Proficiency in programming languages such as Python, R, or Scala, with strong software engineering skills.
4. Hands-on experience with machine learning libraries/frameworks such as TensorFlow, PyTorch, Scikit-learn, or Spark MLlib.
5. Solid understanding of data structures, algorithms, and computer science fundamentals.
6. Excellent problem-solving skills and the ability to think creatively to overcome challenges.
7. Strong communication and interpersonal skills, with the ability to work effectively in a collaborative team environment.
8. Certification in Data Science, Machine Learning, or Artificial Intelligence (e.g., Coursera, edX, Udacity, etc.).
9. Experience with cloud platforms such as AWS, Azure, or Google Cloud is a plus.
10. Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka) is an advantage.

Data Manipulation and Analysis: NumPy, Pandas
Data Visualization: Matplotlib, Seaborn, Power BI
Machine Learning Libraries: Scikit-learn, TensorFlow, Keras
Statistical Analysis: SciPy
Web Scraping: Scrapy
IDE: PyCharm, Google Colab

HTML/CSS/JavaScript/React JS: Proficiency in these core web development technologies is a must.
Python Django Expertise: In-depth knowledge of e-commerce functionalities or deep Python Django knowledge.
Theming: Proven experience in designing and implementing custom themes for Python websites.
Responsive Design: Strong understanding of responsive design principles and the ability to create visually appealing and user-friendly interfaces for various devices.
Problem Solving: Excellent problem-solving skills with the ability to troubleshoot and resolve issues independently.
Collaboration: Ability to work closely with cross-functional teams, including marketing and design, to bring creative visions to life.
Interns must know how to connect the front end with data science components, and how to surface data science outputs in the front end.

Benefits:
- Competitive salary package
- Flexible working hours
- Opportunities for career growth and professional development
- Dynamic and innovative work environment

Job Type: Full-time
Pay: ₹8,000.00 - ₹12,000.00 per month
Ability to commute/relocate: Thiruvananthapuram, Kerala: Reliably commute or planning to relocate before starting work (Preferred)
Work Location: In person

Posted 4 days ago

Apply

12.0 years

6 - 8 Lacs

Hyderābād

On-site

Welcome to Warner Bros. Discovery… the stuff dreams are made of.

Who We Are…
When we say, “the stuff dreams are made of,” we’re not just referring to the world of wizards, dragons and superheroes, or even to the wonders of Planet Earth. Behind WBD’s vast portfolio of iconic content and beloved brands, are the storytellers bringing our characters to life, the creators bringing them to your living rooms and the dreamers creating what’s next… From brilliant creatives, to technology trailblazers, across the globe, WBD offers career defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best selves. Here you are supported, here you are celebrated, here you can thrive.

Your New Role:
As the Director ML Engineering & Automation, you will lead a dynamic team at the forefront of transforming the Media & Entertainment industry. In this high-impact role, you will be responsible for driving the strategic direction and execution of ML and AI solutions to enhance content personalization, automate workflows, and provide innovative, data-driven experiences. You will work closely with cross-functional teams, including Data Scientists, Software Engineers, and Product Managers, to deliver cutting-edge solutions that fuel business growth and deliver the next generation of entertainment experiences. This is an exciting opportunity to shape the future of entertainment through AI-driven innovation. In this leadership role, you will have the opportunity to build and scale a world-class ML engineering team, while leveraging your expertise to solve some of the most complex and exciting challenges in the industry. This role combines technical excellence, strategic vision, and team leadership to make a lasting impact on the company's mission.

Your Role Accountabilities:
1. Leadership & Team Building
Lead, mentor, and grow a high-performing ML engineering team, fostering a culture of innovation and continuous improvement. Define and execute the roadmap for building scalable, high-impact ML solutions that support the company's core business and strategic objectives. Collaborate closely with leadership across departments, including data science, product management, and IT, to ensure alignment of ML initiatives with business needs. Establish clear performance metrics and regularly assess the effectiveness of the team and individual contributors. Promote knowledge-sharing, best practices, and a growth mindset within the team to enhance technical depth and execution efficiency.
2. End-to-End ML Solution Development
Oversee the design, development, deployment, and maintenance of machine learning models and systems that power content recommendation engines, personalization, automation, and other key business areas. Drive the adoption of best practices in model development, testing, monitoring, and optimization across the team. Build scalable, production-ready ML pipelines that process vast amounts of data and generate real-time insights. Ensure that solutions are optimized for performance, cost-efficiency, and maintainability in a cloud-native, microservices environment. Lead efforts to continuously improve model performance, incorporating user feedback and business metrics.
3. Innovation & Strategy
Identify and evaluate emerging technologies, algorithms, and methodologies in the AI/ML space, integrating them into the company's tech stack to maintain a competitive edge. Work closely with senior leadership to define AI/ML strategies and influence decision-making on product and business development. Evaluate and implement advanced techniques in natural language processing (NLP), computer vision, deep learning, reinforcement learning, and other domains as relevant to the media industry. Provide thought leadership on the application of AI/ML in media and entertainment, positioning the company as a leader in AI-driven innovation.
4. Collaboration & Cross-Functional Engagement
Partner with product management, engineering, and business teams to translate complex business problems into technical ML solutions. Collaborate on the integration of ML models into products and workflows, ensuring smooth end-to-end delivery from prototype to production. Act as a trusted advisor to executives and stakeholders on ML capabilities, project status, risks, and business impact. Drive the development and implementation of data governance, privacy, and security practices to ensure compliance with regulatory requirements. Facilitate the sharing of ML insights with broader company teams, providing transparency and fostering a data-driven culture.
5. Performance Monitoring & Reporting
Define and track key performance indicators (KPIs) to measure the success of ML initiatives and models. Oversee the collection and analysis of model performance data, providing regular updates to leadership and stakeholders. Ensure that deployed models are continuously monitored, maintained, and updated to meet evolving business needs. Lead post-mortem analyses of model failures and actively drive improvements based on lessons learned. Utilize data to iterate and refine models to increase their accuracy and efficiency.

Qualifications & Experiences:
Master’s or Ph.D. degree in Computer Science, Engineering, Data Science, Machine Learning, or a related field from a reputed institution. 12+ years of experience in the field of machine learning and AI, with at least 5 years in a leadership or managerial role. Proven track record of successfully leading and scaling ML engineering teams and delivering large-scale ML projects in a fast-paced environment. Experience working in the Media & Entertainment industry or related sectors, with knowledge of data-driven content recommendations, personalization, and automation. Expertise in designing, building, and deploying production-grade machine learning systems at scale. Experience in leading cross-functional teams to deliver end-to-end machine learning solutions, from conceptualization to deployment and optimization. Demonstrated ability to influence senior stakeholders and executives, translating technical concepts into business impact. Strong expertise in machine learning algorithms, deep learning, reinforcement learning, and statistical modeling techniques. In-depth knowledge of data structures, software engineering principles, and system design. Experience with distributed computing and cloud technologies (AWS, GCP, Azure) and containerization (Docker, Kubernetes). Proficiency in programming languages such as Python, Java, or Scala, and familiarity with ML frameworks like TensorFlow, PyTorch, or Keras.

How We Get Things Done…
This last bit is probably the most important! Here at WBD, our guiding principles are the core values by which we operate and are central to how we get things done. You can find them at www.wbd.com/guiding-principles/ along with some insights from the team on what they mean and how they show up in their day to day. We hope they resonate with you and look forward to discussing them during your interview.

Championing Inclusion at WBD
Warner Bros. Discovery embraces the opportunity to build a workforce that reflects a wide array of perspectives, backgrounds and experiences. Being an equal opportunity employer means that we take seriously our responsibility to consider qualified candidates on the basis of merit, regardless of sex, gender identity, ethnicity, age, sexual orientation, religion or belief, marital status, pregnancy, parenthood, disability or any other category protected by law. If you’re a qualified candidate with a disability and you require adjustments or accommodations during the job application and/or recruitment process, please visit our accessibility page for instructions to submit your request.

Posted 4 days ago

Apply

2.0 years

8 - 9 Lacs

Hyderābād

On-site

JOB DESCRIPTION
We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer II at JPMorgan Chase within Consumer and Community Banking – Data Technology, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives.

Job responsibilities
Oversee all aspects of data strategy, governance, data risk management, reporting, and analytics.
Work with product owners, data owners, and customers to evaluate data requirements and identify the right technology solutions and implementation.
Design, develop, code, test, debug, and deploy scalable and extensible applications.
Produce high-quality code utilizing Test Driven Development techniques.
Participate in retrospectives to drive continuous improvement within the feature team.
Participate in code reviews and ensure that all solutions are aligned to pre-defined architectural specifications.
Implement automation: continuous integration and continuous delivery.
Manage cloud development and deployment: support development and deployment of applications into AWS public clouds.

Required qualifications, capabilities, and skills
Formal training or certification on software engineering concepts and 2+ years of applied experience.
Advanced knowledge of architecture, design, and business processes.
Full Software Development Life Cycle experience within an Agile framework.
Experience with Java, AWS, database technologies, Python, Scala, Spark, and Snowflake (see the sketch after this listing).
Experience with the development and decomposition of complex SQL (RDBMS platforms), and experience with leading cloud providers (AWS/Azure/GCP).
Experience with data warehousing concepts (including star schema).
Practical experience in delivering projects in Data and Analytics, Big Data, Data Warehousing, and Business Intelligence.
Familiarity with relevant technological solutions and industry best practices.
Good understanding of data engineering challenges and proven experience with data platform engineering (batch and streaming; ingestion, storage, processing, management, integration, consumption).
Awareness of various Data & Analytics tools and techniques (e.g., Python, data mining, predictive analytics, machine learning, data modelling, etc.).

Preferred qualifications, capabilities, and skills
Ability to work fast and quickly ramp up on new technologies and strategies.
Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals.
Appreciation of controls and compliance processes for applications and data.
In-depth understanding of data technologies and solutions.
Drive for process improvements and implementing process changes as necessary.
Knowledge of industry-wide Big Data technology trends and best practices.
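The stack above pairs Spark with Snowflake. As a rough, hypothetical illustration of that combination (not taken from the posting), the sketch below writes a Spark DataFrame to a Snowflake table using the Snowflake Spark connector; every connection value is a placeholder, credentials come from the environment, and the spark-snowflake connector package is assumed to be on the classpath.

```scala
import org.apache.spark.sql.SparkSession

object SparkToSnowflake {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("SparkToSnowflake").getOrCreate()

    // Connection options for the Snowflake Spark connector; all placeholders.
    val sfOptions = Map(
      "sfURL"       -> "example_account.snowflakecomputing.com",
      "sfUser"      -> sys.env.getOrElse("SF_USER", ""),
      "sfPassword"  -> sys.env.getOrElse("SF_PASSWORD", ""),
      "sfDatabase"  -> "ANALYTICS",
      "sfSchema"    -> "PUBLIC",
      "sfWarehouse" -> "COMPUTE_WH"
    )

    // A hypothetical curated dataset, pushed into a Snowflake table.
    val accounts = spark.read.parquet("s3://example-bucket/curated/accounts")
    accounts.write
      .format("net.snowflake.spark.snowflake")
      .options(sfOptions)
      .option("dbtable", "ACCOUNTS") // hypothetical target table
      .mode("overwrite")
      .save()
  }
}
```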

Posted 4 days ago

Apply

4.0 years

10 Lacs

Gurgaon

On-site

Expedia Group brands power global travel for everyone, everywhere. We design cutting-edge tech to make travel smoother and more memorable, and we create groundbreaking solutions for our partners. Our diverse, vibrant, and welcoming community is essential in driving our success.

Why Join Us?
To shape the future of travel, people must come first. Guided by our Values and Leadership Agreements, we foster an open culture where everyone belongs, differences are celebrated and know that when one of us wins, we all win. We provide a full benefits package, including exciting travel perks, generous time-off, parental leave, a flexible work model (with some pretty cool offices), and career development resources, all to fuel our employees' passion for travel and ensure a rewarding career journey. We’re building a more open world. Join us.

Introduction to team:
Are you fascinated by data and building robust data pipelines which process massive amounts of data at scale and speed to provide crucial insights to the end customer? This is exactly what we, the Supply and Partner Data Engineering (SPDE) group in Expedia, do. Our mission is 'transforming Expedia’s lodging data assets into Data Products that deliver intelligence and real-time insights for our customers'. We work on building data assets and products to support a variety of applications which are used by 1000+ market managers, analysts, and external hotel partners. We believe in being Different. We seek new ideas, different ways of thinking, diverse backgrounds and approaches, because averages can lie and sameness is dangerous.

In this role, you will:
Work with a team of backend and data engineers to design and code large-scale real-time data pipelines, microservices, and APIs on the AWS platform.
Be accountable for individual tasks and assignments as well as your team's overall productivity.
Define, develop, and maintain artifacts like technical design or user documentation, and look for continuous improvement in software and the development process within an agile development team.
Communicate and work effectively with geographically distributed multi-functional teams.

Experience and qualifications:
Bachelor's or master's degree in a related technical field, or equivalent related professional experience.
A minimum of 4+ years of meaningful work experience in distributed computing and server-side projects.
Comfortable programming in Scala or Java, with hands-on experience in OOAD, design patterns, and SQL.
Passionate about learning, especially in the areas of microservices, APIs, and system architecture.
A couple of years of experience crafting real-time streaming applications, preferably in Spark, Kafka, or other streaming platforms.
Experience using cloud services (e.g. AWS).
Experience working with Agile/Scrum methodologies.
Familiarity with the e-commerce or travel industry.

Accommodation requests
If you need assistance with any part of the application or recruiting process due to a disability, or other physical or mental health conditions, please reach out to our Recruiting Accommodations Team through the Accommodation Request. We are proud to be named as a Best Place to Work on Glassdoor in 2024 and be recognized for award-winning culture by organizations like Forbes, TIME, Disability:IN, and others.
Expedia Group's family of brands includes: Brand Expedia®, Hotels.com®, Expedia® Partner Solutions, Vrbo®, trivago®, Orbitz®, Travelocity®, Hotwire®, Wotif®, ebookers®, CheapTickets®, Expedia Group™ Media Solutions, Expedia Local Expert®, CarRentals.com™, and Expedia Cruises™. © 2024 Expedia, Inc. All rights reserved. Trademarks and logos are the property of their respective owners. CST: 2029030-50 Employment opportunities and job offers at Expedia Group will always come from Expedia Group’s Talent Acquisition and hiring teams. Never provide sensitive, personal information to someone unless you’re confident who the recipient is. Expedia Group does not extend job offers via email or any other messaging tools to individuals with whom we have not made prior contact. Our email domain is @expediagroup.com. The official website to find and apply for job openings at Expedia Group is careers.expediagroup.com/jobs. Expedia is committed to creating an inclusive work environment with a diverse workforce. All qualified applicants will receive consideration for employment without regard to race, religion, gender, sexual orientation, national origin, disability or age.

Posted 4 days ago

Apply

5.0 years

4 - 10 Lacs

Gurgaon

On-site

Expedia Group brands power global travel for everyone, everywhere. We design cutting-edge tech to make travel smoother and more memorable, and we create groundbreaking solutions for our partners. Our diverse, vibrant, and welcoming community is essential in driving our success.

Why Join Us?
To shape the future of travel, people must come first. Guided by our Values and Leadership Agreements, we foster an open culture where everyone belongs, differences are celebrated and know that when one of us wins, we all win. We provide a full benefits package, including exciting travel perks, generous time-off, parental leave, a flexible work model (with some pretty cool offices), and career development resources, all to fuel our employees' passion for travel and ensure a rewarding career journey. We’re building a more open world. Join us.

Introduction to team
Our Expedia Product & Technology division builds innovative products, services, and tools to deliver high-quality experiences for travellers, partners, and our employees. A unified, singular technology platform powered by data and machine learning provides secure, differentiated, and personalised experiences for the traveler and our partners that drive loyalty and customer satisfaction. The goal of the Media Solutions (MeSo) Tech team is to spearhead advertising products and services across many Expedia brands including BEX, Hotels.com, Portfolio Brands (COMET & Hotwire.com), Expedia Partner Sites (EPS), and Vrbo. We help our advertisers identify travellers on EG sites, target specific traveller criteria, and then deliver the most relevant products. As a Senior Software Development Engineer, you will propose, design and implement various initiatives. As a member of the team, you will work in alliance with global teams providing the technical expertise needed to overcome hard problems. We value rigor and innovative thinking in our development process and believe in the power of a motivated & agile development team.

What you’ll do:
Join a high-performing team and have a unique opportunity to make a highly visible impact.
Learn best practices and how to constantly raise the bar in terms of engineering excellence.
Identify inefficiencies in code or systems operation and offer suggestions for improvements.
Expand your skills in developing high-quality, distributed, and scalable software.
Share new skills and knowledge with the team to increase efficiency.
Write code that is clean, maintainable, and optimized, with good naming conventions.
Develop fast, scalable, and highly available services.
Participate in code reviews and pull requests.

Who you are:
5+ years of software development work experience using modern languages (e.g., Java/Kotlin).
Bachelor’s in computer science or related technical field, or equivalent related professional experience.
Experience in Java, Scala, Kotlin, AWS, Kafka, S3, Lambda, Docker, Datadog.
Problem solver with a good understanding of algorithms, data structures, and distributed applications.
Solid understanding of object-oriented programming concepts, data structures, algorithms, and test-driven development.
Solid understanding of load balancing, caching, and database partitioning to improve application scalability.
Demonstrated ability to develop and support large internet-scale software systems.
Experience with AWS services.
Knowledge of NoSQL databases and cloud computing concepts.
Sound understanding of client-side optimization best practices.
Ability to quickly pick up new technologies and languages with ease.
Working knowledge of Agile software development methodologies.
Strong verbal and written communication skills.
Passionate about quality of work.

Accommodation requests
If you need assistance with any part of the application or recruiting process due to a disability, or other physical or mental health conditions, please reach out to our Recruiting Accommodations Team through the Accommodation Request. We are proud to be named as a Best Place to Work on Glassdoor in 2024 and be recognized for award-winning culture by organizations like Forbes, TIME, Disability:IN, and others.

Expedia Group's family of brands includes: Brand Expedia®, Hotels.com®, Expedia® Partner Solutions, Vrbo®, trivago®, Orbitz®, Travelocity®, Hotwire®, Wotif®, ebookers®, CheapTickets®, Expedia Group™ Media Solutions, Expedia Local Expert®, CarRentals.com™, and Expedia Cruises™. © 2024 Expedia, Inc. All rights reserved. Trademarks and logos are the property of their respective owners. CST: 2029030-50

Employment opportunities and job offers at Expedia Group will always come from Expedia Group’s Talent Acquisition and hiring teams. Never provide sensitive, personal information to someone unless you’re confident who the recipient is. Expedia Group does not extend job offers via email or any other messaging tools to individuals with whom we have not made prior contact. Our email domain is @expediagroup.com. The official website to find and apply for job openings at Expedia Group is careers.expediagroup.com/jobs. Expedia is committed to creating an inclusive work environment with a diverse workforce. All qualified applicants will receive consideration for employment without regard to race, religion, gender, sexual orientation, national origin, disability or age.

Posted 4 days ago

Apply

0 years

0 Lacs

India

Remote

Role: NiFi Developer
Notice Period: Candidates serving notice or immediate joiners preferred
Client: Marriott
Payroll: Dminds
Work Mode: Remote
Interview Mode: Virtual

We’re looking for someone who has built, deployed, and maintained NiFi clusters.

Roles & Responsibilities:
·Implement solutions utilizing advanced AWS components (EMR, EC2, etc.) integrated with Big Data/Hadoop distribution frameworks: Zookeeper, YARN, Spark, Scala, NiFi, etc.
·Design and implement Spark jobs to be deployed and run on existing active clusters.
·Configure Postgres databases on EC2 instances, keep the applications they back up and running, and troubleshoot issues to meet the desired application state.
·Create and configure secure VPCs, subnets, and security groups across private and public networks.
·Create alarms, alerts, and notifications for Spark jobs that send job status to email and Slack group messages and log to CloudWatch.
·Build NiFi data pipelines to process large data sets, and configure lookups for data validation and integrity.
·Generate large sets of test data with data integrity using Java, for use in the development and QA phases.
·Improve the performance of, and optimize, existing Spark Scala applications running on an EMR cluster.
·Build Spark jobs to convert CSV data to custom HL7/FHIR objects using FHIR APIs.
·Deploy SNS, SQS, Lambda functions, IAM roles, custom policies, and EMR with Spark and Hadoop, with bootstrap scripts to set up the additional software needed for the job, in QA and production environments using Terraform scripts.
·Build Spark jobs to perform Change Data Capture (CDC) on Postgres tables and update target tables using JDBC properties (see the sketch after this listing).
·Integrate a Kafka publisher into Spark jobs to capture errors from the Spark application and push them into a Postgres table.
·Work extensively on building NiFi data pipelines in a Docker container environment during the development phase.
·Work with the DevOps team to clusterize the NiFi pipeline on EC2 nodes, integrated with Spark, Kafka, and Postgres running on other instances using SSL handshakes, in QA and production environments.
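For the CDC bullet above, here is a minimal hypothetical sketch (not from the posting) of a Spark job that captures changed rows from a Postgres table over JDBC and applies them to a target table; the connection details, table and column names, and the hard-coded watermark are placeholders (a real job would persist the watermark between runs), and the Postgres JDBC driver is assumed to be on the classpath.

```scala
import org.apache.spark.sql.SparkSession

object PostgresCdcJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("PostgresCdcJob").getOrCreate()

    val jdbcUrl = "jdbc:postgresql://db-host:5432/appdb" // hypothetical host/db
    val props = new java.util.Properties()
    props.setProperty("user", sys.env.getOrElse("DB_USER", "app"))
    props.setProperty("password", sys.env.getOrElse("DB_PASSWORD", ""))
    props.setProperty("driver", "org.postgresql.Driver")

    // Pull only rows changed since the last run; the watermark is illustrative
    // and would normally be read from, and written back to, durable storage.
    val lastWatermark = "2024-06-01 00:00:00"
    val changed = spark.read.jdbc(
      jdbcUrl,
      s"(SELECT * FROM orders WHERE updated_at > '$lastWatermark') AS cdc", // hypothetical table/column
      props)

    // Apply the captured changes to the target table via JDBC.
    changed.write.mode("append").jdbc(jdbcUrl, "orders_replica", props)     // hypothetical target
  }
}
```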

Posted 4 days ago

Apply

0 years

0 Lacs

Ghaziabad, Uttar Pradesh, India

Remote

Job Description
Position: Data Engineer Intern
Location: Remote
Duration: 2-6 months
Company: Collegepur
Type: Unpaid Internship

About the Internship: We are seeking a skilled Data Engineer to join our team, with a focus on cloud data storage, ETL processes, and database/data warehouse management. If you are passionate about building robust data solutions and enabling data-driven decision-making, we want to hear from you!

Key Responsibilities:
1. Design, develop, and maintain scalable data pipelines to process large datasets from multiple sources, both structured and unstructured.
2. Implement and optimize ETL (Extract, Transform, Load) processes to integrate, clean, and transform data for analytical use.
3. Manage and enhance cloud-based data storage solutions, including data lakes and data warehouses, using platforms such as AWS, Azure, or Google Cloud.
4. Ensure data security, privacy, and compliance with relevant standards and regulations.
5. Collaborate with data scientists, analysts, and software engineers to support data-driven projects and business processes.
6. Monitor and troubleshoot data pipelines to ensure efficient, real-time, and batch data processing.
7. Maintain comprehensive documentation and data mapping across multiple systems.

Requirements:
1. Proven experience with cloud platforms (AWS, Azure, or Google Cloud).
2. Strong knowledge of database systems, data warehousing, and data modeling.
3. Proficiency in programming languages such as Python, Java, or Scala.
4. Experience with ETL tools and frameworks (e.g., Airflow, Informatica, Talend).
5. Familiarity with data security, compliance, and governance practices.
6. Excellent analytical, problem-solving, and communication skills.
7. Bachelor’s degree in Computer Science, Information Technology, or related field.

Posted 4 days ago

Apply

15.0 years

0 Lacs

Calcutta

On-site

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Apache Spark
Good to have skills: Java, Scala, PySpark
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the data architecture. You will be involved in analyzing requirements, proposing solutions, and ensuring that the data platform aligns with organizational goals and standards. Your role will require you to stay updated with industry trends and best practices to contribute effectively to the team.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Engage in continuous learning to stay abreast of emerging technologies and methodologies.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Apache Spark.
- Good-to-Have Skills: Experience with Java, Scala, PySpark.
- Strong understanding of data processing frameworks and distributed computing.
- Experience with data integration tools and techniques.
- Familiarity with cloud platforms and services related to data engineering.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Apache Spark.
- This position is based at our Kolkata office.
- A 15 years full time education is required.

Posted 4 days ago

Apply

4.0 years

0 Lacs

Kochi, Kerala, India

On-site

Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role And Responsibilities
As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark Framework with Python or Scala on Hadoop and the Azure Cloud Data Platform.
Build data pipelines to ingest, process, and transform data from files, streams, and databases (see the sketch after this listing).
Process the data with Spark, Python, PySpark, and Hive, HBase, or other NoSQL databases on the Azure Cloud Data Platform or HDFS.
Develop efficient software code for multiple use cases built on the platform, leveraging the Spark Framework with Python or Scala and Big Data technologies.
Develop streaming pipelines.
Work with Hadoop/Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka.

Preferred Education
Master's Degree

Required Technical And Professional Expertise
Minimum 4+ years of experience in Big Data technologies, with extensive data engineering experience in Spark with Python or Scala.
Minimum 3 years of experience on Cloud Data Platforms on Azure.
Experience in Databricks / Azure HDInsight / Azure Data Factory, Synapse, SQL Server DB.
Good to excellent SQL skills.

Preferred Technical And Professional Experience
Certification in Azure and Databricks, or Cloudera Spark Certified developer.
Knowledge or experience of Snowflake will be an added advantage.
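As an illustrative sketch only (not from the listing), here is a minimal Spark-on-Scala job for the file-to-table ingestion the role describes, reading a landed CSV and persisting it to a Hive-managed table; the paths, table, and column names are hypothetical, and Hive support is assumed to be configured on the cluster.

```scala
import org.apache.spark.sql.SparkSession

object FileToHive {
  def main(args: Array[String]): Unit = {
    // enableHiveSupport lets Spark read and write Hive-managed tables.
    val spark = SparkSession.builder()
      .appName("FileToHive")
      .enableHiveSupport()
      .getOrCreate()

    // Ingest a landed CSV, apply a light transform, and persist to Hive.
    val stays = spark.read
      .option("header", "true")
      .csv("/landing/stays/2024-06-01.csv") // hypothetical path
      .filter("nights > 0")                 // hypothetical column

    stays.write.mode("overwrite").saveAsTable("curated.stays") // hypothetical table

    spark.stop()
  }
}
```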

Posted 4 days ago

Apply

10.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

EY GDS – Data and Analytics (D&A) – Cloud Architect - Manager

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing, Healthcare, Retail, Auto, Supply Chain, and Finance.

The opportunity
We're looking for Senior Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and a proven track record in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
- Drive Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations across BCM, WAM, and Insurance. Activities include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data (10-15 years of experience expected).
- Work with clients to convert business problems and challenges into technical solutions, considering security, performance, scalability, and related concerns.
- Understand current and future state enterprise architecture.
- Contribute to various technical streams during project implementation.
- Provide product- and design-level technical best practices.
- Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop, and deliver technology solutions.
- Define and develop client-specific best practices around data management within a Hadoop or cloud environment.
- Recommend design alternatives for data ingestion, processing, and provisioning layers.
- Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark.
- Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies.

Skills And Attributes For Success
- Experience architecting highly scalable solutions on Azure, AWS, and GCP.
- Strong understanding of and familiarity with Azure/AWS/GCP and Big Data ecosystem components.
- Strong understanding of underlying Azure/AWS/GCP and Hadoop architectural concepts and distributed computing paradigms.
- Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming.
- Hands-on experience with major components such as cloud ETL tools, Spark, and Databricks.
- Experience working with NoSQL in at least one of these data stores: HBase, Cassandra, MongoDB.
- Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions.
- Solid understanding of ETL methodologies in a multi-tiered stack, integrating with Big Data systems like Cloudera and Databricks.
- Good knowledge of Apache Kafka and Apache Flume.
- Experience in enterprise-grade solution implementations.
- Experience in performance benchmarking enterprise applications.
- Experience in data security (in transit and at rest).
- Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have
- A flexible and proactive, self-motivated working style with strong personal ownership of problem resolution.
- Excellent communication skills (written and verbal, formal and informal).
- The ability to multi-task under pressure and work independently with minimal supervision.
- A team-player mindset and enjoyment of working in a cooperative and collaborative team environment.
- Adaptability to new technologies and standards.
- Participation in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.
- Responsibility for evaluating technical risks and mapping out mitigation strategies.
- Working knowledge of at least one cloud platform: AWS, Azure, or GCP.
- Excellent business communication, consulting, and quality process skills.
- Excellence in leading solution architecture, design, build, and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domains.
- Minimum 7 years of hands-on experience in one or more of the above areas.
- Minimum 10 years of industry experience.

Ideally, you'll also have
- Strong project management skills.
- Client management skills.
- Solutioning skills.

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers
At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching, and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
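For context, a minimal Scala sketch of a batch ingestion step of the kind this role describes, loading staged files into a Hive table in the provisioning layer; the paths and table name are hypothetical.

```scala
import org.apache.spark.sql.SparkSession

object BatchHiveIngest {
  def main(args: Array[String]): Unit = {
    // enableHiveSupport lets Spark read and write tables in the Hive metastore.
    val spark = SparkSession.builder()
      .appName("batch-hive-ingest")
      .enableHiveSupport()
      .getOrCreate()

    // Hypothetical staging files landed by an upstream Sqoop import.
    val staged = spark.read
      .option("header", "true")
      .csv("/landing/customers")

    // Persist to a managed Hive table in the provisioning layer.
    staged.write
      .mode("overwrite")
      .saveAsTable("curated.customers") // hypothetical database.table

    spark.stop()
  }
}
```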

Posted 4 days ago

Apply

10.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

EY GDS – Data and Analytics (D&A) – Cloud Architect - Manager

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing, Healthcare, Retail, Auto, Supply Chain, and Finance.

The opportunity
We're looking for Senior Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and a proven track record in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
- Drive Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations across BCM, WAM, and Insurance. Activities include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data (10-15 years of experience expected).
- Work with clients to convert business problems and challenges into technical solutions, considering security, performance, scalability, and related concerns.
- Understand current and future state enterprise architecture.
- Contribute to various technical streams during project implementation.
- Provide product- and design-level technical best practices.
- Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop, and deliver technology solutions.
- Define and develop client-specific best practices around data management within a Hadoop or cloud environment.
- Recommend design alternatives for data ingestion, processing, and provisioning layers.
- Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark.
- Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies.

Skills And Attributes For Success
- Experience architecting highly scalable solutions on Azure, AWS, and GCP.
- Strong understanding of and familiarity with Azure/AWS/GCP and Big Data ecosystem components.
- Strong understanding of underlying Azure/AWS/GCP and Hadoop architectural concepts and distributed computing paradigms.
- Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming.
- Hands-on experience with major components such as cloud ETL tools, Spark, and Databricks.
- Experience working with NoSQL in at least one of these data stores: HBase, Cassandra, MongoDB.
- Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions.
- Solid understanding of ETL methodologies in a multi-tiered stack, integrating with Big Data systems like Cloudera and Databricks.
- Good knowledge of Apache Kafka and Apache Flume.
- Experience in enterprise-grade solution implementations.
- Experience in performance benchmarking enterprise applications.
- Experience in data security (in transit and at rest).
- Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have
- A flexible and proactive, self-motivated working style with strong personal ownership of problem resolution.
- Excellent communication skills (written and verbal, formal and informal).
- The ability to multi-task under pressure and work independently with minimal supervision.
- A team-player mindset and enjoyment of working in a cooperative and collaborative team environment.
- Adaptability to new technologies and standards.
- Participation in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.
- Responsibility for evaluating technical risks and mapping out mitigation strategies.
- Working knowledge of at least one cloud platform: AWS, Azure, or GCP.
- Excellent business communication, consulting, and quality process skills.
- Excellence in leading solution architecture, design, build, and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domains.
- Minimum 7 years of hands-on experience in one or more of the above areas.
- Minimum 10 years of industry experience.

Ideally, you'll also have
- Strong project management skills.
- Client management skills.
- Solutioning skills.

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers
At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching, and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
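Since this posting also calls for real-time ingestion with Kafka and Spark Streaming, here is a minimal Scala sketch of a windowed streaming aggregation. The brokers, topic, and checkpoint path are hypothetical, and the console sink is a stand-in for a real one.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object EventsPerMinute {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("events-per-minute").getOrCreate()

    // Read a hypothetical Kafka topic; Spark tracks offsets per partition.
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092") // hypothetical brokers
      .option("subscribe", "trades")                    // hypothetical topic
      .load()
      .select(col("timestamp"), col("value").cast("string").as("payload"))

    // Tumbling one-minute counts; the watermark bounds streaming state.
    val counts = events
      .withWatermark("timestamp", "5 minutes")
      .groupBy(window(col("timestamp"), "1 minute"))
      .agg(count(lit(1)).as("events"))

    val query = counts.writeStream
      .outputMode("update")
      .format("console") // stand-in sink for illustration
      .option("checkpointLocation", "/chk/events-per-minute")
      .start()

    query.awaitTermination()
  }
}
```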

Posted 4 days ago

Apply

0 years

0 Lacs

India

On-site

Job Description:
We are seeking a highly skilled Azure Data Engineer with 4+ years of experience to design, develop, and optimize data pipelines and data integration solutions in a cloud-based environment. The ideal candidate will have strong technical expertise in Azure, data engineering tools, and advanced ETL design, along with excellent communication and problem-solving skills.

Key Responsibilities:
- Design and develop advanced ETL pipelines for data ingestion and egress for batch data.
- Build scalable data solutions using Azure Data Factory (ADF), Databricks, Spark (PySpark & Scala Spark), and other Azure services.
- Troubleshoot data jobs, identify issues, and implement effective root-cause solutions.
- Collaborate with stakeholders to gather requirements and propose efficient solution designs.
- Ensure data quality, reliability, and adherence to best practices in data engineering.
- Maintain detailed documentation of problem definitions, solutions, and architecture.
- Work independently with minimal supervision while ensuring project deadlines are met.

Required Skills & Qualifications:
- Microsoft Certified: Azure Fundamentals (preferred).
- Microsoft Certified: Azure Data Engineer Associate (preferred).
- Proficiency in SQL, Python, and Scala.
- Strong knowledge of Azure Cloud services, ADF, and Databricks.
- Hands-on experience with Apache Spark (PySpark & Scala Spark).
- Expertise in designing and implementing complex ETL pipelines for batch data.
- Strong troubleshooting skills with the ability to perform root cause analysis.

Soft Skills:
- Excellent verbal and written communication skills.
- Strong documentation skills for drafting problem definitions and solutions.
- Ability to effectively gather requirements and propose solution designs.
- Self-driven with the ability to work independently with minimal supervision.
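For context, a minimal Scala Spark sketch of a batch load into Delta Lake of the kind an ADF-plus-Databricks pipeline might run. It assumes the delta-spark package is on the classpath (on Databricks the session is preconfigured), and the mount paths are hypothetical.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object SalesToDelta {
  def main(args: Array[String]): Unit = {
    // These two configs enable open-source Delta Lake outside Databricks.
    val spark = SparkSession.builder()
      .appName("sales-to-delta")
      .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
      .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
      .getOrCreate()

    // Hypothetical batch extract landed by an ADF copy activity.
    val sales = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("/mnt/landing/sales")

    sales
      .withColumn("ingest_date", current_date())
      .write
      .format("delta")
      .mode("append")
      .partitionBy("ingest_date")
      .save("/mnt/curated/sales") // hypothetical Delta location

    spark.stop()
  }
}
```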

Posted 4 days ago

Apply

15.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At eBay, we're more than a global ecommerce leader — we’re changing the way the world shops and sells. Our platform empowers millions of buyers and sellers in more than 190 markets around the world. We’re committed to pushing boundaries and leaving our mark as we reinvent the future of ecommerce for enthusiasts. Our customers are our compass, authenticity thrives, bold ideas are welcome, and everyone can bring their unique selves to work — every day. We're in this together, sustaining the future of our customers, our company, and our planet. Join a team of passionate thinkers, innovators, and dreamers — and help us connect people and build communities to create economic opportunity for all.

Ready to shape the future of global commerce with bold ideas and groundbreaking technology? At eBay, we’re more than just a marketplace — we’re a vibrant, purpose-driven community built on passion, courage, and creativity. Every day, we empower millions of people to buy, sell, connect, and thrive. If you're looking to make an impact in a company that values innovation and inclusivity, eBay is a place you'll be proud to call home.

We’re searching for a visionary AI leader to spearhead a team of world-class applied researchers and engineers. Your mission? To design and deliver transformative machine learning and generative AI solutions at eBay scale — from cutting-edge, personalized product recommendations for millions of users, to unlocking deep semantic understanding of over two billion listings, to building immersive and intelligent shopping experiences that have never been seen before. In this role, you’ll collaborate with top minds across our global Recommendations and Buyer Experience AI organization — including product managers, designers, and analytics leaders — to reimagine what’s possible in personalized e-commerce. If you’re passionate about leading with purpose and building AI that matters, we’d love to meet you.

This Is An Opportunity To
- Lead and manage a large team of applied researchers and engineers with deep expertise in natural language processing, large language models / Generative AI, recommender systems, and ML production engineering
- Drive applied research strategy for the eBay buyer experience, and influence how people will interact with eCommerce in the future
- Work with unique and large data sets of unstructured multimodal data representing eBay's vast and varied inventory, and millions of users
- Develop and deploy state-of-the-art AI models
- Deploy big data technology and large-scale data pipelines
- Drive marketplace GMB as well as advertising revenue via organic and sponsored recommendations
- Create a culture of applied research, innovation, experimentation, and engineering excellence

Qualifications
- Advanced degree (MS or PhD) in Computer Science or a related field, with 15 years of experience in Machine Learning, AI, or large-scale engineering environments.
- Proven track record of building and leading high-impact engineering and research teams, ideally within ML/AI-focused organizations.
- Experience with a variety of data science and ML techniques, exposure to Natural Language Processing (NLP) and industrial-grade recommender systems, with a passion for solving real-world problems at scale.
- Solid background in production-level engineering practices, including Agile methodologies and object-oriented programming (e.g., Scala, Java), in high-throughput environments.
- Proven track record of business impact with production-grade cloud-native solutions, scalable data pipelines, and large-scale distributed databases.
- A history of technical thought leadership through academic publications, patents, or contributions to open-source projects or technical blogs is highly desirable.

Links To Some Of Our Previous Work
- Tech Blog 2025 (Multimodal GenAI)
- Tech Blog 2025 (GenAI Agentic Platform)
- RecSys 2024 Workshop paper
- Google Cloud Blog 2024
- eBay Tech Blog 2023
- eBay Tech Blog 2022
- RecSys 2021 paper

Please see the Talent Privacy Notice for information regarding how eBay handles your personal data collected when you use the eBay Careers website or apply for a job with eBay.

eBay is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, sex, sexual orientation, gender identity, veteran status, and disability, or other legally protected status. If you have a need that requires accommodation, please contact us at talent@ebay.com. We will make every effort to respond to your request for accommodation as soon as possible. View our accessibility statement to learn more about eBay's commitment to ensuring digital accessibility for people with disabilities.

Posted 4 days ago

Apply

5.0 years

0 Lacs

India

On-site

Key Skills Required:
- 5+ years of experience in Software Engineering and MLOps
- Strong development experience on AWS, specifically AWS SageMaker (mandatory)
- Experience with MLflow, GitLab, and CDK (mandatory)
- Exposure to AWS DataZone (preferred)
- Proficiency in at least one general-purpose programming language: Python, R, Scala, or Spark
- Hands-on experience with production-grade development, integration, and support
- Strong adherence to scalable, secure, and reliable application development best practices
- A strong analytical mindset and willingness to contribute to MLOps research initiatives
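For context, a minimal Scala sketch of experiment tracking with MLflow's Java client (callable from Scala, via the org.mlflow:mlflow-client artifact). The tracking URI, parameter, and metric values are hypothetical placeholders.

```scala
import org.mlflow.tracking.MlflowClient

object TrackExperiment {
  def main(args: Array[String]): Unit = {
    // Points at a hypothetical tracking server; setting MLFLOW_TRACKING_URI
    // in the environment and using the no-arg constructor also works.
    val client = new MlflowClient("http://localhost:5000")

    val run = client.createRun() // creates a run in the default experiment
    val runId = run.getRunId

    // Log one parameter and one metric, then mark the run finished.
    client.logParam(runId, "maxDepth", "8")
    client.logMetric(runId, "rmse", 0.42)
    client.setTerminated(runId)
  }
}
```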

Posted 4 days ago

Apply

0.0 years

0 - 0 Lacs

Thiruvananthapuram, Kerala

On-site

Data Science and AI Developer

**Job Description:** We are seeking a highly skilled and motivated Data Science and AI Developer to join our dynamic team. As a Data Science and AI Developer, you will be responsible for leveraging cutting-edge technologies to develop innovative solutions that drive business insights and enhance decision-making processes.

**Key Responsibilities:**
1. Develop and deploy machine learning models for predictive analytics, classification, clustering, and anomaly detection.
2. Design and implement algorithms for data mining, pattern recognition, and natural language processing.
3. Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
4. Utilize advanced statistical techniques to analyze complex datasets and extract actionable insights.
5. Implement scalable data pipelines for data ingestion, preprocessing, feature engineering, and model training.
6. Stay updated with the latest advancements in data science, machine learning, and artificial intelligence research.
7. Optimize model performance and scalability through experimentation and iteration.
8. Communicate findings and results to stakeholders through reports, presentations, and visualizations.
9. Ensure compliance with data privacy regulations and best practices in data handling and security.
10. Mentor junior team members and provide technical guidance and support.

**Requirements:**
1. Bachelor’s or Master’s degree in Computer Science, Data Science, Statistics, or a related field.
2. Proven experience in developing and deploying machine learning models in production environments.
3. Proficiency in programming languages such as Python, R, or Scala, with strong software engineering skills.
4. Hands-on experience with machine learning libraries/frameworks such as TensorFlow, PyTorch, Scikit-learn, or Spark MLlib.
5. Solid understanding of data structures, algorithms, and computer science fundamentals.
6. Excellent problem-solving skills and the ability to think creatively to overcome challenges.
7. Strong communication and interpersonal skills, with the ability to work effectively in a collaborative team environment.
8. Certification in Data Science, Machine Learning, or Artificial Intelligence (e.g., Coursera, edX, Udacity, etc.).
9. Experience with cloud platforms such as AWS, Azure, or Google Cloud is a plus.
10. Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka) is an advantage.

Data Manipulation and Analysis: NumPy, Pandas
Data Visualization: Matplotlib, Seaborn, Power BI
Machine Learning Libraries: Scikit-learn, TensorFlow, Keras
Statistical Analysis: SciPy
Web Scraping: Scrapy
IDE: PyCharm, Google Colab

HTML/CSS/JavaScript/React JS: Proficiency in these core web development technologies is a must.
Python Django Expertise: In-depth knowledge of e-commerce functionalities or deep Python Django knowledge.
Theming: Proven experience in designing and implementing custom themes for Python websites.
Responsive Design: Strong understanding of responsive design principles and the ability to create visually appealing and user-friendly interfaces for various devices.
Problem Solving: Excellent problem-solving skills with the ability to troubleshoot and resolve issues independently.
Collaboration: Ability to work closely with cross-functional teams, including marketing and design, to bring creative visions to life.
Candidates must also know how to connect the front end to data science services, and vice versa.

**Benefits:**
- Competitive salary package
- Flexible working hours
- Opportunities for career growth and professional development
- Dynamic and innovative work environment

Job Type: Full-time
Pay: ₹8,000.00 - ₹12,000.00 per month
Ability to commute/relocate: Thiruvananthapuram, Kerala: Reliably commute or planning to relocate before starting work (Preferred)
Work Location: In person
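For context, a minimal Scala sketch of a Spark MLlib training pipeline of the kind listed in the requirements; the feature columns, label column, and paths are hypothetical.

```scala
import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.feature.VectorAssembler
import org.apache.spark.sql.SparkSession

object ChurnModel {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("churn-model").getOrCreate()

    // Hypothetical training data with numeric features and a 0/1 label.
    val df = spark.read.parquet("/data/features/churn")

    // Assemble raw columns into the single vector column MLlib expects.
    val assembler = new VectorAssembler()
      .setInputCols(Array("tenure", "monthly_spend", "support_calls"))
      .setOutputCol("features")

    val lr = new LogisticRegression()
      .setLabelCol("churned")
      .setFeaturesCol("features")

    // Fit the two-stage pipeline and persist the model for serving.
    val model = new Pipeline().setStages(Array(assembler, lr)).fit(df)
    model.write.overwrite().save("/models/churn-lr") // hypothetical model path

    spark.stop()
  }
}
```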

Posted 4 days ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Minimum qualifications:
- Bachelor’s degree or equivalent practical experience.
- 5 years of experience with software development in one or more programming languages.
- 3 years of experience testing, maintaining, or launching software products, and 1 year of experience with software design and architecture.

Preferred qualifications:
- Master's degree or PhD in Computer Science or a related technical field.
- 5 years of experience with data structures/algorithms.
- Experience working on Linux, APIs, and services.
- Knowledge of programming languages such as Java, Scala, and C++.
- Ability to define software architecture, components, modules, interfaces, and data for a system to meet requirements, validating for correctness, functionality, and reliability.

About The Job
Google's software engineers develop the next-generation technologies that change how billions of users connect, explore, and interact with information and one another. Our products need to handle information at massive scale, and extend well beyond web search. We're looking for engineers who bring fresh ideas from all areas, including information retrieval, distributed computing, large-scale system design, networking and data storage, security, artificial intelligence, natural language processing, UI design and mobile; the list goes on and is growing every day. As a software engineer, you will work on a specific project critical to Google’s needs with opportunities to switch teams and projects as you and our fast-paced business grow and evolve. We need our engineers to be versatile, display leadership qualities and be enthusiastic to take on new problems across the full-stack as we continue to push technology forward.

The Google Home team focuses on hardware, software and services offerings for the home, ranging from thermostats to smart displays. The Home team researches, designs, and develops new technologies and hardware to make users’ homes more helpful. Our mission is the helpful home: to create a home that cares for the people inside it and the world around it.

Responsibilities
- Contribute to existing documentation or educational content and adapt content based on product/program updates and user feedback.
- Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on hardware, network, or service operations and quality.
- Develop scalable, reliable, and high-performance solutions that cater to the growing needs of the customers.
- Manage all technical aspects of development, unit testing, integration, and deployments.
- Collaborate with peers and work with teammates to identify opportunities to make the product/features scalable.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 4 days ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Teamwork makes the stream work.

Roku is changing how the world watches TV
Roku is the #1 TV streaming platform in the U.S., Canada, and Mexico, and we've set our sights on powering every television in the world. Roku pioneered streaming to the TV. Our mission is to be the TV streaming platform that connects the entire TV ecosystem. We connect consumers to the content they love, enable content publishers to build and monetize large audiences, and provide advertisers unique capabilities to engage consumers. From your first day at Roku, you'll make a valuable - and valued - contribution. We're a fast-growing public company where no one is a bystander. We offer you the opportunity to delight millions of TV streamers around the world while gaining meaningful experience across a variety of disciplines.

About the Team:
The Data Foundations team plays a critical role in supporting Roku Ads business intelligence and analytics. The team is responsible for developing and managing foundational datasets designed to serve the operational and analytical needs of the broader organization. The team's mission is carried out through three focus areas: acting as the interface between data producers and consumers, simplifying data architecture, and creating tools in a standardized way.

About the Role:
We are seeking a talented and experienced Senior Software Engineer with a strong background in big data technologies, including Apache Spark and Apache Airflow. This hybrid role bridges software and data engineering, requiring expertise in designing, building, and maintaining scalable systems for both application development and data processing. You will collaborate with cross-functional teams to design and manage robust, production-grade, large-scale data systems. The ideal candidate is a proactive self-starter with a deep understanding of high-scale data services and a commitment to excellence.

What you’ll be doing
- Software Development: Write clean, maintainable, and efficient code, ensuring adherence to best practices through code reviews.
- Big Data Engineering: Design, develop, and maintain data pipelines and ETL workflows using Apache Spark and Apache Airflow. Optimize data storage, retrieval, and processing systems to ensure reliability, scalability, and performance. Develop and fine-tune complex queries and data processing jobs for large-scale datasets. Monitor, troubleshoot, and improve data systems for minimal downtime and maximum efficiency.
- Collaboration & Mentorship: Partner with data scientists, software engineers, and other teams to deliver integrated, high-quality solutions. Provide technical guidance and mentorship to junior engineers, promoting best practices in data engineering.

We’re excited if you have
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
- 5+ years of experience in software and/or data engineering with expertise in big data technologies such as Apache Spark, Apache Airflow, and Trino.
- Strong understanding of SOLID principles and distributed systems architecture.
- Proven experience in distributed data processing, data warehousing, and real-time data pipelines.
- Advanced SQL skills, with expertise in query optimization for large datasets.
- Exceptional problem-solving abilities and the capacity to work independently or collaboratively.
- Excellent verbal and written communication skills.
- Experience with cloud platforms such as AWS, GCP, or Azure, and containerization tools like Docker and Kubernetes.
- (Preferred) Familiarity with additional big data technologies, including Hadoop, Kafka, and Presto.
- (Preferred) Strong programming skills in Python, Java, or Scala.
- (Preferred) Knowledge of CI/CD pipelines, DevOps practices, and infrastructure-as-code tools (e.g., Terraform).
- (Preferred) Expertise in data modeling, schema design, and data visualization tools.
- (Preferred) AI literacy and curiosity: you have tried Gen AI in or outside of work, or are curious about Gen AI and have explored it.

Benefits
Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive benefits include global access to mental health and financial wellness support and resources. Local benefits include statutory and voluntary benefits which may include healthcare (medical, dental, and vision), life, accident, disability, commuter, and retirement options (401(k)/pension). Our employees can take time off work for vacation and other personal reasons to balance their evolving work and life needs. It's important to note that not every benefit is available in all locations or for every role. For details specific to your location, please consult with your recruiter.

The Roku Culture
Roku is a great place for people who want to work in a fast-paced environment where everyone is focused on the company's success rather than their own. We try to surround ourselves with people who are great at their jobs, who are easy to work with, and who keep their egos in check. We appreciate a sense of humor. We believe a fewer number of very talented folks can do more for less cost than a larger number of less talented teams. We're independent thinkers with big ideas who act boldly, move fast and accomplish extraordinary things through collaboration and trust. In short, at Roku you'll be part of a company that's changing how the world watches TV. We have a unique culture that we are proud of. We think of ourselves primarily as problem-solvers, which itself is a two-part idea. We come up with the solution, but the solution isn't real until it is built and delivered to the customer. That penchant for action gives us a pragmatic approach to innovation, one that has served us well since 2002. To learn more about Roku, our global footprint, and how we've grown, visit https://www.weareroku.com/factsheet. By providing your information, you acknowledge that you have read our Applicant Privacy Notice and authorize Roku to process your data subject to those terms.
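For context, a minimal Scala sketch of the kind of large-scale aggregation job this role describes, using a broadcast join so the small dimension table is not shuffled; the table paths and column names are hypothetical.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object DailyAdRollup {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("daily-ad-rollup").getOrCreate()

    // Hypothetical fact and dimension tables.
    val impressions = spark.read.parquet("/data/facts/impressions")
    val campaigns   = spark.read.parquet("/data/dims/campaigns")

    // Broadcast the small dimension table to avoid a shuffle join,
    // then aggregate per campaign and day.
    val rollup = impressions
      .join(broadcast(campaigns), Seq("campaign_id"))
      .groupBy(col("campaign_id"), to_date(col("event_ts")).as("event_date"))
      .agg(count(lit(1)).as("impressions"), sum("cost").as("spend"))

    rollup.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("/data/marts/daily_ad_rollup")

    spark.stop()
  }
}
```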

Posted 4 days ago

Apply

2.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Line of Service: Advisory
Industry/Sector: FS X-Sector
Specialism: Risk
Management Level: Associate

Job Description & Summary: In-depth knowledge of application development processes and at least one programming and one scripting language (e.g., Java, Scala, C#, JavaScript, Angular, ReactJs, Ruby, Perl, Python, Shell). Knowledge of OS security (Windows, Unix/Linux systems, Mac OS, VMware), network security, and cloud security.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description & Summary: We are seeking a professional to join our Cybersecurity and Privacy services team, where you will have the opportunity to help clients implement effective cybersecurity programs that protect against threats.

Responsibilities
- L1: Minimum 2 years of relevant experience in SOC/Incident Management/Incident Response/Threat Detection Engineering/Vulnerability Management/SOC Platform Management/Automation/Asset Integration/Threat Intel Management/Threat Hunting.
- L2: Minimum 4 years of relevant experience in the same areas.
- Round-the-clock threat monitoring and detection.
- Analysis of any suspicious, malicious, and abnormal behavior.
- Alert triage, initial assessment, incident validation, and assessment of severity and urgency.
- Prioritization of security alerts and creation of incidents as per SOPs.
- Reporting and escalation to stakeholders.
- Post-incident analysis.
- Consistent incident triage and recommendations using playbooks.
- Development and maintenance of incident management and incident response policies and procedures.
- Preservation of security alert and security incident artefacts for forensic purposes.
- Adherence to Service Level Agreements (SLAs) and KPIs.
- Reduction in Mean Time to Detect and Mean Time to Respond (MTTD and MTTR).

Mandatory Skill Sets: Certified SOC Analyst (EC-Council), Computer Hacking Forensic Investigator (EC-Council), Certified Ethical Hacker (EC-Council), CompTIA Security+, CompTIA CySA+ (Cybersecurity Analyst), GIAC Certified Incident Handler (GCIH), or equivalent.
Product Certifications (Preferred): Product certifications on SOC security tools such as SIEM/Vulnerability Management/DAM/UBA/SOAR/NBA etc.

Preferred Skill Sets: SOC - Splunk
Years of Experience Required: 2-5 years
Education Qualification: B.Tech/MCA/MBA with IT background, or a Bachelor’s degree in Information Technology, Cybersecurity, or Computer Science
Education (if blank, degree and/or field of study not specified): Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: SOC Operations
Optional Skills: SoCs
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date

Posted 4 days ago

Apply