
6030 Scala Jobs - Page 8

Set up a job alert
JobPe aggregates listings for easy access; you apply directly on the original job portal.

15.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At eBay, we're more than a global ecommerce leader — we're changing the way the world shops and sells. Our platform empowers millions of buyers and sellers in more than 190 markets around the world. We're committed to pushing boundaries and leaving our mark as we reinvent the future of ecommerce for enthusiasts. Our customers are our compass, authenticity thrives, bold ideas are welcome, and everyone can bring their unique selves to work — every day. We're in this together, sustaining the future of our customers, our company, and our planet. Join a team of passionate thinkers, innovators, and dreamers — and help us connect people and build communities to create economic opportunity for all.

Ready to shape the future of global commerce with bold ideas and groundbreaking technology? At eBay, we're more than just a marketplace — we're a vibrant, purpose-driven community built on passion, courage, and creativity. Every day, we empower millions of people to buy, sell, connect, and thrive. If you're looking to make an impact in a company that values innovation and inclusivity, eBay is a place you'll be proud to call home.

We're searching for a visionary AI leader to spearhead a team of world-class applied researchers and engineers. Your mission? To design and deliver transformative machine learning and generative AI solutions at eBay scale — from cutting-edge, personalized product recommendations for millions of users, to unlocking deep semantic understanding of over two billion listings, to building immersive and intelligent shopping experiences that have never been seen before. In this role, you'll collaborate with top minds across our global Recommendations and Buyer Experience AI organization — including product managers, designers, and analytics leaders — to reimagine what's possible in personalized e-commerce. If you're passionate about leading with purpose and building AI that matters, we'd love to meet you.
This is an opportunity to:
- Lead and manage a large team of applied researchers and engineers with deep expertise in natural language processing, large language models / generative AI, recommender systems, and ML production engineering
- Drive applied research strategy for the eBay buyer experience, and influence how people will interact with e-commerce in the future
- Work with unique and large datasets of unstructured multimodal data representing eBay's vast and varied inventory, and millions of users
- Develop and deploy state-of-the-art AI models
- Deploy big data technology and large-scale data pipelines
- Drive marketplace GMB as well as advertising revenue via organic and sponsored recommendations
- Create a culture of applied research, innovation, experimentation, and engineering excellence

Qualifications:
- Advanced degree (MS or PhD) in Computer Science or a related field, with 15 years of experience in Machine Learning, AI, or large-scale engineering environments
- Proven track record of building and leading high-impact engineering and research teams, ideally within ML/AI-focused organizations
- Experience with a variety of data science and ML techniques, exposure to Natural Language Processing (NLP) and industrial-grade recommender systems, with a passion for solving real-world problems at scale
- Solid background in production-level engineering practices, including Agile methodologies and object-oriented programming (e.g., Scala, Java), in high-throughput environments
- Proven track record of business impact with production-grade cloud-native solutions, scalable data pipelines, and large-scale distributed databases
- A history of technical thought leadership through academic publications, patents, or contributions to open-source projects or technical blogs is highly desirable
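The recommender-systems work listed above can be illustrated with a toy sketch: item-to-item co-occurrence counting, one of the simplest collaborative-filtering baselines. Everything below (data, function names) is purely illustrative and has no relation to eBay's actual models.

```python
from collections import Counter
from itertools import combinations

def cooccurrence_recommender(baskets):
    """Build an item-to-item co-occurrence table from user baskets.

    Returns a dict mapping each item to a Counter of the items that
    appeared alongside it, usable for ranking recommendations.
    """
    co = {}
    for basket in baskets:
        for a, b in combinations(sorted(set(basket)), 2):
            co.setdefault(a, Counter())[b] += 1
            co.setdefault(b, Counter())[a] += 1
    return co

def recommend(co, item, k=3):
    """Top-k items most often co-viewed/co-bought with `item`."""
    return [i for i, _ in co.get(item, Counter()).most_common(k)]

# Hypothetical browsing sessions
baskets = [
    ["camera", "tripod", "sd-card"],
    ["camera", "sd-card"],
    ["camera", "tripod"],
    ["phone", "case"],
]
co = cooccurrence_recommender(baskets)
```

Production recommenders at this scale rely on learned embeddings and ranking models rather than raw counts, but the co-occurrence table remains a common baseline and candidate-generation step.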
Links to some of our previous work:
- Tech Blog 2025 (Multimodal GenAI)
- Tech Blog 2025 (GenAI Agentic Platform)
- RecSys 2024 Workshop paper
- Google Cloud Blog 2024
- eBay Tech Blog 2023
- eBay Tech Blog 2022
- RecSys 2021 paper

Please see the Talent Privacy Notice for information regarding how eBay handles your personal data collected when you use the eBay Careers website or apply for a job with eBay. eBay is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, sex, sexual orientation, gender identity, veteran status, disability, or other legally protected status. If you have a need that requires accommodation, please contact us at talent@ebay.com. We will make every effort to respond to your request for accommodation as soon as possible. View our accessibility statement to learn more about eBay's commitment to ensuring digital accessibility for people with disabilities.

Posted 3 days ago

Apply

5.0 years

0 Lacs

India

On-site

Key skills required:
- 5+ years of experience in software engineering and MLOps
- Strong development experience on AWS, specifically AWS SageMaker (mandatory)
- Experience with MLflow, GitLab, CDK (mandatory)
- Exposure to AWS DataZone (preferred)
- Proficiency in at least one general-purpose programming language: Python, R, Scala, Spark
- Hands-on experience with production-grade development, integration, and support
- Strong adherence to scalable, secure, and reliable application development best practices
- A strong analytical mindset and willingness to contribute to MLOps research initiatives

Posted 3 days ago

Apply

0.0 years

0 - 0 Lacs

Thiruvananthapuram, Kerala

On-site

Data Science and AI Developer

**Job Description:** We are seeking a highly skilled and motivated Data Science and AI Developer to join our dynamic team. As a Data Science and AI Developer, you will be responsible for leveraging cutting-edge technologies to develop innovative solutions that drive business insights and enhance decision-making processes.

**Key Responsibilities:**
1. Develop and deploy machine learning models for predictive analytics, classification, clustering, and anomaly detection.
2. Design and implement algorithms for data mining, pattern recognition, and natural language processing.
3. Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
4. Utilize advanced statistical techniques to analyze complex datasets and extract actionable insights.
5. Implement scalable data pipelines for data ingestion, preprocessing, feature engineering, and model training.
6. Stay updated with the latest advancements in data science, machine learning, and artificial intelligence research.
7. Optimize model performance and scalability through experimentation and iteration.
8. Communicate findings and results to stakeholders through reports, presentations, and visualizations.
9. Ensure compliance with data privacy regulations and best practices in data handling and security.
10. Mentor junior team members and provide technical guidance and support.

**Requirements:**
1. Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a related field.
2. Proven experience in developing and deploying machine learning models in production environments.
3. Proficiency in programming languages such as Python, R, or Scala, with strong software engineering skills.
4. Hands-on experience with machine learning libraries/frameworks such as TensorFlow, PyTorch, Scikit-learn, or Spark MLlib.
5. Solid understanding of data structures, algorithms, and computer science fundamentals.
6. Excellent problem-solving skills and the ability to think creatively to overcome challenges.
7. Strong communication and interpersonal skills, with the ability to work effectively in a collaborative team environment.
8. Certification in Data Science, Machine Learning, or Artificial Intelligence (e.g., Coursera, edX, Udacity, etc.).
9. Experience with cloud platforms such as AWS, Azure, or Google Cloud is a plus.
10. Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka) is an advantage.

**Tech Stack and Additional Skills:**
- Data manipulation and analysis: NumPy, Pandas
- Data visualization: Matplotlib, Seaborn, Power BI
- Machine learning libraries: Scikit-learn, TensorFlow, Keras
- Statistical analysis: SciPy
- Web scraping: Scrapy
- IDE: PyCharm, Google Colab
- HTML/CSS/JavaScript/React JS: proficiency in these core web development technologies is a must
- Python Django expertise: in-depth knowledge of e-commerce functionalities or deep Python Django knowledge
- Theming: proven experience in designing and implementing custom themes for Python websites
- Responsive design: strong understanding of responsive design principles and the ability to create visually appealing and user-friendly interfaces for various devices
- Problem solving: excellent problem-solving skills with the ability to troubleshoot and resolve issues independently
- Collaboration: ability to work closely with cross-functional teams, including marketing and design, to bring creative visions to life
- Interns must know how to connect the front end with data science components, and vice versa

**Benefits:**
- Competitive salary package
- Flexible working hours
- Opportunities for career growth and professional development
- Dynamic and innovative work environment

Job Type: Full-time
Pay: ₹8,000.00 - ₹12,000.00 per month
Ability to commute/relocate: Thiruvananthapuram, Kerala: Reliably commute or planning to relocate before starting work (Preferred)
Work Location: In person
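Responsibility 5 above (pipelines for ingestion, preprocessing, feature engineering, and model training) can be sketched end-to-end in plain Python. This is a deliberately minimal illustration with made-up data; a real implementation would use the scikit-learn or Spark MLlib tooling the posting names.

```python
from statistics import mean, stdev

def zscore_fit(rows):
    """Preprocessing: compute per-column (mean, std) for standardization."""
    cols = list(zip(*rows))
    return [(mean(c), stdev(c) or 1.0) for c in cols]  # guard against zero std

def zscore_apply(rows, stats):
    """Feature engineering: standardize each column to zero mean, unit variance."""
    return [[(x - m) / s for x, (m, s) in zip(r, stats)] for r in rows]

def nearest_centroid_fit(X, y):
    """Training: a nearest-centroid classifier, one mean vector per class."""
    centroids = {}
    for label in set(y):
        pts = [x for x, lab in zip(X, y) if lab == label]
        centroids[label] = [mean(c) for c in zip(*pts)]
    return centroids

def nearest_centroid_predict(centroids, x):
    """Inference: assign the class whose centroid is closest (squared distance)."""
    def dist2(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b))
    return min(centroids, key=lambda lab: dist2(centroids[lab], x))

# Hypothetical ingested data: two features, two classes
X = [[1.0, 10.0], [1.2, 11.0], [5.0, 1.0], [5.2, 0.5]]
y = ["low", "low", "high", "high"]
```

The key design point the pipeline framing captures is that scaling statistics are *fit* once on training data and then *applied* unchanged to new data, which is exactly the fit/transform split scikit-learn formalizes.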

Posted 3 days ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Minimum qualifications:
- Bachelor's degree or equivalent practical experience.
- 5 years of experience with software development in one or more programming languages.
- 3 years of experience testing, maintaining, or launching software products, and 1 year of experience with software design and architecture.

Preferred qualifications:
- Master's degree or PhD in Computer Science or a related technical field.
- 5 years of experience with data structures/algorithms.
- Experience working on Linux, APIs, and services.
- Knowledge of programming languages such as Java, Scala, and C++.
- Ability to define software architecture, components, modules, interfaces, and data for a system to meet requirements, validating for correctness, functionality, and reliability.

About the job
Google's software engineers develop the next-generation technologies that change how billions of users connect, explore, and interact with information and one another. Our products need to handle information at massive scale, and extend well beyond web search. We're looking for engineers who bring fresh ideas from all areas, including information retrieval, distributed computing, large-scale system design, networking and data storage, security, artificial intelligence, natural language processing, UI design and mobile; the list goes on and is growing every day. As a software engineer, you will work on a specific project critical to Google's needs with opportunities to switch teams and projects as you and our fast-paced business grow and evolve. We need our engineers to be versatile, display leadership qualities and be enthusiastic to take on new problems across the full stack as we continue to push technology forward.

The Google Home team focuses on hardware, software and services offerings for the home, ranging from thermostats to smart displays. The Home team researches, designs, and develops new technologies and hardware to make users' homes more helpful. Our mission is the helpful home: to create a home that cares for the people inside it and the world around it.

Responsibilities:
- Contribute to existing documentation or educational content and adapt content based on product/program updates and user feedback.
- Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on hardware, network, or service operations and quality.
- Develop scalable, reliable, and high-performance solutions that cater to the growing needs of customers.
- Manage all technical aspects of development, unit testing, integration, and deployments.
- Collaborate with peers and teammates to identify opportunities to make products/features scalable.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 3 days ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Teamwork makes the stream work. Roku is changing how the world watches TV. Roku is the #1 TV streaming platform in the U.S., Canada, and Mexico, and we've set our sights on powering every television in the world. Roku pioneered streaming to the TV. Our mission is to be the TV streaming platform that connects the entire TV ecosystem. We connect consumers to the content they love, enable content publishers to build and monetize large audiences, and provide advertisers unique capabilities to engage consumers. From your first day at Roku, you'll make a valuable - and valued - contribution. We're a fast-growing public company where no one is a bystander. We offer you the opportunity to delight millions of TV streamers around the world while gaining meaningful experience across a variety of disciplines.

About the team: The Data Foundations team plays a critical role in supporting Roku Ads business intelligence and analytics. The team is responsible for developing and managing foundational datasets designed to serve the operational and analytical needs of the broader organization. The team's mission is carried out through three focus areas: acting as the interface between data producers and consumers, simplifying data architecture, and creating tools in a standardized way.

About the role: We are seeking a talented and experienced Senior Software Engineer with a strong background in big data technologies, including Apache Spark and Apache Airflow. This hybrid role bridges software and data engineering, requiring expertise in designing, building, and maintaining scalable systems for both application development and data processing. You will collaborate with cross-functional teams to design and manage robust, production-grade, large-scale data systems. The ideal candidate is a proactive self-starter with a deep understanding of high-scale data services and a commitment to excellence.
What you'll be doing:
- Software development: write clean, maintainable, and efficient code, ensuring adherence to best practices through code reviews.
- Big data engineering: design, develop, and maintain data pipelines and ETL workflows using Apache Spark and Apache Airflow. Optimize data storage, retrieval, and processing systems to ensure reliability, scalability, and performance. Develop and fine-tune complex queries and data processing jobs for large-scale datasets. Monitor, troubleshoot, and improve data systems for minimal downtime and maximum efficiency.
- Collaboration & mentorship: partner with data scientists, software engineers, and other teams to deliver integrated, high-quality solutions. Provide technical guidance and mentorship to junior engineers, promoting best practices in data engineering.

We're excited if you have:
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
- 5+ years of experience in software and/or data engineering, with expertise in big data technologies such as Apache Spark, Apache Airflow, and Trino.
- Strong understanding of SOLID principles and distributed systems architecture.
- Proven experience in distributed data processing, data warehousing, and real-time data pipelines.
- Advanced SQL skills, with expertise in query optimization for large datasets.
- Exceptional problem-solving abilities and the capacity to work independently or collaboratively.
- Excellent verbal and written communication skills.
- Experience with cloud platforms such as AWS, GCP, or Azure, and containerization tools like Docker and Kubernetes (preferred).
- Familiarity with additional big data technologies, including Hadoop, Kafka, and Presto (preferred).
- Strong programming skills in Python, Java, or Scala (preferred).
- Knowledge of CI/CD pipelines, DevOps practices, and infrastructure-as-code tools (e.g., Terraform) (preferred).
- Expertise in data modeling, schema design, and data visualization tools (preferred).
- AI literacy and curiosity: you have either tried generative AI in or outside of work, or are curious about it and have explored it.

Benefits
Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive benefits include global access to mental health and financial wellness support and resources. Local benefits include statutory and voluntary benefits which may include healthcare (medical, dental, and vision), life, accident, disability, commuter, and retirement options (401(k)/pension). Our employees can take time off work for vacation and other personal reasons to balance their evolving work and life needs. It's important to note that not every benefit is available in all locations or for every role. For details specific to your location, please consult with your recruiter.

The Roku Culture
Roku is a great place for people who want to work in a fast-paced environment where everyone is focused on the company's success rather than their own. We try to surround ourselves with people who are great at their jobs, who are easy to work with, and who keep their egos in check. We appreciate a sense of humor. We believe a fewer number of very talented folks can do more for less cost than a larger number of less talented teams. We're independent thinkers with big ideas who act boldly, move fast and accomplish extraordinary things through collaboration and trust. In short, at Roku you'll be part of a company that's changing how the world watches TV. We have a unique culture that we are proud of. We think of ourselves primarily as problem-solvers, which itself is a two-part idea. We come up with the solution, but the solution isn't real until it is built and delivered to the customer. That penchant for action gives us a pragmatic approach to innovation, one that has served us well since 2002.
To learn more about Roku, our global footprint, and how we've grown, visit https://www.weareroku.com/factsheet. By providing your information, you acknowledge that you have read our Applicant Privacy Notice and authorize Roku to process your data subject to those terms.
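The "query optimization for large datasets" skill this role asks for can be made concrete with a small, self-contained sketch using Python's stdlib sqlite3 (a toy table, not Roku's stack): the same lookup runs as a full table scan without an index and as an index search with one, which `EXPLAIN QUERY PLAN` makes visible.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, kind TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(i % 1000, "play") for i in range(10_000)],
)

def plan(sql):
    """Return SQLite's query-plan description for a statement."""
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(r[-1] for r in rows)  # last column is the plan detail

query = "SELECT COUNT(*) FROM events WHERE user_id = 42"
before = plan(query)  # e.g. "SCAN events" (full table scan)
conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
after = plan(query)   # e.g. "SEARCH events USING ... INDEX idx_events_user ..."
```

The exact plan wording varies by SQLite version, but the SCAN-to-SEARCH shift is the point: on large warehouse tables the analogous change (indexes, partitioning, clustering) is what turns a slow query into a fast one.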

Posted 3 days ago

Apply

2.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Line of Service: Advisory
Industry/Sector: FS X-Sector
Specialism: Risk
Management Level: Associate

Job Description & Summary: In-depth knowledge of application development processes and at least one programming and one scripting language (e.g., Java, Scala, C#, JavaScript, Angular, ReactJs, Ruby, Perl, Python, Shell). Knowledge of OS security (Windows, Unix/Linux systems, macOS, VMware), network security, and cloud security.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description & Summary: We are seeking a professional to join our Cybersecurity and Privacy services team, where you will have the opportunity to help clients implement effective cybersecurity programs that protect against threats.
Responsibilities:
- L1: minimum 2 years of relevant experience in SOC/incident management/incident response/threat detection engineering/vulnerability management/SOC platform management/automation/asset integration/threat intel management/threat hunting.
- L2: minimum 4 years of relevant experience in SOC/incident management/incident response/threat detection engineering/vulnerability management/SOC platform management/automation/asset integration/threat intel management/threat hunting.
- Round-the-clock threat monitoring and detection
- Analysis of any suspicious, malicious, or abnormal behavior
- Alert triage, initial assessment, incident validation, and assessment of severity and urgency
- Prioritization of security alerts and creation of incidents as per SOPs
- Reporting and escalation to stakeholders
- Post-incident analysis
- Consistent incident triage and recommendations using playbooks
- Development and maintenance of incident management and incident response policies and procedures
- Preservation of security alert and incident artefacts for forensic purposes
- Adherence to Service Level Agreements (SLAs) and KPIs
- Reduction in Mean Time to Detection and Response (MTTD & MTTR)

Mandatory skill sets: Certified SOC Analyst (EC-Council), Computer Hacking Forensic Investigator (EC-Council), Certified Ethical Hacker (EC-Council), CompTIA Security+, CompTIA CySA+ (Cybersecurity Analyst), GIAC Certified Incident Handler (GCIH), or equivalent.
Product certifications (preferred): product certifications on SOC security tools such as SIEM, vulnerability management, DAM, UBA, SOAR, NBA, etc.
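The MTTD and MTTR KPIs named above are simple averages over incident timestamps: mean time from occurrence to detection, and from detection to resolution. A minimal sketch with hypothetical incident records (not any PwC tooling):

```python
from datetime import datetime

def mean_minutes(pairs):
    """Average gap in minutes between (start, end) timestamp pairs."""
    gaps = [(end - start).total_seconds() / 60 for start, end in pairs]
    return sum(gaps) / len(gaps)

# Hypothetical incidents: (occurred, detected, resolved)
incidents = [
    (datetime(2024, 1, 1, 10, 0), datetime(2024, 1, 1, 10, 20), datetime(2024, 1, 1, 12, 0)),
    (datetime(2024, 1, 2, 9, 0), datetime(2024, 1, 2, 9, 10), datetime(2024, 1, 2, 9, 40)),
]

mttd = mean_minutes([(o, d) for o, d, _ in incidents])  # occurrence -> detection
mttr = mean_minutes([(d, r) for _, d, r in incidents])  # detection -> resolution
```

In practice these figures come from SIEM/SOAR platforms rather than hand-rolled scripts, and "response" is sometimes measured to first action rather than full resolution; the SLA definition in use decides which endpoints to average.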
Preferred skill sets: SOC - Splunk
Years of experience required: 2-5 years
Education qualification: B.Tech/MCA/MBA with IT background, or a Bachelor's degree in Information Technology, Cybersecurity, Computer Science, or a related field
Degrees/fields of study required: Master of Business Administration, Bachelor of Engineering
Required skills: SOC Operations
Optional skills: SOCs
Travel requirements: Not specified
Available for work visa sponsorship: No
Government clearance required: No

Posted 3 days ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About the team: The Rubrik Security Apps team helps customers secure their data in the cloud, SaaS, and on-premises. Data is growing at an ever-increasing pace, and so are the risks from cyberattacks targeting cloud data. We make it easy for businesses to protect, search, and analyze all of their data simply and scalably. Security Apps is one of the fastest-growing businesses and teams within Rubrik. We consider ourselves a startup inside a startup! Join us in this journey where there are ever-widening avenues to explore, opportunities to innovate, and deep engineering problems to tackle. We believe in fostering a culture with strong engineering values and teamwork as the key to building a great company and product. When you become part of our dynamic and innovative team, you'll contribute to the development of cutting-edge products within a fast-growing company that prides itself on an exceptional culture and a dynamic work environment.

About the role: We are looking for a Software Engineer to join our team. We believe in giving engineers responsibility, not tasks. Our goal is to motivate and challenge people to do their best work. To do that, we have a very fluid structure and give people flexibility to work on projects that they enjoy the most. This develops more capable engineers, and keeps everyone engaged and happy.
What you'll do:
- Design, develop, test, deploy, maintain, and improve software
- Manage individual project priorities, deadlines, and deliverables with your technical expertise
- Identify and solve bottlenecks within our software stack
- Bring innovation to the product

Experience you'll need:
- Bachelor's or Master's degree or equivalent in computer science or a related field
- 0-1 years of relevant work experience
- Proficiency in one or more general-purpose programming languages like Java, C/C++, Scala, Python, C#
- Experience with Google Cloud Platform, AWS, Azure, or other public cloud technologies is a plus
- Experience working with two or more of the following: Unix/Linux environments, Windows environments, distributed systems, networking, developing large software systems, file systems, storage systems, hypervisors, databases, and/or security software development

Join Us in Securing the World's Data
Rubrik (NYSE: RBRK) is on a mission to secure the world's data. With Zero Trust Data Security™, we help organizations achieve business resilience against cyberattacks, malicious insiders, and operational disruptions. Rubrik Security Cloud, powered by machine learning, secures data across enterprise, cloud, and SaaS applications. We help organizations uphold data integrity, deliver data availability that withstands adverse conditions, continuously monitor data risks and threats, and restore businesses with their data when infrastructure is attacked. Linkedin | X (formerly Twitter) | Instagram | Rubrik.com

Inclusion @ Rubrik
At Rubrik, we are dedicated to fostering a culture where people from all backgrounds are valued, feel they belong, and believe they can succeed. Our commitment to inclusion is at the heart of our mission to secure the world's data. Our goal is to hire and promote the best talent, regardless of background.
We continually review our hiring practices to ensure fairness and strive to create an environment where every employee has equal access to opportunities for growth and excellence. We believe in empowering everyone to bring their authentic selves to work and achieve their fullest potential. Our inclusion strategy focuses on three core areas of our business and culture: Our Company: We are committed to building a merit-based organization that offers equal access to growth and success for all employees globally. Your potential is limitless here. Our Culture: We strive to create an inclusive atmosphere where individuals from all backgrounds feel a strong sense of belonging, can thrive, and do their best work. Your contributions help us innovate and break boundaries. Our Communities: We are dedicated to expanding our engagement with the communities we operate in, creating opportunities for underrepresented talent and driving greater innovation for our clients. Your impact extends beyond Rubrik, contributing to safer and stronger communities. Equal Opportunity Employer/Veterans/Disabled Rubrik is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or protected veteran status and will not be discriminated against on the basis of disability. Rubrik provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, national origin, age, disability or genetics. In addition to federal law requirements, Rubrik complies with applicable state and local laws governing nondiscrimination in employment in every location in which the company has facilities. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. 
Federal law requires employers to provide reasonable accommodation to qualified individuals with disabilities. Please contact us at hr@rubrik.com if you require a reasonable accommodation to apply for a job or to perform your job. Examples of reasonable accommodation include making a change to the application process or work procedures, providing documents in an alternate format, using a sign language interpreter, or using specialized equipment. EEO IS THE LAW NOTIFICATION OF EMPLOYEE RIGHTS UNDER FEDERAL LABOR LAWS

Posted 3 days ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Teamwork makes the stream work. Roku is changing how the world watches TV. Roku is the #1 TV streaming platform in the U.S., Canada, and Mexico, and we've set our sights on powering every television in the world. Roku pioneered streaming to the TV. Our mission is to be the TV streaming platform that connects the entire TV ecosystem. We connect consumers to the content they love, enable content publishers to build and monetize large audiences, and provide advertisers unique capabilities to engage consumers. From your first day at Roku, you'll make a valuable - and valued - contribution. We're a fast-growing public company where no one is a bystander. We offer you the opportunity to delight millions of TV streamers around the world while gaining meaningful experience across a variety of disciplines.

About the team: The primary responsibility of the Content Management team is to develop and manage the Content Management System (CMS). This system processes all content showcased on the Roku Channel, including creating ingestion pipelines, collaborating with partners for content acquisition, processing metadata, and managing content selection. The team also ensures that all Roku personnel can seamlessly update metadata. The Content Management team collaborates closely with the Recommendation team to enhance content curation and personalized recommendations. The system is designed to be highly scalable, leveraging distributed architectures and machine learning algorithms. The team aims to build a next-generation platform by revamping, redesigning, and expanding existing systems. This initiative addresses scalability and latency constraints and accommodates a growing number of content providers and partners.

About the role: Roku pioneered TV streaming and continues to innovate and lead the industry. The Roku Channel has us well-positioned to help shape the future of streaming.
Continued success relies on investing in the Roku Cloud TV Platform, so we deliver a high-quality streaming TV experience at global scale. You will be part of the Roku Content Management System and Tools Engineering team, playing a key role in developing the next-generation content management systems that drive content ingestion, selection, management, and curation workflows. These systems are vital for empowering critical functions like Search and Recommendation on the Roku Platform. Your projects will have a direct impact on millions of Roku users globally. Throughout, you'll collaborate with key stakeholders across various Roku engineering teams and take the lead in designing our content management system. The ideal candidate will have endless curiosity and can pair a global mindset with locally relevant execution. You should be a gritty problem solver and self-starter who can drive programs with the product and commercial teams within Roku and across external strategic partner organizations. The successful candidate will display a balance of hard and soft skills, including the ability to respond quickly to changing business needs. This is an excellent role for a senior professional who enjoys a high level of visibility, thrives on having a critical business impact, is able to make critical decisions, and is excited to work on a core content pipeline component that is crucial for many streaming components at Roku.
What you’ll be doing
- Design and implement highly scalable, reliable, web-scale applications, tools, and automation frameworks that power the Roku Content Management System
- Work closely with the product management team, content management services, and other internal product engineering teams to contribute towards evolving the Roku Content Management Systems and Tools
- Design and build data pipelines for batch, near-real-time, and real-time processing
- Translate functional specifications into logical, component-based technical designs
- Write and review code; evaluate architectural tradeoffs for performance and security
- Participate in architecture discussions, influence the product roadmap, and take ownership and responsibility over new projects
- Manage individual project priorities, deadlines, and deliverables with limited supervision

We’re excited if you have
- Strong problem-solving and analytical abilities
- 5+ years of professional experience as a Software Engineer
- Proficiency in Java/Scala/Python
- Strong technical competency and experience in building high-performance, cloud-based, scalable microservices
- Experience with microservice and event-driven architectures
- Experience with the design and implementation of modern microservices architectures and API frameworks (REST/JSON)
- Experience with cloud platforms: AWS (preferred), GCP, etc.
- Experience with NoSQL data storage technologies such as Cassandra, DynamoDB, Redis, etc., as well as RDBMS like Oracle or MySQL
- Ability to handle periodic on-call duty as well as out-of-band requests; strong written and verbal communication skills
- Bachelor's degree in Computer Science plus 5 years of experience or equivalent; Master's degree preferred
- AI literacy and curiosity: you have either tried Gen AI in or outside of work, or are curious about Gen AI and have explored it
Benefits
Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive benefits include global access to mental health and financial wellness support and resources. Local benefits include statutory and voluntary benefits, which may include healthcare (medical, dental, and vision), life, accident, disability, commuter, and retirement options (401(k)/pension). Our employees can take time off work for vacation and other personal reasons to balance their evolving work and life needs. It's important to note that not every benefit is available in all locations or for every role. For details specific to your location, please consult with your recruiter.

The Roku Culture
Roku is a great place for people who want to work in a fast-paced environment where everyone is focused on the company's success rather than their own. We try to surround ourselves with people who are great at their jobs, who are easy to work with, and who keep their egos in check. We appreciate a sense of humor. We believe a few very talented folks can do more, at lower cost, than a larger number of less talented teams. We're independent thinkers with big ideas who act boldly, move fast, and accomplish extraordinary things through collaboration and trust. In short, at Roku you'll be part of a company that's changing how the world watches TV. We have a unique culture that we are proud of. We think of ourselves primarily as problem-solvers, which itself is a two-part idea. We come up with the solution, but the solution isn't real until it is built and delivered to the customer. That penchant for action gives us a pragmatic approach to innovation, one that has served us well since 2002. To learn more about Roku, our global footprint, and how we've grown, visit https://www.weareroku.com/factsheet.
By providing your information, you acknowledge that you have read our Applicant Privacy Notice and authorize Roku to process your data subject to those terms.

Posted 3 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Accellor is looking for a Data Engineer with extensive experience in developing ETL processes using PySpark Notebooks and Microsoft Fabric, and in supporting existing legacy SQL Server environments. The ideal candidate will possess a strong background in Spark-based development, demonstrate high proficiency in SQL, and be comfortable working independently, collaboratively within a team, or leading other developers when required.

Responsibilities:
- Design, develop, and maintain ETL pipelines using PySpark Notebooks and Microsoft Fabric
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver efficient data solutions
- Migrate and integrate data from legacy SQL Server environments into modern data platforms
- Optimize data pipelines and workflows for scalability, efficiency, and reliability
- Provide technical leadership and mentorship to junior developers and other team members
- Troubleshoot and resolve complex data engineering issues related to performance, data quality, and system scalability
- Develop, maintain, and enforce data engineering best practices, coding standards, and documentation
- Conduct code reviews and provide constructive feedback to improve team productivity and code quality
- Support data-driven decision-making processes by ensuring data integrity, availability, and consistency across different platforms

Requirements
- Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field
- Experience with Microsoft Fabric or similar cloud-based data integration platforms is a must
- A minimum of 3 years of experience in data engineering, with a strong focus on ETL development using PySpark or other Spark-based tools
- Proficiency in SQL, with extensive experience in complex queries, performance tuning, and data modeling
- Strong knowledge of data warehousing concepts, ETL frameworks, and big data processing
- Familiarity with other data processing technologies (e.g., Hadoop, Hive, Kafka) is an advantage
- Experience working with both structured and unstructured data sources
- Excellent problem-solving skills and the ability to troubleshoot complex data engineering issues
- Proven ability to work independently, as part of a team, and in leadership roles
- Strong communication skills, with the ability to translate complex technical concepts into business terms

Mandatory Skills
- Experience with Data Lake, Data Warehouse, and Delta Lake concepts
- Experience with Azure Data Services, including Azure Data Factory, Azure Synapse, or similar tools
- Knowledge of scripting languages (e.g., Python, Scala) for data manipulation and automation
- Familiarity with DevOps practices, CI/CD pipelines, and containerization (Docker, Kubernetes) is a plus

Benefits
Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers.
Work-Life Balance: Accellor prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training, stress management programs, professional certifications, and technical and soft skills training.
Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, personal accident insurance, periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
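For illustration only (this sketch is not part of the posting): the extract-transform-load responsibilities described above can be shown framework-agnostically in plain Python. In a real engagement this logic would live in a PySpark Notebook on Microsoft Fabric; all names, fields, and the quality rule here are hypothetical.

```python
import csv
import io

# Hypothetical extract: CSV rows as they might arrive from a legacy SQL Server export.
raw = io.StringIO("id,amount,region\n1,100,US\n2,,EU\n3,250,US\n")

def extract(source):
    """Read the source into a list of row dicts."""
    return list(csv.DictReader(source))

def transform(rows):
    """Apply a basic data-quality rule, cast types, and derive a field."""
    out = []
    for r in rows:
        if not r["amount"]:  # quality rule: amount is required
            continue
        out.append({"id": int(r["id"]),
                    "amount": float(r["amount"]),
                    "is_us": r["region"] == "US"})
    return out

def load(rows, target):
    """Stand-in for a warehouse or lakehouse write."""
    target.extend(rows)

warehouse = []
load(transform(extract(raw)), warehouse)
print(len(warehouse))  # 2 rows survive the quality gate
```

The same extract/transform/load split maps directly onto Spark DataFrame reads, column expressions, and table writes.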

Posted 3 days ago

Apply

5.0 years

0 Lacs

Bengaluru East, Karnataka, India

On-site

Our Company
Changing the world through digital experiences is what Adobe’s all about. We give everyone—from emerging artists to global brands—everything they need to design and deliver exceptional digital experiences! We’re passionate about empowering people to create beautiful and powerful images, videos, and apps, and transform how companies interact with customers across every screen. We’re on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours!

Job Description
Role Summary
Digital Experience (DX) (https://www.adobe.com/experience-cloud.html) is a USD 4B+ business serving the needs of enterprise businesses, including 95%+ of Fortune 500 organizations. Adobe Marketo Engage, within Adobe DX, is the leading marketing automation platform and helps businesses engage customers effectively across various surfaces and touchpoints. We are looking for strong and passionate engineers to join our team as we scale the business by building the next-gen products and contributing to our existing offerings. If you’re passionate about innovative technology, then we would be excited to talk to you!

What You'll Do
- Collaborate with architects, product management, and engineering teams to build solutions that increase the product's value.
- Develop technical specifications, prototypes, and presentations to communicate your ideas.
- Stay proficient in emerging industry technologies and trends, communicate that knowledge to the team, and use it to influence product direction.
- Demonstrate exceptional coding skills.
- Write unit tests, ensuring code quality and code coverage.
- Ensure code is always checked in and source control standards are followed.
What you need to succeed
- 5+ years of experience in software development
- Expertise in Java, Spring Boot, REST services, MySQL or Postgres, and MongoDB
- Good working knowledge of the Azure ecosystem and Azure Data Factory
- Good understanding of working with Cassandra, Solr, Elasticsearch, and Snowflake
- Ambition and a willingness to tackle unknowns; demonstrates a strong bias to action
- Knowledge of Apache Spark and Scala is an added advantage
- Strong interpersonal, analytical, problem-solving, and conflict-resolution skills
- Excellent speaking, writing, and presentation skills, as well as the ability to persuade, encourage, and empower others
- Bachelor's/Master's in Computer Science or a related field

Adobe is proud to be an Equal Employment Opportunity employer. We do not discriminate based on gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. Learn more. Adobe aims to make Adobe.com accessible to any and all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015.

Posted 3 days ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

• 7-9 years of experience with data analytics, data modeling, and database design.
• 3+ years of coding and scripting (Python, Java, Scala) and design experience.
• 3+ years of experience with the Spark framework.
• 5+ years of experience with ELT methodologies and tools.
• 5+ years of mastery in designing, developing, tuning, and troubleshooting SQL.
• Knowledge of Informatica PowerCenter and Informatica IDMC.
• Knowledge of distributed, column-oriented technologies such as Vertica and Snowflake for building high-performance databases.
• Strong data analysis skills for extracting insights from financial data.
• Proficiency in reporting tools (e.g., Power BI, Tableau).

The Ideal Qualifications
Technical Skills:
• Domain knowledge of Investment Management operations, including Security Masters, Securities Trade and Recon Operations, Reference Data Management, and Pricing.
• Familiarity with regulatory requirements and compliance standards in the investment management industry.
• Experience with IBORs such as BlackRock Aladdin, CRD, Eagle STAR (ABOR), Eagle Pace, and Eagle DataMart.
• Familiarity with investment data platforms such as GoldenSource, FINBOURNE, NeoXam, RIMES, and JPM Fusion.

Soft Skills:
• Strong analytical and problem-solving abilities.
• Exceptional communication and interpersonal skills.
• Ability to influence and motivate teams without direct authority.
• Excellent time management and organizational skills, with the ability to prioritize multiple initiatives.

What to Expect as Part of our Team
• Regular meetings with the Corporate Technology leadership team
• Focused one-on-one meetings with your manager
• Access to mentorship opportunities
• Access to learning content on Degreed and other informational platforms

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

You will be joining our team as a Senior Data Scientist with expertise in Artificial Intelligence (AI) and Machine Learning (ML). The ideal candidate should possess a minimum of 5-7 years of experience in data science, focusing on AI/ML applications. You are expected to have a strong background in various ML algorithms; programming languages such as Python, R, or Scala; and data processing frameworks like Apache Spark. Proficiency in data visualization tools and experience in model deployment using Docker, Kubernetes, and cloud services will be essential for this role. Your responsibilities will include end-to-end AI/ML project delivery, from data processing to model deployment. You should have a good understanding of the statistics, probability, and mathematical concepts used in AI/ML. Additionally, familiarity with big data tools, natural language processing techniques, time-series analysis, and MLOps will be advantageous. As a Senior Data Scientist, you are expected to lead cross-functional project teams and manage data science projects in a production setting. Your problem-solving skills, communication skills, and curiosity to stay updated with the latest advancements in AI and ML are crucial for success in this role. You should be able to convey technical insights clearly to diverse audiences and quickly adapt to new technologies. If you are an innovative, analytical, and collaborative team player with a proven track record in AI/ML project delivery, we invite you to apply for this exciting opportunity.

Posted 3 days ago

Apply

4.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Our Company
Changing the world through digital experiences is what Adobe’s all about. We give everyone—from emerging artists to global brands—everything they need to design and deliver exceptional digital experiences! We’re passionate about empowering people to create beautiful and powerful images, videos, and apps, and transform how companies interact with customers across every screen. We’re on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours!

What You'll Do
This is an individual contributor position. Expectations will be along the below lines:
- Responsible for the design, architecture, and implementation of new features.
- Be responsible for all phases of engineering: from early specs, design/architecture, and technology choice, through development, unit-testing/integration automation, and deployment.
- Collaborate with architects, product management, and other engineering teams to build the technical vision and road map for the team.
- Build technical specifications, prototypes, and presentations to communicate your ideas.
- Be well versed in emerging industry technologies and trends, and have the ability to communicate that knowledge to the team and use it to influence product direction.
- Orchestrate with the team to develop a product or parts of a large product.

Requirements
- B.Tech / M.Tech degree in Computer Science.
- 4+ years of experience in front-end technologies such as React, Node.js, etc., along with solid experience in backend programming in Java/Scala.
- Excellent computer science fundamentals and a good understanding of the design and performance of algorithms.
- Knowledge of and experience with cloud services (Azure and/or AWS) and container orchestration platforms such as Kubernetes.
- Good understanding of distributed systems.
- Solid experience in Java/Scala programming.
- Strong understanding of RESTful APIs and GraphQL.
- Proficient in modern JavaScript frameworks like React, Node.js, etc.
- Experience in writing unit, integration, and end-to-end tests.
- Effective oral and written communication skills, with the ability to interact with customers and cross-functional teams.
- Excellent work ethic; highly motivated.

Adobe is proud to be an Equal Employment Opportunity and affirmative action employer. We do not discriminate based on gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. Learn more. Adobe aims to make Adobe.com accessible to any and all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015. Adobe values a free and open marketplace for all employees and has policies in place to ensure that we do not enter into illegal agreements with other companies to not recruit or hire each other’s employees.

Posted 3 days ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Our Company
Changing the world through digital experiences is what Adobe’s all about. We give everyone—from emerging artists to global brands—everything they need to design and deliver exceptional digital experiences! We’re passionate about empowering people to create beautiful and powerful images, videos, and apps, and transform how companies interact with customers across every screen. We’re on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours!

Digital Experience (DX) (https://www.adobe.com/experience-cloud.html) is a USD 3B+ business serving the needs of enterprise businesses including 95%+ of Fortune 500 organizations. Adobe Journey Optimizer (AJO) within DX provides a platform for designing cross-channel customer experiences and provides an environment for visual campaign orchestration, real-time interaction management, and cross-channel execution.
It is built natively on the Adobe Experience Platform and combines a unified, real-time customer profile, an API-first open framework, centralized offer decisioning, and artificial intelligence (AI) and machine learning (ML) for personalization and optimization. Beyond the usual responsibility of designing, developing, documenting, and thoroughly testing code, Computer Scientists @ Adobe own features of varying complexity, which may require understanding interactions with other parts of the system, moderately sophisticated algorithms, and good design judgment. We are looking for strong and passionate engineers to join our team as we scale the business by building the next-gen products and contributing to our existing offerings.

What You'll Do
This is an individual contributor position. Expectations will be along the below lines:
- Responsible for the design and architecture of new products.
- Work in full DevOps mode; be responsible for all phases of engineering, from early specs, design/architecture, and technology choice, through development, unit-testing/integration automation, and deployment.
- Collaborate with architects, product management, and other engineering teams to build the technical vision and road map for the team.
- Build technical specifications, prototypes, and presentations to communicate your ideas.
- Be well versed in emerging industry technologies and trends, and have the ability to communicate that knowledge to the team and use it to influence product direction.
- Orchestrate with the team to develop a product or parts of a large product.

Requirements
- B.Tech / M.Tech degree in Computer Science from a premier institute.
- 7-9.5 years of relevant experience in software development.
- Excellent computer science fundamentals and a good understanding of the design and performance of algorithms.
- Proficient in Java/Scala programming.
- Proficient in writing code that is reliable, maintainable, secure, and performant.
- Knowledge of Azure services and/or AWS.
Internal Opportunities We’re glad that you’re pursuing career development opportunities at Adobe. Here’s what you’ll need to do: Apply with your complete LinkedIn profile or resume/CV. Schedule a Check-in meeting with your manager to discuss this internal opportunity and your career aspirations. Check-ins should include ongoing discussions about expectations, feedback and career development. Learn more about Check-in here. Learn more about the internal career opportunities process in this FAQ. If you’re contacted for an interview, here are some tips. At Adobe, you will be immersed in an exceptional work environment that is recognized throughout the world on Best Companies lists. You will also be surrounded by colleagues who are committed to helping each other grow through our unique Check-In approach where ongoing feedback flows freely. If you’re looking to make an impact, Adobe's the place for you. Discover what our employees are saying about their career experiences on the Adobe Life blog and explore the meaningful benefits we offer. Adobe is an equal opportunity employer. We welcome and encourage diversity in the workplace regardless of gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, or veteran status. Adobe is proud to be an Equal Employment Opportunity employer. We do not discriminate based on gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. Learn more. Adobe aims to make Adobe.com accessible to any and all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015.

Posted 3 days ago

Apply

10.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Our Company
Changing the world through digital experiences is what Adobe’s all about. We give everyone—from emerging artists to global brands—everything they need to design and deliver exceptional digital experiences! We’re passionate about empowering people to create beautiful and powerful images, videos, and apps, and transform how companies interact with customers across every screen. We’re on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours!

About Connect
Adobe Connect, within the Adobe DALP BU, is one of the best online webinar and training delivery platforms. The product has a huge customer base which has been using it for many years. The product has evolved magnificently over time, ensuring it stays on top of the latest tech stack. It offers the opportunity to work with a plethora of technologies on both the client and server side.

What You’ll Do:
- Work as a hands-on Machine Learning Engineer who will release models in production.
- Develop classifiers, predictive models, and multi-variate optimization algorithms on large-scale datasets using advanced statistical modeling, machine learning, and data mining.
- Focus R&D on building predictive models for conversion optimization, bidding algorithms for pacing & optimization, reinforcement learning problems, and forecasting.
- Collaborate with Product Management to bring AI-based assistive experiences to life; socialize what’s possible now or in the near future to inform the roadmap.
- Drive all aspects of ML product development: ML modeling, data/ML pipelines, quality evaluations, productization, and ML Ops.
- Create and instill a team culture that focuses on sound scientific processes and encourages deep engagement with our customers.
- Handle project scope and risks with data, analytics, and creative problem-solving.

What you require:
- Solid foundation in machine learning, classifiers, statistical modeling, and multivariate optimization techniques
- Experience with control systems, reinforcement learning problems, and contextual bandit algorithms
- Experience with DNN frameworks like TensorFlow or PyTorch on large-scale data sets
- Familiarity with TensorFlow, R, scikit-learn, and pandas
- Proficiency in one or more of: Python, Java/Scala, SQL, Hive, Spark
- Good to have: Git, Docker, Kubernetes
- GenAI and RAG pipelines are a must-have
- Experience with cloud-based solutions is good to have
- General understanding of data structures, algorithms, multi-threaded programming, and distributed computing concepts
- Ability to be a self-starter and work closely with other data scientists and software engineers to design, test, and build production-ready ML and optimization models and distributed algorithms running on large-scale data sets

Ideal Candidate Profile:
- A total of 10+ years of experience, including at least 5 years in technical roles involving Data Science, Machine Learning, or Statistics.
- Master's or B.Tech in Computer Science/Statistics.
- Comfort with ambiguity, adaptability to evolving priorities, and the ability to lead a team while working autonomously.
- Proven management experience with highly diverse and global teams.
- Demonstrated ability to influence technical and non-technical stakeholders.
- Proven ability to effectively manage in a high-growth, matrixed organization.
- Track record of delivering cloud-scale, data-driven products and services that are widely adopted with large customer bases.
- An ability to think strategically, look around corners, and create a vision for the current quarter, the year, and five years down the road.
- A relentless pursuit of great customer experiences and continuous improvements to the product.

Adobe is proud to be an Equal Employment Opportunity employer.
We do not discriminate based on gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. Learn more. Adobe aims to make Adobe.com accessible to any and all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015.
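As background for the bandit and pacing work mentioned in the listing above, here is a deliberately simplified epsilon-greedy bandit sketch. It is illustrative only, not Adobe's algorithm; the arm payouts are a hypothetical, deterministic simulation.

```python
import random

def epsilon_greedy(history, epsilon=0.1, rng=random):
    """Pick an arm index: explore with probability epsilon, else exploit the best mean."""
    if rng.random() < epsilon:
        return rng.randrange(len(history))
    means = [sum(h) / len(h) if h else 0.0 for h in history]
    return max(range(len(means)), key=means.__getitem__)

# Two hypothetical ad slots with simplified, deterministic mean payouts.
payout = [0.3, 0.7]
history = [[], []]
rng = random.Random(42)
for _ in range(2000):
    arm = epsilon_greedy(history, rng=rng)
    history[arm].append(payout[arm])  # record the reward for the chosen arm

# After enough pulls, the higher-payout arm receives the bulk of the traffic.
print(len(history[1]) > len(history[0]))
```

Contextual bandits extend this idea by conditioning the arm choice on per-request features rather than a single global mean per arm.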

Posted 3 days ago

Apply

5.0 - 7.0 years

0 Lacs

India

On-site

Shift Ahead Technologies, based in Pune, requires a couple of Senior Engineers with 5 to 7 years of experience in Scala development who are self-sufficient and can work autonomously under customer supervision. This is a work-from-home role. Candidates should be willing to join within about 7 days. Excellent English communication and the ability to work independently with the client are highly desirable.

Responsibilities:
- Design, develop, and maintain Scala-based applications and software solutions.
- Write clean, efficient, and scalable code following functional programming principles and best practices.
- Participate in architectural decisions and contribute to the design and development process of projects.
- Test, debug, and optimize applications to ensure high performance, security, and scalability.
- Collaborate with cross-functional teams including developers, analysts, QA engineers, and stakeholders throughout the development cycle.
- Integrate Scala solutions with other platforms, frameworks (such as Akka, Play, or Spark), and APIs for data or service integration.

Confident candidates may apply or mail careers@shiftahead.tech.

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

kolkata, west bengal

On-site

Genpact (NYSE: G) is a global professional services and solutions firm committed to delivering outcomes that shape the future. With over 125,000 employees spread across more than 30 countries, we are fueled by our innate curiosity, entrepreneurial agility, and the aspiration to create lasting value for our clients. Driven by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, leveraging our profound business and industry expertise, digital operations services, and proficiency in data, technology, and AI.

We are currently seeking applications for the position of Lead Consultant - Databricks Senior Engineer! In this role, your responsibilities will include:
- Working closely with software designers to ensure adherence to best practices, and providing suggestions for enhancing code proficiency and maintainability
- Occasional customer interaction to analyze user needs and determine technical requirements
- Designing, building, and maintaining scalable and reliable data pipelines using Databricks
- Developing high-quality code with a focus on performance, scalability, and security
- Collaborating with cross-functional teams to understand data requirements and deliver solutions that align with business needs
- Implementing data transformations and intricate algorithms within the Databricks environment
- Optimizing data processing and refining data architecture to enhance system efficiency and data quality
- Mentoring junior engineers and contributing to the establishment of best practices within the team
- Staying updated with emerging trends and technologies in data engineering and cloud computing
Qualifications we are looking for:
Minimum Qualifications:
- Experience in data engineering or a related field
- Strong hands-on experience with Databricks, encompassing the development of code, pipelines, and data transformations
- Proficiency in at least one programming language (e.g., Python, Scala, Java)
- In-depth knowledge of Apache Spark and its integration within Databricks
- Experience with cloud services (AWS, Azure, or GCP) and their data-related products
- Familiarity with CI/CD practices, version control (Git), and automated testing
- Exceptional problem-solving abilities, with the capacity to work both independently and as part of a team
- Bachelor's degree in Computer Science, Engineering, Mathematics, or a related technical field

If you are enthusiastic about leveraging your skills and expertise as a Lead Consultant - Databricks Senior Engineer, join us at Genpact and be a part of shaping a better future for all.

Location: India-Kolkata
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jul 30, 2024, 5:05:42 AM
Unposting Date: Jan 25, 2025, 11:35:42 PM
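As an aside (not part of the posting): the "data transformations" this role describes often follow Spark's reduceByKey pattern. A plain-Python sketch of that pattern is below; in a Databricks pipeline the equivalent would run on Spark RDDs or DataFrames, and the keys and values here are hypothetical.

```python
from collections import defaultdict
from functools import reduce

# Hypothetical input: (key, value) records as one pipeline stage might emit them.
events = [("a", 120), ("b", 45), ("a", 30), ("c", 200), ("b", 15)]

def reduce_by_key(pairs, fn):
    """Group values by key, then fold each group with fn (Spark-style reduceByKey)."""
    buckets = defaultdict(list)
    for key, value in pairs:
        buckets[key].append(value)
    return {key: reduce(fn, values) for key, values in buckets.items()}

totals = reduce_by_key(events, lambda a, b: a + b)
print(totals["a"])  # 150
```

The same shape appears in SQL as GROUP BY with an aggregate, and in Spark as `rdd.reduceByKey(_ + _)` or `df.groupBy("key").sum("value")`.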

Posted 3 days ago

Apply

5.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Job Profile: Data Engineer
Experience: 5+ Years

Who we are: Innovatics is a place where innovation blends with analytics. At Innovatics, we take pride in working with bleeding-edge technologies, strategic business moves, and radical business transformation. We deliver business growth opportunities never thought of before and help businesses accelerate their digital transformation journey.

About the role: We're looking for a Data Engineer who is passionate about delivering tangible results, has a positive attitude, and enjoys solving problems.

Requirements:

Technical Skills:
- 3+ years of experience in a Data Engineer role
- Experience with object-oriented/functional scripting languages: Python, Scala, Golang, Java, etc.
- Experience with big data tools such as Spark, Hadoop, Kafka, Airflow, and Hive
- Experience with streaming data: Spark/Kinesis/Kafka/Pub/Sub/Event Hub
- Experience with GCP, Azure Data Factory, or AWS
- Strong SQL scripting skills
- Experience with ETL tools
- Knowledge of Snowflake Data Warehouse
- Knowledge of orchestration frameworks: Airflow/Luigi
- Good to have: knowledge of Data Quality Management frameworks
- Good to have: knowledge of Master Data Management
- Self-learning ability is a must
- Familiarity with upcoming new technologies is a strong plus
- Bachelor's degree in big data analytics, computer engineering, or a related field

Personal Competency:
- Strong communication skills are a MUST
- Self-motivated and detail-oriented
- Strong organizational skills
- Ability to prioritize workloads and meet deadlines
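As a rough picture of the streaming-data work this role involves, here is a minimal tumbling-window aggregation in plain Python — the kind of computation Spark Structured Streaming or Kafka-based pipelines perform at scale. The window size and event shapes are hypothetical.

```python
# Illustrative sketch: a tumbling-window count over an event stream.
# Timestamps, keys, and the 60-second window are hypothetical choices.
from collections import defaultdict

WINDOW_SECONDS = 60

def window_start(epoch_seconds, width=WINDOW_SECONDS):
    """Align an event timestamp to the start of its tumbling window."""
    return epoch_seconds - (epoch_seconds % width)

def count_per_window(events):
    """events: iterable of (epoch_seconds, key) pairs -> counts per (window, key)."""
    counts = defaultdict(int)
    for ts, key in events:
        counts[(window_start(ts), key)] += 1
    return dict(counts)

events = [(0, "click"), (30, "click"), (61, "click"), (75, "view")]
print(count_per_window(events))
# {(0, 'click'): 2, (60, 'click'): 1, (60, 'view'): 1}
```

Real streaming engines add what this sketch omits: out-of-order events, watermarks for late data, and fault-tolerant state.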

Posted 3 days ago

Apply

2.0 - 9.0 years

0 Lacs

Karnataka

On-site

We are seeking Data Architects, Sr. Data Architects, and Pr. Data Architects to join our team. In this role, you will be involved in a combination of hands-on contribution, customer engagement, and technical team management. As a Data Architect, your responsibilities will include designing, architecting, deploying, and maintaining solutions on the MS Azure platform using various Cloud & Big Data technologies. You will manage the full life-cycle of Data Lake / Big Data solutions, from requirement gathering and analysis to platform selection, architecture design, and deployment. It will be your responsibility to implement scalable solutions on the Cloud and collaborate with a team of business domain experts, data scientists, and application developers to develop Big Data solutions. Moreover, you will be expected to explore and learn new technologies for creative problem solving and mentor a team of Data Engineers. The ideal candidate should possess strong hands-on experience in implementing Data Lakes with technologies such as Data Factory (ADF), ADLS, Databricks, Azure Synapse Analytics, Event Hub & Stream Analytics, Cosmos DB, and Purview. Additionally, experience with big data technologies like Hadoop (CDH or HDP), Spark, Airflow, NiFi, Kafka, Hive, HBase, MongoDB, Neo4j, Elasticsearch, Impala, Sqoop, etc., is required. Proficiency in programming and debugging in Python and Scala/Java is essential, with experience in building REST services considered beneficial. Candidates should also have experience in supporting BI and Data Science teams in consuming data in a secure and governed manner, along with a good understanding of using CI/CD with Git and Jenkins / Azure DevOps. Experience in setting up cloud-computing infrastructure solutions, hands-on experience with or exposure to NoSQL databases, and data modelling in Hive are all highly valued.
Applicants should have a minimum of 9 years of technical experience, with at least 5 years on MS Azure and 2 years on Hadoop (CDH/HDP).
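One concrete detail behind the Data Lake design work described above is the partition layout. The sketch below generates Hive-style partition paths (the convention used on ADLS/HDFS so Spark and Hive can prune partitions on date filters); the zone and dataset names are hypothetical.

```python
# Illustrative sketch: Hive-style partition paths for a date-partitioned
# Data Lake zone. "curated" and "transactions" are hypothetical names.
from datetime import date

def partition_path(zone, dataset, day):
    """year=/month=/day= layout lets engines skip irrelevant partitions."""
    return (f"{zone}/{dataset}/"
            f"year={day.year}/month={day.month:02d}/day={day.day:02d}")

print(partition_path("curated", "transactions", date(2024, 7, 30)))
# curated/transactions/year=2024/month=07/day=30
```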

Posted 3 days ago

Apply

4.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an Automation QA Engineer (Python & Scala) at our client, a global IT leader based in Hyderabad, you will be responsible for designing and developing automation testing frameworks using web and object-oriented technologies. With a focus on ensuring product quality, you will conduct functional and automation QA processes. Your expertise in Python or Java programming will be crucial in leveraging strong object-oriented programming skills to achieve project goals. Collaboration with team members to meet specific timelines is essential for success in this role. Ideally, you should possess 4 to 6 years of experience, with at least 2 years in automation or development preferred. The offered salary for this position is 30 LPA, and the required notice period is immediate to 15 days. Relocation is an option for interested candidates. Candidates with exposure to big data processing systems, Hadoop, or Scala will have an added advantage. Strong analytical and testing skills are essential to excel in the fast-paced environment of our client's IT consulting, business process management, and digital transformation services.
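To illustrate the shape of framework-driven automation testing this role describes, here is a minimal data-driven test using Python's stdlib unittest; the function under test and its cases are invented for illustration, not taken from the posting.

```python
# Illustrative sketch: a data-driven test with stdlib unittest, the kind of
# check an automation framework runs on every build. normalize_sku and its
# cases are hypothetical examples.
import unittest

def normalize_sku(raw):
    """Example unit under test: canonicalize a product SKU string."""
    return raw.strip().upper().replace(" ", "-")

class NormalizeSkuTest(unittest.TestCase):
    CASES = [
        (" ab 123 ", "AB-123"),
        ("xyz", "XYZ"),
    ]

    def test_cases(self):
        # subTest reports each failing case individually
        for raw, expected in self.CASES:
            with self.subTest(raw=raw):
                self.assertEqual(normalize_sku(raw), expected)

if __name__ == "__main__":
    unittest.main(exit=False, verbosity=0)
```

A full framework layers on fixtures, reporting, and (for web testing) browser drivers, but the data-driven core looks like this.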

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

You will be responsible for developing ETL processes using Informatica for data integration from various sources. This includes creating specifications and test scripts and ensuring code coverage for all integrations. You will also support the migration of integration code from lower to higher environments, such as production. Your experience working with XML and JSON for real-time integrations will be crucial for this role. Additionally, you should have expertise in performing full and incremental ETL using Informatica PowerCenter. Experience with AWS Cloud services and working with iPaaS for integration configurations is required. It is essential to have a strong background in developing ETL processes for Data Warehouse integration and supporting integration configurations through connected apps or web services. Familiarity with reporting tools, especially MicroStrategy, is preferred. You should also have experience in production support and be willing to be on call during selected off-shift hours. Experience with an Agile framework is necessary for this position. Additionally, knowledge of Python for data extraction and manipulation, AWS Terraform, New Relic setup and maintenance, Git, Rally, and Scala would be advantageous.
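The distinction between full and incremental ETL mentioned above comes down to tracking a watermark: remember the highest modification timestamp already loaded, and pull only newer rows on the next run. A minimal plain-Python sketch, with hypothetical record fields:

```python
# Illustrative sketch: watermark-based incremental extraction, the idea
# behind incremental loads in tools like Informatica PowerCenter.
# The "updated_at" field and integer timestamps are hypothetical.

def incremental_extract(rows, last_watermark):
    """Return rows modified after the watermark, plus the new watermark."""
    fresh = [r for r in rows if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in fresh), default=last_watermark)
    return fresh, new_watermark

source = [
    {"id": 1, "updated_at": 100},
    {"id": 2, "updated_at": 205},
    {"id": 3, "updated_at": 310},
]
fresh, wm = incremental_extract(source, last_watermark=200)
print([r["id"] for r in fresh], wm)  # [2, 3] 310
```

In production the watermark is persisted (e.g., in a control table) so each run resumes where the last one finished; a full load is simply a run with the watermark reset.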

Posted 3 days ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra

Remote

Job Description

What you will do: The Zendesk Core Services Packaging and Consumption team is looking for a Senior Software Engineer - Backend for a project that drives successful feature adoption for Zendesk customers. The ideal candidate will have experience analysing various data sources with good SQL skills, a good understanding of domain-driven design, and the willingness to explore the unknowns. On a day-to-day basis, a strong command of one of the backend languages like Scala or Java is highly beneficial. Past experience developing on the Rails framework is also a plus.

Your responsibilities will include:
- Collaborating with product management, architecture, and engineers (front end and back end) to design beautifully simple solutions to complicated problems. You will be relied on from concept through development, QA, staging, and production deployment.
- Ensuring delivery on commitments. It is your responsibility to ensure code quality, debug code, and seek guidance to unblock pending tasks.
- Following best practices in all our frameworks and tools. Championing best practices and proper test coverage. We ship code frequently and fast, but stability and reliability must never be compromised.
- Actively participating in code reviews and design discussions.
- Partnering across all areas of the SDLC, including requirements gathering, requirements analysis, and building services and solutions.
- Working across teams, organization boundaries, and timezones to standardize and integrate services, tools, and workflows.

What you bring to the role:
- 4+ years of relevant experience in at least one object-oriented language like Scala or Java (Scala preferred, with hands-on experience)
- Experience with databases like MySQL and/or DynamoDB
- An analytical mindset, good articulation skills, and a pragmatic approach to problem solving
- Experience with CI/CD and delivery systems (GitHub Actions, Jenkins)
- Knowledge of API design, distributed systems, and Kafka
- Experience using Datadog or other log aggregation tools
- A customer-first mentality when dealing with service incident management, data analysis, and root-cause analysis
- A hunger for learning new technologies and an eagerness to grow your knowledge and capabilities
- A team-first collaborative attitude that thrives in a fast-moving agile development environment
- Excellent written and verbal communication skills

Bonus Skills:
- Experience with JavaScript/TypeScript
- Experience working on SaaS-based products
- Experience with the AWS stack (i.e., Aurora) and data warehouse technologies like Snowflake
- Experience with Ruby on Rails

Please note that Zendesk can only hire candidates who are physically located and plan to work from Karnataka or Maharashtra. Please refer to the location posted on the requisition for where this role is based.

Hybrid: In this role, our hybrid experience is designed at the team level to give you a rich onsite experience packed with connection, collaboration, learning, and celebration - while also giving you flexibility to work remotely for part of the week. This role must attend our local office for part of the week. The specific in-office schedule is to be determined by the hiring manager.

The intelligent heart of customer experience: Zendesk software was built to bring a sense of calm to the chaotic world of customer service. Today we power billions of conversations with brands you know and love. Zendesk believes in offering our people a fulfilling and inclusive experience. Our hybrid way of working enables us to purposefully come together in person, at one of our many Zendesk offices around the world, to connect, collaborate, and learn, whilst also giving our people the flexibility to work remotely for part of the week.
Zendesk is an equal opportunity employer, and we’re proud of our ongoing efforts to foster global diversity, equity, & inclusion in the workplace. Individuals seeking employment and employees at Zendesk are considered without regard to race, color, religion, national origin, age, sex, gender, gender identity, gender expression, sexual orientation, marital status, medical condition, ancestry, disability, military or veteran status, or any other characteristic protected by applicable law. We are an AA/EEO/Veterans/Disabled employer. If you are based in the United States and would like more information about your EEO rights under the law, please click here. Zendesk endeavors to make reasonable accommodations for applicants with disabilities and disabled veterans pursuant to applicable federal and state law. If you are an individual with a disability and require a reasonable accommodation to submit this application, complete any pre-employment testing, or otherwise participate in the employee selection process, please send an e-mail to peopleandplaces@zendesk.com with your specific accommodation request.
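As one small illustration of the distributed-systems and Kafka knowledge this role calls for, the sketch below shows an idempotent event handler — a common pattern when consuming topics with at-least-once delivery, where the same event can arrive more than once. The event shape and in-memory dedup store are hypothetical; production code would persist processed IDs durably.

```python
# Illustrative sketch: idempotent handling of redelivered events.
# Event fields and the feature-adoption use case are hypothetical.

class FeatureAdoptionCounter:
    def __init__(self):
        self.seen = set()   # IDs of events already processed
        self.counts = {}    # feature name -> adoption count

    def handle(self, event):
        if event["event_id"] in self.seen:
            return False    # duplicate delivery: safe to ignore
        self.seen.add(event["event_id"])
        feature = event["feature"]
        self.counts[feature] = self.counts.get(feature, 0) + 1
        return True

consumer = FeatureAdoptionCounter()
consumer.handle({"event_id": "e1", "feature": "macros"})
consumer.handle({"event_id": "e1", "feature": "macros"})  # redelivered
print(consumer.counts)  # {'macros': 1}
```

The key property is that handling an event twice leaves the state unchanged, which makes at-least-once delivery behave like exactly-once from the consumer's point of view.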

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

Haryana

On-site

The role of Lead, Software Engineer at Mastercard involves playing a crucial part in the Data Unification process across different data assets to create a unified view of data from multiple sources. This position will focus on driving insights from available data sets and supporting the development of new data-driven cyber products, services, and actionable insights. The Lead, Software Engineer will collaborate with teams such as Product Management, Data Science, Platform Strategy, and Technology to understand data needs and requirements for delivering data solutions that bring business value.

Key responsibilities include performing data ingestion, aggregation, and processing to derive relevant insights; manipulating and analyzing complex data from various sources; identifying innovative ideas and delivering proofs of concept and prototypes; proposing new products and enhancements; integrating and unifying new data assets to enhance customer value; analyzing transaction and product data to generate actionable recommendations for business growth; and collecting feedback on new solutions from clients and from development, product, and sales teams.

The ideal candidate should have a good understanding of streaming technologies like Kafka and Spark Streaming; proficiency in programming languages such as Java, Scala, or Python; experience with an enterprise business intelligence or data platform; strong SQL and higher-level programming skills; knowledge of data mining and machine learning algorithms; and familiarity with data integration (ETL/ELT) tools including Apache NiFi, Azure Data Factory, Pentaho, and Talend. They should also be able to work in a fast-paced, deadline-driven environment, collaborate effectively with cross-functional teams, and articulate solution requirements for different groups within the organization.
It is essential for all employees working at or on behalf of Mastercard to adhere to the organization's security policies and practices, ensure the confidentiality and integrity of accessed information, report any suspected information security violations or breaches, and complete all mandatory security trainings in accordance with Mastercard's guidelines. The Lead, Software Engineer role at Mastercard offers an exciting opportunity to contribute to the development of innovative data-driven solutions that drive business growth and enhance the customer value proposition.
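The data-unification work described in this posting can be pictured as keyed merging of per-source records into one entity view. Below is a deliberately small plain-Python sketch (at the scale the posting implies this would be a Spark join); the source names, entity IDs, and fields are hypothetical.

```python
# Illustrative sketch: unify records about the same entity from multiple
# sources, keyed by a shared identifier. "cyber"/"txn" and all fields
# are hypothetical examples.

def unify(primary, *others):
    """Merge per-source dicts of {entity_id: attributes} into one view."""
    unified = {k: dict(v) for k, v in primary.items()}  # copy, don't mutate
    for source in others:
        for entity_id, attrs in source.items():
            unified.setdefault(entity_id, {}).update(attrs)
    return unified

cyber = {"acct-1": {"risk_score": 0.8}}
txn = {"acct-1": {"monthly_spend": 1200}, "acct-2": {"monthly_spend": 90}}
view = unify(cyber, txn)
print(view["acct-1"])  # {'risk_score': 0.8, 'monthly_spend': 1200}
```

The hard parts in practice — entity resolution when identifiers differ across sources, and conflict rules when sources disagree — sit on top of this basic keyed-merge shape.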

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

We are seeking experienced and talented engineers to join our team. Your main responsibilities will include designing, building, and maintaining the software that drives the global logistics industry. WiseTech Global is a leading provider of software for the logistics sector, facilitating connectivity for major companies like DHL and FedEx within their supply chains. Our organization is product- and engineer-focused, with a strong commitment to enhancing the functionality and quality of our software through continuous innovation. Our primary Research and Development center in Bangalore plays a pivotal role in our growth strategy and product development roadmap. As a Lead Software Engineer, you will serve as a mentor, a leader, and an expert in your field. You should be adept at communicating effectively with senior management while also staying hands-on with the code to deliver effective solutions. The technical environment you will work in includes technologies such as C#, Java, C++, Python, Scala, Spring, Spring Boot, Apache Spark, Hadoop, Hive, Delta Lake, Kafka, Debezium, GKE (Kubernetes Engine), Composer (Airflow), DataProc, DataStreams, DataFlow, MySQL RDBMS, MongoDB NoSQL (Atlas), UiPath, Helm, Flyway, Sterling, EDI, Redis, Elasticsearch, Grafana dashboards, and Docker. Before applying, please note that WiseTech Global may engage external service providers to assess applications. By submitting your application and personal information, you agree to WiseTech Global sharing this data with external service providers, who will handle it confidentially in compliance with privacy and data protection laws.

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Cloud Kinetics is seeking a candidate with expertise in big data: Hadoop, Hive SQL, Spark, and other tools within the big data ecosystem. As a member of our team, you will be responsible for developing code, optimizing queries for performance, setting up environments, ensuring connectivity, and deploying code into production after testing. Strong functional and technical knowledge is essential to fulfill project requirements, particularly in the context of banking terminology. Additionally, you may lead small to medium-sized projects and act as the primary contact for related tasks. Proficiency in DevOps and an Agile development framework is crucial for this role. In addition to the core requirements, familiarity with cloud computing, particularly AWS or Azure cloud services, is advantageous. The ideal candidate will possess strong problem-solving skills, adaptability to ambiguity, and a quick grasp of new and complex concepts. Experience collaborating with teams within complex organizational structures is preferred. Knowledge of BI tools like MSTR and Tableau, as well as a solid understanding of object-oriented programming and HDFS concepts, will be beneficial. As a member of the team, your responsibilities will include working as a developer with big data, Hadoop, or data warehousing tools and cloud computing. This entails working on Hadoop, Hive SQL, Spark, and other tools within the big data ecosystem. Furthermore, you will create Scala/Spark jobs for data transformation and aggregation, develop unit tests for Spark transformations and helper methods, and design data processing pipelines to streamline operations. If you are a proactive individual with a strong technical background and a passion for leveraging cutting-edge technologies to drive innovation, we encourage you to apply for this exciting opportunity at Cloud Kinetics.
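The posting's pairing of Spark jobs with unit tests reflects a common practice: keep transformation logic in pure functions so it can be tested without a cluster. A minimal plain-Python sketch with hypothetical data (in a Scala/Spark job, the same shape applies to the functions passed to DataFrame or RDD operations):

```python
# Illustrative sketch: a pure transformation function plus its unit test,
# runnable without any cluster. Dates and amounts are hypothetical.

def daily_totals(transactions):
    """Aggregate (date, amount) pairs into per-date totals."""
    totals = {}
    for day, amount in transactions:
        totals[day] = totals.get(day, 0) + amount
    return totals

def test_daily_totals():
    txns = [("2024-01-01", 10), ("2024-01-01", 5), ("2024-01-02", 7)]
    assert daily_totals(txns) == {"2024-01-01": 15, "2024-01-02": 7}

test_daily_totals()
print("ok")  # all assertions passed
```

Testing the logic in isolation like this keeps the expensive integration tests (against a real Spark environment) few and focused.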

Posted 3 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies