2.0 - 6.0 years
8 - 13 Lacs
Pune
Hybrid
Job Title: Scala Developer
Employment Type: Full-time
Experience Level: 2-6 years
Location: Pune - Kalyani Nagar
Shift Timings: General Shift

About Cybage
Cybage Software is a global technology consulting organization headquartered in Pune. With over 7,000 skilled professionals, we deliver dependable and seamless services to clients across the globe. Our presence spans GNR and Hyderabad in India, and internationally the USA, UK, Germany, Ireland, Japan, Canada, Australia, and Singapore. We work with a wide range of industries, including Media & Advertising, Travel & Hospitality, Digital Retail, Healthcare & Life Sciences, Supply Chain & Logistics, and Technology.

About the Role
We are looking for a Scala Developer to take ownership of the design and development of high-performance, scalable applications. The ideal candidate should have strong hands-on experience with Scala and Akka, along with a solid understanding of reactive systems and domain-driven design principles. You will lead development efforts and mentor a team of engineers, ensuring the delivery of robust solutions in a collaborative environment.

Key Responsibilities
- Drive the implementation of functional programming concepts and best practices.
- Design and maintain reactive, event-driven systems with Akka Streams and Akka HTTP (a sketch follows this posting).
- Mentor junior team members and provide technical leadership throughout the SDLC.
- Collaborate with DevOps and QA to ensure CI/CD, testing, and deployment standards are met.
- Write clean, maintainable code and ensure best practices are followed.

Required Skills and Qualifications
- 2-6 years of experience in software development, with at least 2 years in Scala.
- Solid experience with Akka Streams, Akka HTTP, and functional programming.
- Good understanding of domain-driven design and reactive systems.
- Proven ability to lead technical discussions and guide development teams.
- Experience with CI/CD pipelines, code quality tools, and modern development workflows.
- Familiarity with cloud environments (AWS, Azure, or GCP) and containerization (Docker, Kubernetes).

Preferred Qualifications
- Strong communication and collaboration skills.
- Academic Performance: Minimum 60% in any two of the following: Secondary, Higher Secondary, and Graduation. Minimum 55% in the third.
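A minimal, hypothetical sketch of the kind of reactive service this posting describes: an Akka Streams pipeline running alongside an Akka HTTP health endpoint. The service name, port, throttle rate, and event shape are invented, not Cybage's actual design (Akka 2.6+ and Akka HTTP 10.2+ APIs assumed).

```scala
import akka.actor.typed.ActorSystem
import akka.actor.typed.scaladsl.Behaviors
import akka.http.scaladsl.Http
import akka.http.scaladsl.server.Directives._
import akka.stream.scaladsl.{Flow, Sink, Source}

import scala.concurrent.duration._

object PaymentEventService extends App {
  implicit val system: ActorSystem[Nothing] = ActorSystem(Behaviors.empty, "payment-events")

  // A backpressured stream: throttle incoming events and push them through a transform.
  val enrich = Flow[Int].map(id => s"event-$id:processed")
  Source(1 to 1000)
    .throttle(100, 1.second)        // respect downstream capacity
    .via(enrich)
    .runWith(Sink.foreach(println))

  // A minimal HTTP route exposing service health.
  val route = path("health") { get { complete("OK") } }
  Http().newServerAt("0.0.0.0", 8080).bind(route)
}
```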
Posted 1 week ago
15.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description
Publicis Sapient is looking for a Principal Data Scientist to join its Data Science practice. The role is to not only be a trusted advisor to our clients for driving the next generation of innovation in applied machine learning and statistical analysis, but also a leader in advancing the group's capabilities into the future. As part of the team, you will be responsible for leading teams that create data-driven solutions that at their core are driven by relevant learning algorithms. In this role you will educate internal and external teams on conceptual models for problem solving in the machine learning realm and help translate goals and objectives into data-driven solutions. You will enjoy working with some of the most diverse data sets in the world, cutting-edge technology, and the ability to see your insights turned into real business results on a regular basis. The role is critical in helping advance the application of machine learning as a core building block of core market offerings in eCommerce, advertising, AdTech, and business transformation. In addition, you will be responsible for directing analysis that informs and improves the effectiveness of the planning, execution, and optimization of marketing tactics. As an evangelist for data science, you will partner with leaders in various Publicis Sapient divisions, industries, and geographies to ensure that increasingly more of the solutions we bring to the market are data-driven and supported by a strong data sciences group. Core areas of focus for this group include applications in customer segmentation, media and advertising optimization solutions, recommender systems, fraud analytics, personalization systems, and forecasting.

Your Impact
- Design and implement high-performance and robust analytical models in support of product and project objectives
- Research and bring innovations to develop next-generation solutions in core functional areas related to digital marketing & customer experience solution blocks: Content and Commerce, AdTech, Customer Relationship Management (CRM), Campaign Management
- Provide technical thought leadership, coaching, and mentorship in the field of data science while working with engineering and other cross-functional teams
- Help enhance the ML ops platform to deliver cutting-edge Generative AI propositions for multiple industries such as BFSI, Retail, and Healthcare
- Evolve the approach for the application of machine learning to existing program and project disciplines
- Design controlled experiments to measure changes to the new user experience
- Segment customers and markets to improve targeting and messaging of product recommendations and offers (a toy segmentation sketch follows this posting)
- Direct research and evaluation of open-source and vendor solutions in the analytics platform space to guide solutions
- Be responsible for solution and code quality, including providing detailed and constructive design and code reviews
- Help establish standards in machine learning and statistical analysis to ensure consistency in quality across projects and teams, and identify relevant process efficiencies
- Assess client needs and requirements to ensure your team is adopting the appropriate approach to solve client challenges

Qualifications
Your Skills & Experience:
- Ph.D. in Computer Science, Math, Physics, Engineering, Statistics, or another quantitative or computational field; advanced degrees preferred
- 15+ years in the field of applying methods in statistical learning to develop data-driven solutions, preferably in the eCommerce and AdTech domains
- Strong understanding of Gen AI tools and frameworks, including fine-tuning LLMs for different domains, and a basic understanding of LLMOps
- Demonstrated proficiency with various approaches in regression, classification, and cluster analysis
- Must have experience in statistical programming in R, SAS, SPSS, MATLAB, or Python
- Expertise in one or more programming languages: Python, R, Scala
- Expertise in SQL and familiarity with Hive and Pig

A Tip From The Hiring Manager
Ideal candidates will have prior experience in traditional AI; however, recent experience should be in Gen AI, Agentic AI, etc. This person should be highly organized, adapt quickly to change, and be hands-on with code.

Additional Information
- Gender-Neutral Policy
- 18 paid holidays throughout the year
- Generous parental leave and new parent transition program
- Flexible work arrangements
- Employee Assistance Programs to help you with wellness and well-being

Company Description
Publicis Sapient is a digital transformation partner helping established organizations get to their future, digitally enabled state, both in the way they work and the way they serve their customers. We help unlock value through a start-up mindset and modern methods, fusing strategy, consulting, and customer experience with agile engineering and problem-solving creativity. United by our core values and our purpose of helping people thrive in the brave pursuit of next, our 20,000+ people in 53 offices around the world combine experience across technology, data sciences, consulting, and customer obsession to accelerate our clients' businesses through designing the products and services their customers truly value.
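The segmentation work mentioned above could, for example, be prototyped on Spark. Below is a toy sketch, not Publicis Sapient's method: KMeans clustering over invented per-customer features using Spark MLlib.

```scala
import org.apache.spark.ml.clustering.KMeans
import org.apache.spark.ml.feature.VectorAssembler
import org.apache.spark.sql.SparkSession

object CustomerSegments extends App {
  val spark = SparkSession.builder.appName("customer-segments").master("local[*]").getOrCreate()
  import spark.implicits._

  // Hypothetical behavioural features per customer.
  val customers = Seq(
    ("c1", 12.0, 340.0), ("c2", 2.0, 55.0), ("c3", 9.0, 410.0), ("c4", 1.0, 20.0)
  ).toDF("customerId", "monthlyOrders", "monthlySpend")

  // Assemble the numeric columns into a single vector column for MLlib.
  val features = new VectorAssembler()
    .setInputCols(Array("monthlyOrders", "monthlySpend"))
    .setOutputCol("features")
    .transform(customers)

  // Cluster customers into k segments and inspect the assignments.
  val model = new KMeans().setK(2).setSeed(42L).fit(features)
  model.transform(features).select("customerId", "prediction").show()

  spark.stop()
}
```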
Posted 1 week ago
8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Teamwork makes the stream work.

Roku is changing how the world watches TV
Roku is the #1 TV streaming platform in the U.S., Canada, and Mexico, and we've set our sights on powering every television in the world. Roku pioneered streaming to the TV. Our mission is to be the TV streaming platform that connects the entire TV ecosystem. We connect consumers to the content they love, enable content publishers to build and monetize large audiences, and provide advertisers unique capabilities to engage consumers. From your first day at Roku, you'll make a valuable - and valued - contribution. We're a fast-growing public company where no one is a bystander. We offer you the opportunity to delight millions of TV streamers around the world while gaining meaningful experience across a variety of disciplines.

About the Team
Roku pioneered TV streaming and continues to innovate and lead the industry. Continued success relies on investing in the Roku Content Platform, so we deliver a high-quality streaming TV experience at a global scale. As part of our Content Platform team, you join a small group of highly skilled engineers who own significant responsibility in crafting, developing, and maintaining our large-scale backend systems, data pipelines, storage, and processing services. We provide all insights in regard to all content on Roku devices.

About the Role
We are looking for a Senior Software Engineer with vast experience in backend development, data engineering, and data analytics to focus on building the next-level content platform and data intelligence, which empowers Search, Recommendation, and many more critical systems across the Roku platform. This is an excellent role for a senior professional who enjoys a high level of visibility, thrives on having a critical business impact, is able to make critical decisions, and is excited to work on a core data platform component which is crucial for many streaming components at Roku.

What You'll Be Doing
- Work closely with the product management team, content data platform services, and other internal consumer teams to contribute extensively to our content data platform and underlying architecture.
- Build low-latency and optimized streaming and batch data pipelines to enable downstream services (a streaming sketch follows this posting).
- Build and support our microservices-based, event-driven backend systems and data platform.
- Design and build data pipelines for batch, near-real-time, and real-time processing.
- Participate in architecture discussions, influence the product roadmap, and take ownership and responsibility of new projects.

We're excited if you have
- 8+ years of professional experience as a Software Engineer.
- Proficiency in Java/Scala/Python.
- Deep understanding of backend technologies, architecture patterns, and best practices, including microservices, RESTful APIs, message queues, caching, and databases.
- Strong analytical and problem-solving skills, data structures and algorithms, with the ability to translate complex technical requirements into scalable and efficient solutions.
- Experience with microservice and event-driven architectures.
- Experience with Apache Spark and Apache Flink.
- Experience with Big Data frameworks and tools: MapReduce, Hive, Presto, HDFS, YARN, Kafka, etc.
- Experience with Apache Airflow or similar workflow orchestration tooling for ETL.
- Experience with cloud platforms: AWS (preferred), GCP, etc.
- Strong communication and presentation skills.
- BS in Computer Science; MS in Computer Science preferred.
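As a rough illustration of the low-latency pipelines this role describes, here is a hedged Spark Structured Streaming sketch reading events from Kafka. The broker, topic, JSON field, and checkpoint path are placeholders, not Roku's systems.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object ContentEventCounts extends App {
  val spark = SparkSession.builder.appName("content-event-counts").getOrCreate()
  import spark.implicits._

  // Read a stream of content events from Kafka.
  val raw = spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "content-events")
    .load()

  // Parse the Kafka value and count events per content id in 1-minute windows.
  val counts = raw
    .selectExpr("CAST(value AS STRING) AS json", "timestamp")
    .select(get_json_object($"json", "$.contentId").as("contentId"), $"timestamp")
    .withWatermark("timestamp", "5 minutes")
    .groupBy(window($"timestamp", "1 minute"), $"contentId")
    .count()

  // Checkpointing makes the query restartable after failure.
  counts.writeStream
    .outputMode("update")
    .option("checkpointLocation", "/tmp/checkpoints/content-event-counts")
    .format("console")
    .start()
    .awaitTermination()
}
```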
Benefits
Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive benefits include global access to mental health and financial wellness support and resources. Local benefits include statutory and voluntary benefits, which may include healthcare (medical, dental, and vision), life, accident, disability, commuter, and retirement options (401(k)/pension). Our employees can take time off work for vacation and other personal reasons to balance their evolving work and life needs. It's important to note that not every benefit is available in all locations or for every role. For details specific to your location, please consult with your recruiter.

The Roku Culture
Roku is a great place for people who want to work in a fast-paced environment where everyone is focused on the company's success rather than their own. We try to surround ourselves with people who are great at their jobs, who are easy to work with, and who keep their egos in check. We appreciate a sense of humor. We believe that fewer, very talented folks can do more, at less cost, than a larger number of less talented people. We're independent thinkers with big ideas who act boldly, move fast, and accomplish extraordinary things through collaboration and trust. In short, at Roku you'll be part of a company that's changing how the world watches TV.

We have a unique culture that we are proud of. We think of ourselves primarily as problem-solvers, which itself is a two-part idea. We come up with the solution, but the solution isn't real until it is built and delivered to the customer. That penchant for action gives us a pragmatic approach to innovation, one that has served us well since 2002. To learn more about Roku, our global footprint, and how we've grown, visit https://www.weareroku.com/factsheet.

By providing your information, you acknowledge that you have read our Applicant Privacy Notice and authorize Roku to process your data subject to those terms.
Posted 1 week ago
12.0 - 16.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Modern Data Platform: Data Architect

Your Opportunity
You will lead the Modern Data Platform Practice, which involves providing solutions to customers on traditional data warehouses and Big Data platforms, on-premises and in the cloud. The work covers architecting the data platforms, defining data engineering design, and choosing appropriate technology and tools across on-prem and cloud services. You will help the organization strengthen its Modern Data Platform capabilities, lead pre-sales discussions on data platforms, provide the technology architecture in RFP responses, and lead technology POCs/MVPs.

Context
In the modern banking age, financial institutions need to bring classical data drivers and evolving business drivers together in a single platform. These drivers also need to communicate with each other and share data products for enterprise consumption. Traditional data platforms handle classical data drivers well but fail to communicate with evolving business drivers due to limitations of technologies and implementation approaches. A Modern Data Platform helps to fill this gap and take the business to the next level of growth and expansion using data-driven approaches. The technology transformation of recent years makes such implementations feasible.

Your Qualifications
We expect you to have the following qualifications and experience to be able to effectively perform the suggested role:
- A technology leader with an engineering academic background in Computer Science / Information Technology / Data Technologies [BE/BTech/MCA]
- Overall 12-16 years of data engineering and analytics experience, as an individual contributor as well as a technology/architecture lead
- A minimum of 5-7 years of hands-on experience in Big Data systems across on-prem and cloud environments
- Should have led data platform architecture and design projects for mid- to large-size firms
- Experience implementing batch data and streaming/online data integrations using third-party tools and custom programs
- Good hands-on experience with SQL and one of the following programming languages: Core Java / Scala / Python
- Good hands-on experience with Kafka for enabling event-driven data pipes/processing (a minimal sketch follows this posting)
- Knowledge of leading data services offered by AWS, Azure, Snowflake, and Confluent
- Thorough understanding of distributed computing and related data structures
- Should have implemented data governance and quality capabilities for a data platform (on-prem and/or cloud)
- Good analytical and presentation skills
- Experience in building a team from the ground up
- Good exposure to leading RDBMS technologies and data visualization platforms
- Should have demonstrated AI/ML models for data processing and generating insights for end users
- A great teammate with the ability to work on their own initiative with minimal direction
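As referenced in the Kafka item above, here is a minimal sketch of event-driven data piping using the Kafka Java client from Scala. The broker address, topic, group id, and payload are placeholders, not the practice's actual design.

```scala
import java.time.Duration
import java.util.Properties

import org.apache.kafka.clients.consumer.KafkaConsumer
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

import scala.jdk.CollectionConverters._

object EventPipe extends App {
  val common = new Properties()
  common.put("bootstrap.servers", "broker:9092") // placeholder broker

  // Producer: publish a change event onto the pipe.
  val producerProps = new Properties()
  producerProps.putAll(common)
  producerProps.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
  producerProps.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
  val producer = new KafkaProducer[String, String](producerProps)
  producer.send(new ProducerRecord("account-events", "acct-42", """{"balance":100}"""))
  producer.close()

  // Consumer: a downstream processor subscribed to the same topic.
  val consumerProps = new Properties()
  consumerProps.putAll(common)
  consumerProps.put("group.id", "balance-projector")
  consumerProps.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
  consumerProps.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
  val consumer = new KafkaConsumer[String, String](consumerProps)
  consumer.subscribe(List("account-events").asJava)
  consumer.poll(Duration.ofSeconds(1)).asScala.foreach(r => println(s"${r.key} -> ${r.value}"))
  consumer.close()
}
```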
Posted 1 week ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
This role is for one of Weekday's clients
Salary range: Rs 2200000 - Rs 2400000 (i.e., INR 22-24 LPA)
Min Experience: 5 years
Location: Pune, Bengaluru, Chennai, Kolkata, Gurgaon
JobType: full-time

We are looking for an experienced Snowflake Developer to join our Data Engineering team. The ideal candidate will possess a deep understanding of Data Warehousing, SQL, ETL tools like Informatica, and visualization platforms such as Power BI. This role involves building scalable data pipelines, optimizing data architectures, and collaborating with cross-functional teams to deliver impactful data solutions.

Requirements

Key Responsibilities
- Data Engineering & Warehousing: Leverage over 5 years of hands-on experience in Data Engineering with a focus on Data Warehousing and Business Intelligence.
- Pipeline Development: Design and maintain ELT pipelines using Snowflake, Fivetran, and dbt to ingest and transform data from multiple sources.
- SQL Development: Write and optimize complex SQL queries and stored procedures to support robust data transformations and analytics.
- Data Modeling & ELT: Implement advanced data modeling practices, including SCD Type-2, and build high-performance ELT workflows using dbt (a concept sketch of SCD Type-2 follows this posting).
- Requirement Analysis: Partner with business stakeholders to capture data needs and convert them into scalable technical solutions.
- Data Quality & Troubleshooting: Conduct root cause analysis on data issues, maintain high data integrity, and ensure reliability across systems.
- Collaboration & Documentation: Collaborate with engineering and business teams. Develop and maintain thorough documentation for pipelines, data models, and processes.

Skills & Qualifications
- Expertise in Snowflake for large-scale data warehousing and ELT operations.
- Strong SQL skills with the ability to create and manage complex queries and procedures.
- Proven experience with Informatica PowerCenter for ETL development.
- Proficiency with Power BI for data visualization and reporting.
- Hands-on experience with Fivetran for automated data integration.
- Familiarity with dbt, Sigma Computing, Tableau, and Oracle.
- Solid understanding of data analysis, requirement gathering, and source-to-target mapping.
- Knowledge of cloud ecosystems such as Azure (including ADF, Databricks); experience with AWS or GCP is a plus.
- Experience with workflow orchestration tools like Airflow, Azkaban, or Luigi.
- Proficiency in Python for scripting and data processing (Java or Scala is a plus).
- Bachelor's or Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related field.

Key Tools & Technologies
- Snowflake, SnowSQL, Snowpark, SQL
- Informatica, Power BI, dbt
- Python, Fivetran, Sigma Computing, Tableau
- Airflow, Azkaban, Azure, Databricks, ADF
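SCD (slowly changing dimension) Type-2, named in the data modeling item above, keeps full history by expiring old rows and appending new versions rather than updating in place. A concept-only sketch in plain Scala follows; production versions would typically live in dbt/SQL inside Snowflake, and all names here are invented.

```scala
import java.time.LocalDate

// One versioned row of a dimension table: valid from `from` until `to` (None = open).
case class DimRow(key: String, attr: String, from: LocalDate, to: Option[LocalDate], current: Boolean)

object Scd2 {
  // Apply a batch of incoming attribute values: close changed rows, append new versions.
  def apply(existing: Seq[DimRow], incoming: Map[String, String], asOf: LocalDate): Seq[DimRow] = {
    val (open, closed) = existing.partition(_.current)
    val updated = open.flatMap { row =>
      incoming.get(row.key) match {
        case Some(v) if v != row.attr =>
          Seq(row.copy(to = Some(asOf), current = false),     // expire the old version
              DimRow(row.key, v, asOf, None, current = true)) // open a new version
        case _ => Seq(row)                                    // unchanged
      }
    }
    val newKeys = incoming.keySet -- existing.map(_.key).toSet
    closed ++ updated ++ newKeys.toSeq.map(k => DimRow(k, incoming(k), asOf, None, current = true))
  }
}
```

For example, applying Map("cust-1" -> "gold") as of today to a table whose current row for cust-1 says "silver" yields two rows for that key: the expired "silver" version and a new open "gold" version.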
Posted 1 week ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Join our Team

About this opportunity:
At Ericsson, we are offering a fantastic opportunity for a passionate and motivated Solution Architect to join our dynamic and diverse team. In this role, you will contribute to the design, construction, and management of Ericsson-based solutions. Familiarity with big data technologies, agile methodology, and practices constitutes an integral part of the role.

What you will do:
- Manage the overall operations of multiple solutions deployed within the customer environment.
- Engage customers to secure agreements on the proposed solutions.
- Prepare technical presentations and proposals, and conduct walkthroughs with customers.
- Lead the technical risk analysis and assist the Program Manager/Program Director in the overall risk analysis process.
- Manage internal and external stakeholders to identify and bridge gaps.
- Identify new business opportunities.
- Lead the delivery team by assigning tasks and reviewing progress.
- Lead User Acceptance Testing (UAT) for the customer.
- Manage the L1, L2, L3, and CNS (Support) teams, as well as the customer's Operations and Maintenance (O&M) team.
- Identify scope creep and change requests during the delivery phase.
- Support pre-sales activities.
- Prepare effort estimations.
- Lead customer presentations and demonstrations.
- Interface with third-party providers (3PP) and original equipment manufacturers (OEMs) to evaluate and integrate their solutions into Ericsson's offerings.
- Act as a Solution Lifecycle Manager for the proposed or implemented solution.
- Proactively develop competence in new solution areas within the domain and technologies.
- Mentor solution integrators, developers, and system architects, providing a transparent and open environment for growth and development.

The skills you bring:
- Experience in architecting large-size products, microservice architecture, and database models
- Strong experience in development within the NMS/EMS telecom domain
- Understanding of OSS/NMS-related standards
- Understanding of and experience in telecommunications technologies
- Experience in network management concepts, including inventory management, fault management, performance management, and configuration management
- Experience with network management protocols, including SNMP, XML, REST/JSON, TL1, and ASCII
- Experience in the Software Development Life Cycle
- Proficiency in software architecture, application design, development, and implementation using the technologies below:
  - Programming & scripting: Java, JavaScript, Shell, Python
  - Big Data: Apache Spark, Scala
  - Microservices
  - CI/CD
  - Containerization/Docker
  - Databases: Postgres, MySQL, Cassandra, MongoDB, Elasticsearch
  - Tools: Git, Maven, Gradle, Docker, Jenkins, JMeter, JIRA

Why join Ericsson?
At Ericsson, you'll have an outstanding opportunity. The chance to use your skills and imagination to push the boundaries of what's possible. To build solutions never seen before to some of the world's toughest problems. You'll be challenged, but you won't be alone. You'll be joining a team of diverse innovators, all driven to go beyond the status quo to craft what comes next.

What happens once you apply?
Click Here to find all you need to know about what our typical hiring process looks like.

Encouraging a diverse and inclusive organization is core to our values at Ericsson, which is why we champion it in everything we do. We truly believe that by collaborating with people with different experiences we drive innovation, which is essential for our future growth. We encourage people from all backgrounds to apply and realize their full potential as part of our Ericsson team. Ericsson is proud to be an Equal Opportunity Employer.

Primary country and city: India (IN) || Mumbai
Req ID: 767286
Posted 1 week ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
This role is for one of Weekday's clients
Salary range: Rs 2200000 - Rs 2400000 (i.e., INR 22-24 LPA)
Min Experience: 5 years
Location: Pune, Bengaluru, Chennai, Kolkata, Gurgaon
JobType: full-time

We are looking for an experienced Snowflake Developer to join our Data Engineering team. The ideal candidate will possess a deep understanding of Data Warehousing, SQL, ETL tools like Informatica, and visualization platforms such as Power BI. This role involves building scalable data pipelines, optimizing data architectures, and collaborating with cross-functional teams to deliver impactful data solutions.

Requirements

Key Responsibilities
- Data Engineering & Warehousing: Leverage over 5 years of hands-on experience in Data Engineering with a focus on Data Warehousing and Business Intelligence.
- Pipeline Development: Design and maintain ELT pipelines using Snowflake, Fivetran, and dbt to ingest and transform data from multiple sources.
- SQL Development: Write and optimize complex SQL queries and stored procedures to support robust data transformations and analytics.
- Data Modeling & ELT: Implement advanced data modeling practices, including SCD Type-2, and build high-performance ELT workflows using dbt.
- Requirement Analysis: Partner with business stakeholders to capture data needs and convert them into scalable technical solutions.
- Data Quality & Troubleshooting: Conduct root cause analysis on data issues, maintain high data integrity, and ensure reliability across systems.
- Collaboration & Documentation: Collaborate with engineering and business teams. Develop and maintain thorough documentation for pipelines, data models, and processes.

Skills & Qualifications
- Expertise in Snowflake for large-scale data warehousing and ELT operations.
- Strong SQL skills with the ability to create and manage complex queries and procedures.
- Proven experience with Informatica PowerCenter for ETL development.
- Proficiency with Power BI for data visualization and reporting.
- Hands-on experience with Fivetran for automated data integration.
- Familiarity with dbt, Sigma Computing, Tableau, and Oracle.
- Solid understanding of data analysis, requirement gathering, and source-to-target mapping.
- Knowledge of cloud ecosystems such as Azure (including ADF, Databricks); experience with AWS or GCP is a plus.
- Experience with workflow orchestration tools like Airflow, Azkaban, or Luigi.
- Proficiency in Python for scripting and data processing (Java or Scala is a plus).
- Bachelor's or Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related field.

Key Tools & Technologies
- Snowflake, SnowSQL, Snowpark, SQL
- Informatica, Power BI, dbt
- Python, Fivetran, Sigma Computing, Tableau
- Airflow, Azkaban, Azure, Databricks, ADF
Posted 1 week ago
5.0 years
0 Lacs
Greater Kolkata Area
On-site
This role is for one of Weekday's clients
Salary range: Rs 2200000 - Rs 2400000 (i.e., INR 22-24 LPA)
Min Experience: 5 years
Location: Pune, Bengaluru, Chennai, Kolkata, Gurgaon
JobType: full-time

We are looking for an experienced Snowflake Developer to join our Data Engineering team. The ideal candidate will possess a deep understanding of Data Warehousing, SQL, ETL tools like Informatica, and visualization platforms such as Power BI. This role involves building scalable data pipelines, optimizing data architectures, and collaborating with cross-functional teams to deliver impactful data solutions.

Requirements

Key Responsibilities
- Data Engineering & Warehousing: Leverage over 5 years of hands-on experience in Data Engineering with a focus on Data Warehousing and Business Intelligence.
- Pipeline Development: Design and maintain ELT pipelines using Snowflake, Fivetran, and dbt to ingest and transform data from multiple sources.
- SQL Development: Write and optimize complex SQL queries and stored procedures to support robust data transformations and analytics.
- Data Modeling & ELT: Implement advanced data modeling practices, including SCD Type-2, and build high-performance ELT workflows using dbt.
- Requirement Analysis: Partner with business stakeholders to capture data needs and convert them into scalable technical solutions.
- Data Quality & Troubleshooting: Conduct root cause analysis on data issues, maintain high data integrity, and ensure reliability across systems.
- Collaboration & Documentation: Collaborate with engineering and business teams. Develop and maintain thorough documentation for pipelines, data models, and processes.

Skills & Qualifications
- Expertise in Snowflake for large-scale data warehousing and ELT operations.
- Strong SQL skills with the ability to create and manage complex queries and procedures.
- Proven experience with Informatica PowerCenter for ETL development.
- Proficiency with Power BI for data visualization and reporting.
- Hands-on experience with Fivetran for automated data integration.
- Familiarity with dbt, Sigma Computing, Tableau, and Oracle.
- Solid understanding of data analysis, requirement gathering, and source-to-target mapping.
- Knowledge of cloud ecosystems such as Azure (including ADF, Databricks); experience with AWS or GCP is a plus.
- Experience with workflow orchestration tools like Airflow, Azkaban, or Luigi.
- Proficiency in Python for scripting and data processing (Java or Scala is a plus).
- Bachelor's or Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related field.

Key Tools & Technologies
- Snowflake, SnowSQL, Snowpark, SQL
- Informatica, Power BI, dbt
- Python, Fivetran, Sigma Computing, Tableau
- Airflow, Azkaban, Azure, Databricks, ADF
Posted 1 week ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About the Company
We provide companies with innovative technology solutions for everyday business problems. Our passion is to help clients become intelligent, information-driven organizations, where fact-based decision-making is embedded into daily operations, leading to better processes and outcomes.

Experience: 7 to 10 Years

Must-Have Qualifications
- Experience in handling team members
- Proficiency in working with cloud platforms (AWS, Azure, GCP)
- Experience in SQL, NoSQL, and data modelling
- Experience in Python programming
- Experience in design, development, and deployment of data architecture

Experience
- 8+ years of experience in data engineering with hands-on expertise in data pipeline development, architecture, and system optimization
- Demonstrated success in managing global teams, especially across US and India time zones
- Proven track record in leading data engineering teams and managing end-to-end project delivery
- Strong background in data warehousing and familiarity with tools such as Matillion, dbt, Striim, etc.
- 2+ years of hands-on experience designing and developing data integration solutions using Matillion and/or dbt
- Expertise in architecting data solutions, having successfully implemented at least two end-to-end projects with multiple transformation layers

Technical Skills
- Lead the design, development, and deployment of scalable data architectures, pipelines, and processes tailored to client needs
- Expertise in programming languages such as Python, Scala, or Java
- Proficiency in designing and delivering data pipelines in cloud data warehouses (e.g., Snowflake, Redshift), using various ETL/ELT tools such as Matillion, dbt, Striim, etc.
- Solid understanding of database systems (relational and NoSQL) and data modeling techniques
- Strong knowledge of data engineering and integration frameworks
- Good grasp of coding standards, with the ability to define standards and testing strategies for projects
- Proficiency in working with cloud platforms (AWS, Azure, GCP) and associated data services
- Enthusiasm for working in Agile methodology
- Comprehensive understanding of the DevOps process, including GitHub integration and CI/CD pipelines
Posted 1 week ago
6.0 years
0 Lacs
India
On-site
Overview

Working at Atlassian
Atlassians can choose where they work, whether in an office, from home, or a combination of the two. That way, Atlassians have more control over supporting their family, personal goals, and other priorities. We can hire people in any country where we have a legal entity. Interviews and onboarding are conducted virtually, as part of being a distributed-first company.

Responsibilities

What you'll do
- Build and ship features and capabilities daily in a highly scalable, cross-geo distributed environment
- Be part of an amazing open and collaborative work environment with other experienced engineers, architects, product managers, and designers
- Review code with best practices of readability, testing patterns, documentation, reliability, security, and performance considerations in mind
- Mentor and level up the skills of your teammates by sharing your expertise in formal and informal knowledge-sharing sessions
- Ensure full visibility, error reporting, and monitoring of high-performing backend services
- Participate in Agile software development, including daily stand-ups, sprint planning, team retrospectives, and show-and-tell demo sessions

Qualifications

Your background
- 6+ years of experience building and developing backend applications
- Bachelor's or Master's degree, with a preference for a Computer Science degree
- Experience crafting and implementing highly scalable and performant RESTful micro-services
- Proficiency in any modern object-oriented programming language (e.g., Java, Kotlin, Go, Scala, Python, etc.)
- Fluency in any one database technology (e.g., RDBMS like Oracle or Postgres and/or NoSQL like DynamoDB or Cassandra)
- Real passion for collaboration and strong interpersonal and communication skills
- Broad knowledge and understanding of the SaaS, PaaS, and IaaS industry, with hands-on experience of public cloud offerings (AWS, GAE, Azure)
- Familiarity with cloud architecture patterns and an engineering discipline to produce software with quality

Our perks & benefits
Atlassian offers a variety of perks and benefits to support you and your family and to help you engage with your local community. Our offerings include health coverage, paid volunteer days, wellness resources, and so much more. Visit go.atlassian.com/perksandbenefits to learn more.

About Atlassian
At Atlassian, we're motivated by a common goal: to unleash the potential of every team. Our software products help teams all over the planet, and our solutions are designed for all types of work. Team collaboration through our tools makes what may be impossible alone, possible together. We believe that the unique contributions of all Atlassians create our success. To ensure that our products and culture continue to incorporate everyone's perspectives and experience, we never discriminate based on race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status. All your information will be kept confidential according to EEO guidelines. To provide you the best experience, we can support you with accommodations or adjustments at any stage of the recruitment process. Simply inform our Recruitment team during your conversation with them. To learn more about our culture and hiring process, visit go.atlassian.com/crh.
Posted 1 week ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Title: Azure Data
Total IT Experience (in Yrs.): 4 to 7
Location: Indore / Pune
Relevant Experience Required (in Yrs.): 3+ years of direct experience in analyzing and deriving source systems, data governance, metadata management, data architecture, data quality, and metadata-related output. Strong experience in different types of data analysis covering business data, metadata, master data, and analytical data.
Language Requirement: English
Keywords to search in resume: Databricks, Azure Data Factory

Must-Have Technical/Functional Skills
- 3+ years of hands-on experience with Databricks
- This role will be responsible for conducting an assessment of the existing systems in the landscape
- Devise a strategy for SAS-to-Databricks migration activities
- Work out a plan to perform the above activities
- Work closely with the customer on a daily basis and present the progress made and the plan of action
- Interact with onsite and offshore Cognizant associates to ensure that the project deliverables are on track

Secondary Skills
- Data management solutions with capabilities such as data ingestion, data curation, metadata and catalog, data security, data modeling, and data wrangling

Responsibilities
- Hands-on experience in installing, configuring, and using MS Azure Databricks and Hadoop ecosystem components like DBFS, Parquet, Delta Tables, HDFS, MapReduce programming, Kafka, Spark & Event Hub.
- In-depth understanding of Spark architecture, including Spark Core, Spark SQL, DataFrames, Spark Streaming, RDD caching, and Spark MLlib.
- Hands-on experience in scripting languages like Scala & Python.
- Hands-on experience in the analysis, design, coding, and testing phases of the SDLC, with best practices.
- Expertise in using Spark SQL with various data sources like JSON, Parquet, and key-value pairs (see the sketch after this posting).
- Experience in creating tables, partitioning, bucketing, loading, and aggregating data using Spark SQL/Scala.
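A hedged example of the Spark SQL/Scala work listed above: read raw JSON, write a date-partitioned Parquet table, then aggregate it with Spark SQL. The DBFS paths and column names are placeholders.

```scala
import org.apache.spark.sql.SparkSession

object OrdersBatch extends App {
  val spark = SparkSession.builder.appName("orders-batch").getOrCreate()

  // Load raw JSON and persist it as a partitioned Parquet table.
  val orders = spark.read.json("dbfs:/raw/orders/") // hypothetical DBFS path on Databricks
  orders.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("dbfs:/curated/orders/")

  // Aggregate over the curated data with Spark SQL.
  spark.read.parquet("dbfs:/curated/orders/").createOrReplaceTempView("orders")
  spark.sql(
    """SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
      |FROM orders GROUP BY order_date ORDER BY order_date""".stripMargin
  ).show()
}
```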
Posted 1 week ago
8.0 - 11.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Company Description

About Sopra Steria
Sopra Steria, a major tech player in Europe with 50,000 employees in nearly 30 countries, is recognised for its consulting, digital services, and solutions. It helps its clients drive their digital transformation and obtain tangible and sustainable benefits. The Group provides end-to-end solutions to make large companies and organisations more competitive by combining in-depth knowledge of a wide range of business sectors and innovative technologies with a collaborative approach. Sopra Steria places people at the heart of everything it does and is committed to putting digital to work for its clients in order to build a positive future for all. In 2024, the Group generated revenues of €5.8 billion.

Job Description
The world is how we shape it.

Position: Snowflake - Senior Technical Lead
Experience: 8-11 years
Location: Noida / Bangalore
Education: B.E. / B.Tech. / MCA
Primary Skills: Snowflake, Snowpipe, SQL, Data Modelling, DV 2.0, Data Quality, AWS, Snowflake Security
Good-to-have Skills: Snowpark, Data Build Tool, Finance Domain

Preferred Skills
- Experience with Snowflake-specific features: Snowpipe, Streams & Tasks, Secure Data Sharing.
- Experience in data warehousing, with at least 2 years focused on Snowflake.
- Hands-on expertise in SQL, Snowflake scripting (JavaScript UDFs), and Snowflake administration.
- Proven experience with ETL/ELT tools (e.g., dbt, Informatica, Talend, Matillion) and orchestration frameworks.
- Deep knowledge of data modeling techniques (star schema, data vault) and performance tuning.
- Familiarity with data security, compliance requirements, and governance best practices.
- Experience in Python, Scala, or Java for Snowpark development (a Snowpark sketch follows this posting).
- Strong understanding of cloud platforms (AWS, Azure, or GCP) and related services (S3, ADLS, IAM).

Key Responsibilities
- Define data partitioning, clustering, and micro-partition strategies to optimize performance and cost.
- Lead the implementation of ETL/ELT processes using Snowflake features (Streams, Tasks, Snowpipe).
- Automate schema migrations, deployments, and pipeline orchestration (e.g., with dbt, Airflow, or Matillion).
- Monitor query performance and resource utilization; tune warehouses, caching, and clustering.
- Implement workload isolation (multi-cluster warehouses, resource monitors) for concurrent workloads.
- Define and enforce role-based access control (RBAC), masking policies, and object tagging.
- Ensure data encryption, compliance (e.g., GDPR, HIPAA), and audit logging are correctly configured.
- Establish best practices for dimensional modeling, data vault architecture, and data quality.
- Create and maintain data dictionaries, lineage documentation, and governance standards.
- Partner with business analysts and data scientists to understand requirements and deliver analytics-ready datasets.
- Stay current with Snowflake feature releases (e.g., Snowpark, Native Apps) and propose adoption strategies.
- Contribute to the long-term data platform roadmap and cloud cost-optimization initiatives.

Qualifications
B.Tech / MCA

Additional Information
At our organization, we are committed to fighting against all forms of discrimination. We foster a work environment that is inclusive and respectful of all differences. All of our positions are open to people with disabilities.
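For the Snowpark item above, a minimal Snowpark-for-Scala sketch is shown below. Connection parameters, table, and column names are placeholders; the point is that the filter and aggregation are pushed down to Snowflake rather than executed client-side.

```scala
import com.snowflake.snowpark._
import com.snowflake.snowpark.functions._

object SnowparkDemo extends App {
  // Placeholder credentials; real deployments would use a secrets store.
  val session = Session.builder.configs(Map(
    "URL"       -> "https://<account>.snowflakecomputing.com",
    "USER"      -> "<user>",
    "PASSWORD"  -> "<password>",
    "WAREHOUSE" -> "<warehouse>",
    "DB"        -> "<db>",
    "SCHEMA"    -> "<schema>"
  )).create

  // The DataFrame operations compile to SQL executed inside Snowflake.
  session.table("fct_trades")
    .filter(col("trade_date") >= lit("2024-01-01"))
    .groupBy(col("desk"))
    .agg(sum(col("notional")).as("total_notional"))
    .show()

  session.close()
}
```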
Posted 1 week ago
6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We're hiring for one of the world's leading professional services firms, renowned for its commitment to innovation, excellence, and global impact. With a presence in over 150 countries, this organization provides services across consulting, audit, tax, risk advisory, and financial advisory, helping Fortune 500 companies and governments navigate complex challenges.

Job Title: Big Data Developer
Employment Type: Full-Time Employee (FTE)
Location: PAN India
Experience: 6+ years

About the Role:
We are seeking a highly skilled Big Data Developer with strong expertise in Spark and Scala to join our dynamic team. The ideal candidate will have hands-on experience with cloud platforms such as AWS, Azure, or GCP for big data processing and storage solutions. You will play a critical role in designing, developing, and maintaining scalable data pipelines and backend services using modern big data technologies.

Key Responsibilities:
- Develop, optimize, and maintain large-scale data processing pipelines using Apache Spark and Scala
- Implement and manage cloud-based big data storage and processing solutions on Azure Data Lake Storage (ADLS) and Azure Databricks
- Collaborate with cross-functional teams to understand data requirements and deliver scalable backend services using Java and the Spring Boot framework
- Ensure best practices in data security, performance optimization, and code quality
- Troubleshoot and resolve production issues related to big data workflows and backend services
- Continuously evaluate emerging technologies and propose enhancements to current systems

Must-Have Qualifications:
- 6+ years of experience in Big Data development
- Strong expertise in Apache Spark and Scala for data processing
- Hands-on experience with cloud platforms such as AWS, Azure, or GCP, with a strong focus on Azure Data Lake Storage (ADLS) and Azure Databricks
- Proficiency in backend development using Java and the Spring Boot framework
- Experience in designing and implementing scalable and fault-tolerant data pipelines
- Solid understanding of big data architectures, ETL processes, and data modeling
- Excellent problem-solving skills and ability to work in an agile environment

Preferred Skills:
- Familiarity with containerization and orchestration tools like Docker and Kubernetes
- Knowledge of streaming technologies such as Kafka
- Experience with CI/CD pipelines and automated testing frameworks

What We Offer:
- Competitive salary based on experience and skills
- Flexible working options with PAN India presence
- Opportunity to work with cutting-edge big data technologies in a growing and innovative company
- Collaborative and supportive work culture with career growth opportunities
Posted 1 week ago
2.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Join the EDG team as a Full Stack Software Engineer. The EDG team is responsible for improving the consumer experience by implementing an enterprise device gateway to manage device health signal acquisition, centralize consumer consent, facilitate efficient health signal distribution, and empower UHC with connected insights across the health and wellness ecosystem. The team has a strong and integrated relationship with the product team, based on strong collaboration, trust, and partnership. Goals for the team are focused on creating meaningful positive impact for our customers through clear and measurable metrics analysis.

Primary Responsibilities
- Write high-quality, fault-tolerant code; normally 70% backend and 30% front-end (though the exact ratio will depend on your interest)
- Build high-scale systems, libraries, and frameworks, and create test plans (a small ScalaTest sketch follows this posting)
- Monitor production systems and provide on-call support
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications
- BS in Computer Science, Engineering, or a related technical field, or equivalent experience
- 2+ years of experience with JS libraries and frameworks, such as Angular, React, or others
- 2+ years of experience in Scala, Java, or another compiled language

Preferred Qualifications
- Experience with web design
- Experience using RESTful APIs and asynchronous JS
- Experience in design and development
- Testing experience with Scala or Java
- Database and caching experience, SQL and NoSQL (Postgres, Elasticsearch, or MongoDB)
- Proven interest in learning Scala

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location, and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
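For the testing item above, here is a tiny ScalaTest example (3.x AnyFunSuite style) of the kind of unit test this role might write. The pricing function and its rules are invented.

```scala
import org.scalatest.funsuite.AnyFunSuite

object Pricing {
  // Apply a percentage discount to an amount.
  def applyDiscount(amount: BigDecimal, percent: Int): BigDecimal =
    amount * (100 - percent) / 100
}

class PricingSpec extends AnyFunSuite {
  test("a 10% discount reduces 200 to 180") {
    assert(Pricing.applyDiscount(BigDecimal(200), 10) == BigDecimal(180))
  }

  test("a 0% discount leaves the amount unchanged") {
    assert(Pricing.applyDiscount(BigDecimal(99.50), 0) == BigDecimal(99.50))
  }
}
```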
Posted 1 week ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We're hiring for one of the world's leading professional services firms, renowned for its commitment to innovation, excellence, and global impact. With a presence in over 150 countries, this organization provides services across consulting, audit, tax, risk advisory, and financial advisory, helping Fortune 500 companies and governments navigate complex challenges.

Job Title: Big Data Developer
Employment Type: Full-Time Employee (FTE)
Location: PAN India
Experience: 6+ years

About the Role:
We are seeking a highly skilled Big Data Developer with strong expertise in Spark and Scala to join our dynamic team. The ideal candidate will have hands-on experience with cloud platforms such as AWS, Azure, or GCP for big data processing and storage solutions. You will play a critical role in designing, developing, and maintaining scalable data pipelines and backend services using modern big data technologies.

Key Responsibilities:
- Develop, optimize, and maintain large-scale data processing pipelines using Apache Spark and Scala
- Implement and manage cloud-based big data storage and processing solutions on Azure Data Lake Storage (ADLS) and Azure Databricks
- Collaborate with cross-functional teams to understand data requirements and deliver scalable backend services using Java and the Spring Boot framework
- Ensure best practices in data security, performance optimization, and code quality
- Troubleshoot and resolve production issues related to big data workflows and backend services
- Continuously evaluate emerging technologies and propose enhancements to current systems

Must-Have Qualifications:
- 6+ years of experience in Big Data development
- Strong expertise in Apache Spark and Scala for data processing
- Hands-on experience with cloud platforms such as AWS, Azure, or GCP, with a strong focus on Azure Data Lake Storage (ADLS) and Azure Databricks
- Proficiency in backend development using Java and the Spring Boot framework
- Experience in designing and implementing scalable and fault-tolerant data pipelines
- Solid understanding of big data architectures, ETL processes, and data modeling
- Excellent problem-solving skills and ability to work in an agile environment

Preferred Skills:
- Familiarity with containerization and orchestration tools like Docker and Kubernetes
- Knowledge of streaming technologies such as Kafka
- Experience with CI/CD pipelines and automated testing frameworks

What We Offer:
- Competitive salary based on experience and skills
- Flexible working options with PAN India presence
- Opportunity to work with cutting-edge big data technologies in a growing and innovative company
- Collaborative and supportive work culture with career growth opportunities
Posted 1 week ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
We're hiring for one of the world's leading professional services firms, renowned for its commitment to innovation, excellence, and global impact. With a presence in over 150 countries, this organization provides services across consulting, audit, tax, risk advisory, and financial advisory, helping Fortune 500 companies and governments navigate complex challenges.

Job Title: Big Data Developer
Employment Type: Full-Time Employee (FTE)
Location: PAN India
Experience: 6+ years

About the Role:
We are seeking a highly skilled Big Data Developer with strong expertise in Spark and Scala to join our dynamic team. The ideal candidate will have hands-on experience with cloud platforms such as AWS, Azure, or GCP for big data processing and storage solutions. You will play a critical role in designing, developing, and maintaining scalable data pipelines and backend services using modern big data technologies.

Key Responsibilities:
- Develop, optimize, and maintain large-scale data processing pipelines using Apache Spark and Scala
- Implement and manage cloud-based big data storage and processing solutions on Azure Data Lake Storage (ADLS) and Azure Databricks
- Collaborate with cross-functional teams to understand data requirements and deliver scalable backend services using Java and the Spring Boot framework
- Ensure best practices in data security, performance optimization, and code quality
- Troubleshoot and resolve production issues related to big data workflows and backend services
- Continuously evaluate emerging technologies and propose enhancements to current systems

Must-Have Qualifications:
- 6+ years of experience in Big Data development
- Strong expertise in Apache Spark and Scala for data processing
- Hands-on experience with cloud platforms such as AWS, Azure, or GCP, with a strong focus on Azure Data Lake Storage (ADLS) and Azure Databricks
- Proficiency in backend development using Java and the Spring Boot framework
- Experience in designing and implementing scalable and fault-tolerant data pipelines
- Solid understanding of big data architectures, ETL processes, and data modeling
- Excellent problem-solving skills and ability to work in an agile environment

Preferred Skills:
- Familiarity with containerization and orchestration tools like Docker and Kubernetes
- Knowledge of streaming technologies such as Kafka
- Experience with CI/CD pipelines and automated testing frameworks

What We Offer:
- Competitive salary based on experience and skills
- Flexible working options with PAN India presence
- Opportunity to work with cutting-edge big data technologies in a growing and innovative company
- Collaborative and supportive work culture with career growth opportunities
Posted 1 week ago
8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
The Role:
We are looking for an enthusiastic Staff Data Scientist to join our growing team. The hire will be responsible for working in collaboration with other data scientists and engineers across the organization to develop production-quality models for a variety of problems across Razorpay. Some possible problems include: making recommendations to merchants from Razorpay's suite of products, cost optimization of transactions for merchants, and automatic address disambiguation/correction to enable tracking customer purchases using advanced natural language processing techniques. As part of the DS team @ Razorpay, you'll work with some of the smartest engineers/architects/data scientists in the industry and have the opportunity to solve complex and critical problems for Razorpay.

Responsibilities:
- Lead the data science team, providing guidance, mentorship, and technical expertise to drive impactful outcomes.
- Apply advanced data science methodologies, mathematics, and machine learning techniques to solve intricate and strategic business problems.
- Collaborate closely with cross-functional teams, including engineers, product managers, and business stakeholders, to develop and deploy robust data science solutions.
- Conduct in-depth analysis of large and complex datasets to extract valuable insights and drive actionable recommendations.
- Present findings, insights, and strategic recommendations to senior stakeholders in a clear and concise manner.
- Identify key metrics and develop executive-level dashboards to monitor performance and support data-driven decision-making.
- Oversee multiple projects concurrently, ensuring high-quality deliverables within defined timelines.
- Train and mentor junior data scientists, fostering their professional growth and a collaborative and innovative team environment.
- Continuously improve and optimize data science solutions, evaluating their effectiveness and exploring new methodologies and technologies.
- Drive the deployment of data-driven solutions into production, ensuring seamless integration and effective communication of results.

Mandatory Qualifications:
- 8+ years of experience in a data science role, with a track record of delivering impactful solutions in a production environment.
- Advanced degree (Master's or Ph.D.) in a quantitative field such as Computer Science, Statistics, Mathematics, or related disciplines.
- Deep knowledge and expertise in advanced machine learning techniques, statistical analysis, and mathematical modeling.
- Proficiency in programming languages such as Python, R, or Scala, with experience in building scalable and efficient data science workflows.
- Deep experience with big data processing frameworks (e.g., Hadoop, Spark) and deep learning frameworks (e.g., TensorFlow, PyTorch).
- Strong leadership skills, with the ability to guide and inspire a team, prioritize tasks, and meet project goals.
- Excellent communication and presentation skills, with the ability to convey complex concepts to both technical and non-technical stakeholders.
- Proven experience in driving data-driven decision-making processes and influencing strategic initiatives.
- Deep experience with cloud platforms (e.g., AWS, Azure, GCP) and their data science tools and services.
- A passion for staying up-to-date with the latest advancements in data science and actively exploring new techniques and technologies.
Posted 1 week ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About the Client:
Our client is a global IT services company headquartered in Southborough, Massachusetts, USA. Founded in 1996, with revenue of $1.8B and 35,000+ associates worldwide, it specializes in digital engineering and IT services, helping clients modernize their technology infrastructure, adopt cloud and AI solutions, and accelerate innovation. It partners with major firms in banking, healthcare, telecom, and media. Our client is known for combining deep industry expertise with agile development practices, enabling scalable and cost-effective digital transformation. The company operates in over 50 locations across more than 25 countries, has delivery centers in Asia, Europe, and North America, and is backed by Baring Private Equity Asia.

Job Description:
- 6 to 10 years of experience in data engineering roles with a focus on building scalable data solutions.
- Proficiency in Python for ETL, data manipulation, and scripting.
- Hands-on experience with Snowflake or equivalent cloud-based data warehouses.
- Strong knowledge of orchestration tools such as Apache Airflow or similar.
- Expertise in implementing and managing messaging queues like Kafka, AWS SQS, or similar.
- Demonstrated ability to build and optimize data pipelines at scale, processing terabytes of data.
- Experience in data modeling, data warehousing, and database design.
- Proficiency in working with cloud platforms like AWS, Azure, or GCP.
- Strong understanding of CI/CD pipelines for data engineering workflows.
- Experience working in an Agile development environment, collaborating with cross-functional teams.

Preferred Skills:
- Familiarity with other programming languages like Scala or Java for data engineering tasks.
- Knowledge of containerization and orchestration technologies (Docker, Kubernetes).
- Experience with stream processing frameworks like Apache Flink.
- Experience with Apache Iceberg for data lake optimization and management.
- Exposure to machine learning workflows and integration with data pipelines.

Soft Skills:
- Strong problem-solving skills with a passion for solving complex data challenges.
- Excellent communication and collaboration skills to work with cross-functional teams.
- Ability to thrive in a fast-paced, innovative environment.

Job Title: Data Engineer
Key Skills: Python for ETL, Snowflake, Apache Airflow, Kafka, AWS SQS, CI/CD pipelines, Agile development environment, Apache Iceberg, Apache Flink
Job Locations: Any Virtusa
Experience: 6-8 Years
Education Qualification: Any Graduation
Work Mode: Hybrid
Employment Type: Contract
Notice Period: Immediate - 10 Days
Payroll: People Prime Worldwide
Posted 1 week ago
12.0 - 14.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Associate Director - AI/ML Engineering
What you will do
Let’s do this. Let’s change the world. We are seeking an Associate Director of ML/AI Engineering to lead Amgen India’s AI engineering practice. This role is integral to developing top-tier talent, setting ML/AI best practices, and evangelizing ML/AI engineering capabilities across the organization. The Associate Director will be responsible for driving the successful delivery of strategic business initiatives by overseeing the technical architecture, managing talent, and establishing a culture of excellence in ML/AI.
The key aspects of this role involve: (1) prior hands-on experience building ML and AI solutions, (2) management experience in leading an ML/AI engineering team and developing talent, and (3) delivering AI initiatives at enterprise scale.
Roles & Responsibilities:
Talent Growth & People Leadership: Lead, mentor, and manage a high-performing team of engineers, fostering an environment that encourages learning, collaboration, and innovation. Focus on nurturing future leaders and providing growth opportunities through coaching, training, and mentorship.
Recruitment & Team Expansion: Develop a comprehensive talent strategy that includes recruitment, retention, onboarding, and career development, and build a diverse and inclusive team that drives innovation, aligns with Amgen's culture and values, and delivers business priorities.
Organizational Leadership: Work closely with senior leaders within the function and across the Amgen India site to align engineering goals with broader organizational objectives, and demonstrate leadership by contributing to strategic discussions.
Create and implement a strategy for expanding the AI/ML engineering team, including recruitment, onboarding, and talent development.
Oversee the end-to-end lifecycle of AI/ML projects, from concept and design through to deployment and optimization, ensuring timely and successful delivery.
Ensure adoption of MLOps best practices, including model versioning, testing, deployment, and monitoring.
Collaborate with multi-functional teams, including product, data science, and software engineering, to find opportunities and deliver AI/ML solutions that drive business value.
Serve as an AI/ML evangelist across the organization, promoting awareness and understanding of the capabilities and value of AI/ML technologies.
Promote a culture of innovation and continuous learning within the team, encouraging the exploration of new tools, technologies, and methodologies.
Provide technical leadership and mentorship, guiding engineers in implementing scalable and robust AI/ML systems.
Work closely with collaborators to prioritize AI/ML projects and ensure timely delivery of key initiatives.
Lead innovation initiatives to explore new AI/ML technologies, platforms, and tools that can drive further advancements in the organization’s AI capabilities.
What we expect of you
We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications:
Master’s degree and 12 to 14 years of experience in computer science, artificial intelligence, or machine learning; OR
Bachelor’s degree and 14 to 18 years of experience in computer science, artificial intelligence, or machine learning; OR
Diploma and 18 to 20 years of experience in computer science, artificial intelligence, or machine learning.
Preferred Qualifications:
Experience in building AI platforms and applications at enterprise scale.
Expertise in AI/ML frameworks and libraries such as TensorFlow, PyTorch, Scikit-learn, etc.
Hands-on experience with LLMs, generative AI, and NLP (e.g., GPT, BERT, Llama, Claude, Mistral AI).
Strong understanding of MLOps processes and tools such as MLflow, Kubeflow, or similar platforms.
Proficient in programming languages such as Python, R, or Scala.
Experience deploying AI/ML models in cloud environments (AWS, Azure, or Google Cloud).
Proven track record of managing and delivering AI/ML projects at scale.
Excellent project management skills, with the ability to lead multi-functional teams and manage multiple priorities.
Experience in regulated industries, preferably life sciences and pharma.
Good-to-Have Skills:
Experience with natural language processing, computer vision, or reinforcement learning.
Knowledge of data governance, privacy regulations, and ethical AI considerations.
Experience with cloud-native AI/ML services (Databricks, AWS, Azure ML, Google AI Platform).
Experience with AI observability.
Professional Certifications (Preferred):
Google Professional Machine Learning Engineer, AWS Certified Machine Learning, Azure AI Engineer Associate, or Databricks Certified Generative AI Engineer Associate.
Soft Skills:
Excellent leadership and communication skills, with the ability to convey complex technical concepts to non-technical collaborators.
Ability to foster a collaborative and innovative work environment.
Strong problem-solving abilities and attention to detail.
High degree of initiative and self-motivation.
Ability to mentor and develop team members, promoting their growth and success.
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Location: Pune
About Team & About Role:
As a Senior Software Engineer (SSE) in the Continuous Product Development (CPD) team, you will play a key role in leading team(s) towards owning the roadmap, providing long-term stability, and delivering delight to our enterprise customers. You will work closely with leadership and multiple stakeholders from other engineering teams and the Product and Support organizations. You will be working across Rubrik releases on our on-premise data backup and SaaS offerings. You are expected to develop a strong understanding of our product and engineering architecture, such as our distributed job framework, data lifecycle management, filesystem, and metadata store.
We are seeking a highly skilled senior engineer to join our team. You will be responsible for developing and maintaining high-performance software applications. You should have strong programming and troubleshooting skills, excellent design skills, and an understanding of distributed systems. You should be able to work independently and as part of a team. An understanding of the storage domain is preferred, but not necessary.
Rubrik SSEs are self-starters, driven, and can manage themselves. We believe in giving engineers responsibility, not tasks. Our goal is to motivate and challenge you to do your best work by empowering you to make your own decisions. To do that, we have a very transparent structure and give people freedom to exercise their judgment, even in critical scenarios. This develops more capable engineers and keeps everyone engaged and happy, ultimately leading to customer delight.
Key Responsibilities:
Design, develop, and maintain high-quality software applications and libraries using the C++, Scala, and Go programming languages.
Troubleshoot complex software problems in a timely and accurate manner.
Collaborate with cross-functional teams to define, design, and ship new features.
Write and maintain technical documentation for software systems and applications.
Participate in code reviews and ensure adherence to coding standards.
Continuously improve software quality through process improvement initiatives.
Keep up-to-date with emerging trends in software development.
Requirements:
B.Tech/M.Tech.
Strong programming, problem-solving, and troubleshooting skills.
Language skills: C++, Scala/Java, or C/Go, with an understanding of OOP.
Excellent design skills.
Understanding of distributed systems and multi-threading/concurrency concepts.
Preferably, a good understanding of the storage domain.
Preferably, a strong background in the object-oriented paradigm.
Good knowledge of data structures, algorithms, and design patterns.
Good understanding of networking protocols and security concepts.
Good knowledge of software development methodologies, tools, and processes.
Strong communication skills and the ability to work in a team environment.
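For candidates preparing for the concurrency requirement, here is a minimal Scala sketch of fanning out independent work with Futures and aggregating the results. It is an illustrative exercise, not Rubrik's job framework; the checksum function is a hypothetical stand-in for real per-chunk work.

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

object ParallelChecksum {
  // Hypothetical stand-in for checksumming one data chunk on a worker
  def checksumChunk(chunkId: Int): Future[Long] = Future {
    chunkId.toString.foldLeft(17L)((acc, c) => acc * 31 + c)
  }

  def main(args: Array[String]): Unit = {
    // Fan out independent tasks, then combine the results when all complete
    val combined: Future[Long] =
      Future.sequence((1 to 8).map(checksumChunk)).map(_.sum)

    println(Await.result(combined, 5.seconds))
  }
}
```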
Posted 1 week ago
0.0 years
0 Lacs
Gurugram, Haryana
On-site
Gurgaon, Haryana, India
Job ID 767286
Join our Team
About this opportunity:
At Ericsson, we are offering a fantastic opportunity for a passionate and motivated Solution Architect to join our dynamic and diverse team. In this role, you will contribute to the design, construction, and management of Ericsson-based solutions. Familiarity with big data technologies, agile methodology, and practices constitutes an integral part of the role.
What you will do:
Manage the overall operations of multiple solutions deployed within the customer environment.
Engage customers to secure agreements on the proposed solutions.
Prepare technical presentations and proposals, and conduct walkthroughs with customers.
Lead the technical risk analysis and assist the Program Manager/Program Director in the overall risk analysis process.
Manage internal and external stakeholders to identify and bridge gaps.
Identify new business opportunities.
Lead the delivery team by assigning tasks and reviewing progress.
Lead User Acceptance Testing (UAT) for the customer.
Manage the L1, L2, L3, and CNS (Support) teams, as well as the customer's Operations and Maintenance (O&M) team.
Identify scope creep and change requests during the delivery phase.
Support pre-sales activities.
Prepare effort estimations.
Lead customer presentations and demonstrations.
Interface with third-party providers (3PP) and original equipment manufacturers (OEMs) to evaluate and integrate their solutions into Ericsson's offerings.
Act as a Solution Lifecycle Manager for the proposed or implemented solution.
Proactively develop competence in new solution areas within the domain and technologies.
Mentor solution integrators, developers, and system architects, providing a transparent and open environment for growth and development.
The skills you bring:
Experience in architecting large products, microservice architectures, and database models.
Strong experience in development within the NMS/EMS telecom domain.
Understanding of OSS/NMS-related standards.
Understanding of and experience in telecommunications technologies.
Experience in network management concepts, including inventory management, fault management, performance management, and configuration management.
Experience with network management protocols, including SNMP, XML, REST/JSON, TL1, and ASCII.
Experience in the software development life cycle.
Must be proficient in software architecture, application design, development, and implementation using the technologies below:
Programming & Scripting: Java, JavaScript, Shell, Python
Big Data: Apache Spark, Scala
Microservices
CI/CD
Containerization/Docker
Databases: Postgres, MySQL, Cassandra, MongoDB, Elasticsearch
Tools: Git, Maven, Gradle, Docker, Jenkins, JMeter, JIRA
Why join Ericsson?
At Ericsson, you’ll have an outstanding opportunity. The chance to use your skills and imagination to push the boundaries of what’s possible. To build solutions never seen before to some of the world’s toughest problems. You’ll be challenged, but you won’t be alone. You’ll be joining a team of diverse innovators, all driven to go beyond the status quo to craft what comes next.
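For the Spark-on-Scala requirement above, here is a minimal sketch of the kind of aggregation used in fault management: counting events by severity. The schema and values are illustrative, not any Ericsson system.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object FaultEventSummary {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("fault-summary").master("local[*]").getOrCreate()
    import spark.implicits._

    // Hypothetical fault-management events: (nodeId, severity)
    val events = Seq(
      ("node-1", "CRITICAL"), ("node-2", "MINOR"),
      ("node-1", "MAJOR"), ("node-3", "CRITICAL"))
      .toDF("nodeId", "severity")

    // Count events per severity, highest counts first
    events.groupBy("severity")
      .agg(count("*").as("events"))
      .orderBy(desc("events"))
      .show()

    spark.stop()
  }
}
```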
Posted 1 week ago
0.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Role: Senior Analyst
Experience: 3 to 5 years
Location: Bangalore
Job Description:
We are looking for an experienced AI/ML engineer who can execute projects end to end and take them through the production pipeline. The candidate is expected to lead the AI/ML work across multiple projects, working along with other ML engineers, ensuring clear understanding and translation of requirements for the team and proposing optimized solutions. He/she would also need to work on ad hoc, time-bound POCs. Capability building and team upskilling are also expected: we want the candidate to share knowledge with other team members and bring them up to the same level, which includes conducting learning sessions across teams and monitoring the progress of team members.
Job Responsibilities:
Design and develop various machine learning and deep learning models and systems for high-impact consumer applications ranging from predictive safety and content personalization to search, virtual assistants, and time series forecasting.
Work with a broad spectrum of state-of-the-art machine learning and deep learning technologies, in the areas of various machine learning problems such as multilingual text classification, language modelling, and multi-modal learning.
Create metrics and configure A/B testing to evaluate model performance offline and online to inform and convey our impact to diverse groups of stakeholders.
Analyse and produce insights from large amounts of dynamic structured and unstructured data using modern big data and streaming technologies.
Produce reusable code according to standard methodologies in Python, Scala, or Java.
Collaborate with cross-functional teams of technical and non-technical members in architecture, design, and code reviews.
Skills Required:
Python (very strong), Deep Learning, Image/Video Processing, Statistics, AWS, Gen AI, System Architecture, experience handling large-scale real-time data
Job Snapshot
Updated Date: 10-06-2025
Job ID: J_3722
Location: Bengaluru, Karnataka, India
Experience: 3 - 8 Years
Employee Type: Permanent
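The A/B testing responsibility above lends itself to a worked example. Below is a minimal Scala sketch of a two-proportion z-test for comparing conversion rates between a control and a variant; the counts are hypothetical sample values.

```scala
object AbTestZScore {
  /** Two-proportion z-test for conversion rates, using the pooled standard error. */
  def zScore(convA: Int, nA: Int, convB: Int, nB: Int): Double = {
    val pA = convA.toDouble / nA
    val pB = convB.toDouble / nB
    val pooled = (convA + convB).toDouble / (nA + nB)
    val se = math.sqrt(pooled * (1 - pooled) * (1.0 / nA + 1.0 / nB))
    (pB - pA) / se
  }

  def main(args: Array[String]): Unit = {
    // Hypothetical counts: control converts 480/10000, variant converts 540/10000
    val z = zScore(480, 10000, 540, 10000)
    println(f"z = $z%.2f") // |z| > 1.96 indicates significance at the 5% level (two-sided)
  }
}
```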
Posted 1 week ago
0.0 - 8.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Role: Assistant Manager
Experience: 5 to 8 years
Location: Bengaluru, Karnataka, India (BLR)
Job Description:
Design and develop various machine learning and deep learning models and systems for high-impact consumer applications ranging from predictive safety and content personalization to search, virtual assistants, and time series forecasting.
Work with a broad spectrum of state-of-the-art machine learning and deep learning technologies, in the areas of various machine learning problems such as multilingual text classification, language modelling, and multi-modal learning.
Create metrics and configure A/B testing to evaluate model performance offline and online to inform and convey our impact to diverse groups of stakeholders.
Analyse and produce insights from large amounts of dynamic structured and unstructured data using modern big data and streaming technologies.
Produce reusable code according to standard methodologies in Python, Scala, or Java.
Collaborate with cross-functional teams of technical and non-technical members in architecture, design, and code reviews.
Job Responsibilities:
Lead a dedicated team of software engineers to architect, build, deploy, and support best-in-class software services that are always available to provide the best gaming experience for our customers.
Build and develop a high-energy, committed, motivated engineering team focused on engineering and operational excellence to deliver awesome business results.
Collaborate across business units and product teams to develop and execute against the team's vision, strategy, and roadmap.
Use technical expertise and industry trends to influence software development standard methodologies.
Handle day-to-day activities of the engineering team using Agile/Scrum methodology.
Skills Required:
Data Science, Machine Learning, Artificial Intelligence, Statistics, Deep Learning
Job Snapshot
Updated Date: 10-06-2025
Job ID: J_3720
Location: Bengaluru, Karnataka, India
Experience: 5 - 8 Years
Employee Type: Permanent
Posted 1 week ago
0.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Role: Senior Analyst
Experience: 3 to 5 years
Location: Bangalore
Job Description:
Focus on ML model load testing and the creation of end-to-end (E2E) test cases.
Evaluate models’ scalability and latency by running suites of metrics under different RPS, and create and automate test cases for individual models, ensuring a smooth rollout of the models.
Enhance monitoring of model scalability, and handle incidents of increased error rates.
Collaborate with existing machine learning engineers, backend engineers, and QA test engineers from cross-functional teams.
Skills Required:
Databricks, MLflow, Seldon, Kubeflow, Tecton, Jenkins, AWS services; at least one of the programming languages among Java, Python, and Scala; ML load testing; job monitoring; evaluating scalability and latency of models; good communication skills; experience with production-level models and models handling high data volumes at low latency.
Job Snapshot
Updated Date: 10-06-2025
Job ID: J_3721
Location: Bengaluru, Karnataka, India
Experience: 3 - 5 Years
Employee Type: Permanent
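As a flavor of this kind of work, here is a minimal Scala sketch that fires concurrent requests at a model endpoint and reports a p95 latency. It is purely illustrative: the endpoint URL and request count are assumptions, and a real harness would use a dedicated tool such as JMeter or Gatling.

```scala
import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}
import scala.concurrent.{Await, Future}
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

object ModelLoadTest {
  def main(args: Array[String]): Unit = {
    val client = HttpClient.newHttpClient()
    // Hypothetical model-serving endpoint
    val request = HttpRequest.newBuilder(URI.create("http://localhost:8080/predict")).GET().build()

    def timedCall(): Future[Long] = Future {
      val start = System.nanoTime()
      client.send(request, HttpResponse.BodyHandlers.ofString())
      (System.nanoTime() - start) / 1000000 // latency in milliseconds
    }

    // Fire 100 concurrent requests and report the p95 latency
    val latencies = Await.result(Future.sequence(Seq.fill(100)(timedCall())), 2.minutes).sorted
    println(s"p95 latency: ${latencies((latencies.size * 95) / 100 - 1)} ms")
  }
}
```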
Posted 1 week ago
0.0 - 16.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Category: Software Development/Engineering
Main location: India, Karnataka, Bangalore
Position ID: J0625-0079
Employment Type: Full Time
Position Description:
Company Profile: Founded in 1976, CGI is among the largest independent IT and business consulting services firms in the world. With 94,000 consultants and professionals across the globe, CGI delivers an end-to-end portfolio of capabilities, from strategic IT and business consulting to systems integration, managed IT and business process services, and intellectual property solutions. CGI works with clients through a local relationship model complemented by a global delivery network that helps clients digitally transform their organizations and accelerate results. CGI Fiscal 2024 reported revenue is CA$14.68 billion, and CGI shares are listed on the TSX (GIB.A) and the NYSE (GIB). Learn more at cgi.com.
Position: Manager Consulting Expert - AI Architect
Experience: 13-16 years
Shift Timing: General Shift
Location: Bangalore
Education Qualification: Bachelor's degree in Computer Science or a related field, or higher, with a minimum of 13 years of relevant experience.
We are looking for an experienced and visionary AI Architect with a strong engineering background and hands-on implementation experience to lead the development and deployment of AI-powered solutions. The ideal candidate will have a minimum of 13-16 years of experience in software and AI systems design, including extensive exposure to large language models (LLMs), vector databases, and modern AI frameworks such as LangChain. This role requires a balance of strategic architectural planning and tactical engineering execution, working across teams to bring intelligent applications to life.
Your future duties and responsibilities:
Design robust, scalable architectures for AI/ML systems, including LLM-based and generative AI solutions.
Lead the implementation of AI features and services in enterprise-grade products with clear, maintainable code.
Develop solutions using LangChain, orchestration frameworks, and vector database technologies.
Collaborate with product managers, data scientists, ML engineers, and business stakeholders to gather requirements and translate them into technical designs.
Guide teams on best practices for AI system integration, deployment, and monitoring.
Define and implement architecture governance, patterns, and reusable frameworks for AI applications.
Stay current with emerging AI trends, tools, and methodologies to continuously enhance architecture strategy.
Oversee development of proof-of-concepts (PoCs) and minimum viable products (MVPs) to validate innovative ideas.
Ensure systems are secure, scalable, and high-performing in production environments.
Mentor junior engineers and architects to build strong AI and engineering capabilities within the team.
Required qualifications to be successful in this role:
Must-have skills:
13-16 years of overall experience in software development, with at least 5+ years in AI/ML system architecture and delivery.
Proven expertise in developing and deploying AI/ML models in production environments.
Deep knowledge of LLMs, LangChain, prompt engineering, RAG (retrieval-augmented generation), and vector search.
Strong programming and system design skills with a solid engineering foundation.
Exceptional ability to communicate complex concepts clearly to technical and non-technical stakeholders.
Experience with Agile methodologies and cross-functional team leadership.
Programming Languages: Python, Java, Scala, SQL
AI/ML Frameworks: LangChain, TensorFlow, PyTorch, Scikit-learn, Hugging Face Transformers
Data Processing: Apache Spark, Kafka, Pandas, Dask
Vector Stores & Retrieval Systems: FAISS, Pinecone, Weaviate, Chroma
Cloud Platforms: AWS (SageMaker, Lambda), Azure (ML Studio, OpenAI), Google Cloud AI
MLOps & DevOps: Docker, Kubernetes, MLflow, Kubeflow, Airflow, CI/CD tools (GitHub Actions, Jenkins)
Databases: PostgreSQL, MongoDB, Redis, BigQuery, Snowflake
Tools & Platforms: Databricks, Jupyter Notebooks, Git, Terraform
Good-to-have skills:
Solution engineering and implementation experience on AI projects.
Skills: AWS, Machine Learning, English, GitHub, Python, Jenkins, Kubernetes, Prometheus, Snowflake
What you can expect from us:
Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because…
You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction.
Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise.
You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons.
Come join our team, one of the largest IT and business consulting services firms in the world.
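To illustrate the retrieval step behind the RAG requirement above in library-agnostic terms, here is a minimal Scala sketch of cosine-similarity search over an in-memory embedding index. Production systems would use a vector store such as FAISS or Pinecone; the embeddings and document ids below are made up.

```scala
object VectorRetrieval {
  /** Cosine similarity between two equal-length embedding vectors. */
  def cosine(a: Array[Double], b: Array[Double]): Double = {
    val dot  = a.zip(b).map { case (x, y) => x * y }.sum
    val norm = math.sqrt(a.map(x => x * x).sum) * math.sqrt(b.map(x => x * x).sum)
    dot / norm
  }

  def main(args: Array[String]): Unit = {
    // Hypothetical 4-dimensional document embeddings keyed by document id
    val index = Map(
      "doc-1" -> Array(0.9, 0.1, 0.0, 0.2),
      "doc-2" -> Array(0.1, 0.8, 0.3, 0.0),
      "doc-3" -> Array(0.7, 0.2, 0.1, 0.3))

    val query = Array(0.8, 0.1, 0.1, 0.2)

    // Rank documents by similarity and keep the top 2 for the prompt context
    val topK = index.toSeq.sortBy { case (_, v) => -cosine(query, v) }.take(2)
    topK.foreach { case (id, v) => println(f"$id: ${cosine(query, v)}%.3f") }
  }
}
```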
Posted 1 week ago
Scala is a popular programming language that is widely used in India, especially in the tech industry. Job seekers looking for Scala opportunities can find a variety of roles across the country; as the listings above show, these span cities such as Pune, Hyderabad, Bengaluru, and Gurugram. These cities are known for their thriving tech ecosystems and have a high demand for Scala professionals.
The salary range for Scala professionals in India varies based on experience levels. Entry-level Scala developers can expect to earn around INR 6-8 lakhs per annum, while experienced professionals with 5+ years of experience can earn upwards of INR 15 lakhs per annum.
In the Scala job market, a typical career path may look like:
- Junior Developer
- Scala Developer
- Senior Developer
- Tech Lead
As professionals gain more experience and expertise in Scala, they can progress to higher roles with increased responsibilities.
In addition to Scala expertise, employers often look for candidates with the following skills:
- Java
- Spark
- Akka
- Play Framework
- Functional programming concepts
Having a good understanding of these related skills can enhance a candidate's profile and increase their chances of landing a Scala job.
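As a quick illustration of the functional programming style interviewers often probe, here is a small, self-contained Scala snippet using immutable collections and higher-order functions; the salary figures are sample values echoing the ranges above.

```scala
object FunctionalBasics {
  def main(args: Array[String]): Unit = {
    val salaries = List(6.5, 8.0, 12.0, 15.5) // hypothetical salaries in INR lakhs

    // Immutable transformation pipeline: filter, map, fold
    val seniorTotal = salaries
      .filter(_ >= 8.0)     // keep senior-range salaries
      .map(_ * 100000)      // lakhs -> rupees
      .foldLeft(0.0)(_ + _) // sum

    println(f"Total senior payroll: INR $seniorTotal%.0f")
  }
}
```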
Interviews for Scala roles typically cover core language features along with the related skills listed above, such as Akka, Spark, and functional programming concepts.
As you explore Scala jobs in India, remember to showcase your expertise in Scala and related skills during interviews. Prepare well, stay confident, and you'll be on your way to a successful career in Scala. Good luck!