
6093 Scala Jobs - Page 32

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging to ensure the applications function as intended, while continuously seeking opportunities for improvement and efficiency in your work.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Actively participate in and contribute to team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-have: Proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with programming languages such as Python or Scala.
- Knowledge of data visualization techniques and tools.

Additional Information:
- The candidate should have a minimum of 3 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Coimbatore office.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Bhubaneswar, Odisha, India

On-site

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging to ensure the applications function as intended, while continuously seeking opportunities for improvement and efficiency in your work.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Actively participate in and contribute to team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-have: Proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with programming languages such as Python or Scala.
- Knowledge of data visualization techniques and tools.

Additional Information:
- The candidate should have a minimum of 3 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Bhubaneswar office.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Core Responsibilities
- Design and optimize batch/streaming data pipelines using Scala, Spark, and Kafka
- Implement real-time tokenization/cleansing microservices in Java
- Manage production workflows via Apache Airflow (batch scheduling)
- Conduct root-cause analysis of data incidents using Spark/Dynatrace logs
- Monitor EMR clusters and optimize performance via YARN/Dynatrace metrics
- Ensure data security through HashiCorp Vault (Transform Secrets Engine)
- Validate data integrity and configure alerting systems

Technical Requirements
- Programming: Scala (Spark batch/streaming), Java (real-time microservices)
- Big Data Systems: Apache Spark, EMR, HDFS, YARN resource management
- Cloud & Storage: Amazon S3, EKS
- Security: HashiCorp Vault, tokenization vs. encryption (FPE)
- Orchestration: Apache Airflow (batch scheduling)
- Operational Excellence: Spark log analysis, Dynatrace monitoring, incident handling, data validation

Mandatory Competencies
- Expertise in distributed data processing (Spark on EMR/Hadoop)
- Proficiency in shell scripting and YARN job management
- Ability to implement format-preserving encryption (tokenization solutions)
- Experience with production troubleshooting (executor logs, metrics, RCA)

Benefits
- Insurance: Family Term Insurance
- PF
- Paid Time Off: 20 days
- Holidays: 10 days
- Flexi timing
- Competitive salary
- Diverse & inclusive workspace
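Purely as an illustration of the kind of Scala/Spark/Kafka streaming pipeline this role describes (not part of the posting; the broker address, topic, and paths are invented, and a Spark 3.x cluster with the Kafka connector is assumed), a minimal Structured Streaming job might look like this sketch:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object EventPipelineSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("kafka-event-pipeline")
      .getOrCreate()
    import spark.implicits._

    // Read a stream of raw events from Kafka (broker and topic names are hypothetical)
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "events")
      .load()
      .selectExpr("CAST(value AS STRING) AS json")

    // A simple cleansing step: drop empty payloads before downstream processing
    val cleansed = events.filter(length(trim($"json")) > 0)

    // Write to a Parquet sink; the checkpoint location lets Spark track progress
    cleansed.writeStream
      .format("parquet")
      .option("path", "/data/events")
      .option("checkpointLocation", "/checkpoints/events")
      .start()
      .awaitTermination()
  }
}
```

In a real deployment the cleansing step would typically call out to the tokenization service mentioned in the posting; this sketch only shows the pipeline skeleton.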

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Greater Kolkata Area

Remote

About Tala
Tala is on a mission to unleash the economic power of the Global Majority – the 4 billion people overlooked by existing financial systems. With nearly half a billion dollars raised in equity and debt, we are serving millions of customers across three continents. Tala has been named to the Fortune Impact 20 list, CNBC's Disruptor 50 five years in a row, CNBC's World's Top Fintech Company, Forbes' Fintech 50 list for eight years running, and Chief's The New Era of Leadership Award. We are expanding across product offerings, countries, and crypto, and are looking for people who have an entrepreneurial spirit and are passionate about our mission. By creating a unique platform that enables lending and other financial services around the globe, people in emerging markets are able to start and expand small businesses, manage day-to-day needs, and pursue their financial goals with confidence. Currently, over nine million people across Kenya, the Philippines, Mexico, and India have used Tala products. Due to our global team, we have a remote-first approach, and also have offices in Santa Monica, CA (HQ); Nairobi, Kenya; Mexico City, Mexico; Manila, the Philippines; and Bangalore, India. Most Talazens join us because they connect with our mission. If you are energized by the impact you can make at Tala, we'd love to hear from you!

The Role
We are looking for a Senior Software Development Engineer in Test (Sr. SDET) who is passionate about using cutting-edge test automation tools and technologies to create robust test frameworks and test suites in order to speed up the product delivery process without compromising quality and performance. This hands-on, roll-up-your-sleeves role will ensure quality control across Tala's backend services and mobile applications in multiple markets.

What You'll Do
- Develop test automation suites to expand test coverage across all our microservices and integrate them into the regression and release process
- Support the team for all releases and communicate issues to product in a timely manner
- Deliver tools to help test integration with third-party services (payment rails, SMS/email, KYC (know your customer), analytics, etc.)
- Integrate automated test execution into the software development process
- Collaborate with the Squad Lead and other key stakeholders to strategize rollout
- Support and collaborate with cross-functional teams (Product, Data, Credit, and Business Development) to identify automation areas that further increase the quality of our product
- Maintain, review, propose, and implement improvements to existing frameworks, tools, and processes
- Continuously learn new tools and help the team optimize the test framework

What You'll Need
- 7+ years of experience building test tools and frameworks using Groovy/Kotlin/Java/Scala, Jenkins or similar CI/CD tools, and Linux/macOS, for either backend or frontend services
- Experience with the BDD testing paradigm and one of the behavior-driven testing frameworks such as Spock, JBehave, or Cucumber
- Knowledge of microservices and automation
- Experience working in an Agile development process (Scrum/Kanban)
- Experience with one or more version control systems such as GitHub or GitLab

Our vision is to build a new financial ecosystem where everyone can participate on equal footing and access the tools they need to be financially healthy. We strongly believe that inclusion fosters innovation, and we're proud to have a diverse global team that represents a multitude of backgrounds, cultures, and experiences. We hire talented people regardless of race, religion, color, national origin, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status.

Posted 2 weeks ago

Apply

10.0 - 18.0 years

20 - 35 Lacs

Chennai

Work from Office

We're a technology company laser-focused on improving how people learn the language and behavioural skills they need to thrive in intercultural working environments. Our global presence in 14 countries across three continents and a staff of 30 nationalities help us find solutions for enterprises to break down cultural barriers and unleash their international teams' full potential. We are looking for an Engineering Manager in Chennai at the earliest possible starting date. We're looking for a highly motivated self-starter who loves ownership and responsibility while working in a collaborative and interdependent team environment.

What you will do here:
- Work with multiple product development teams of engineers to design, develop, and test products and components using an agile Scrum methodology
- Create and provide innovative solutions that meet not only functional requirements but also performance, scalability, and reliability requirements
- Continue to build an effective development team: set goals, mentor, and conduct performance reviews of team members
- Deliver quality applications on time and on budget
- Manage and execute against project plans and delivery commitments
- Follow software engineering best practices, audit the process, and improve the standards of the practices
- Build, guide, and coach the Scrum team on using Agile practices and principles to deliver high-quality products; facilitate and support all Scrum events
- Ensure security, availability, resilience, and scalability of solutions developed by the teams
- Drive and manage the bug triage process, represent the development team in project meetings to ensure an efficient testing and bug-fixing process, and be an effective advocate for the development group

What we're looking for:
- Experience leading a team of 10 or more engineers
- 5+ years of experience with open-source technologies like PHP or Python
- 5+ years of experience with relational DBs like SQL Server or MySQL
- Hands-on experience with Docker or Kubernetes
- Experience with one of the PHP frameworks: Symfony, Zend, Drupal, or Magento
- Experience working with frontend frameworks like React JS or Angular JS
- Experience with RESTful services, microservice architecture, and serverless architecture
- Experience working with cloud environments like AWS or Azure
- Experience working within an Agile/Scrum and CI/CD environment
- Experience with version control using GitLab or GitHub
- Experience designing new systems, or redesigning existing systems, to meet business requirements, changing needs, or newer technology
- Experience with or knowledge of one or more front-end frameworks is a strong plus
- Experience with or knowledge of NoSQL databases like MongoDB is a plus
- Experience with or knowledge of AI/machine learning is a plus

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description

Required Skills:
- 3+ years of hands-on experience in data modeling, ETL processes, developing reporting systems, and data engineering (e.g., ETL, BigQuery, SQL, Python, or Alteryx)
- Advanced knowledge of SQL programming and database management
- 3+ years of solid experience with one or more business intelligence reporting tools such as Power BI, Qlik Sense, Looker, or Tableau
- Knowledge of data warehousing concepts and best practices
- Excellent problem-solving and analytical skills
- Detail-oriented with strong communication and collaboration skills
- Ability to work independently and as part of a team

Preferred Skills:
- Experience with GCP cloud services including BigQuery, Cloud Composer, Dataflow, CloudSQL, Looker and LookML, Data Studio, and Qlik Sense on GCP
- Strong SQL skills and experience with various BI/reporting tools to build self-serve reports, analytic dashboards, and ad-hoc packages leveraging our enterprise data warehouse
- 1+ year of experience with Python
- 1+ year of experience with Hive/Spark/Scala/JavaScript
- Strong experience consuming data models, developing SQL, addressing data quality issues, proposing BI solution architecture, and articulating best practices in end-user visualizations
- Development delivery experience
- Solid understanding of BI tools, architectures, and visualization solutions
- Inquisitive, proactive, and interested in learning new tools and techniques
- Strong oral, written, and interpersonal communication skills
- Comfortable working in a dynamic environment where problems are not always well-defined

Responsibilities:
- Develop and maintain data pipelines, reporting, and dashboards using SQL and business intelligence reporting tools such as Power BI, Qlik Sense, and Looker
- Develop and execute database queries by applying advanced knowledge of SQL and experience working with relational databases and Google BigQuery
- Collaborate with stakeholders to define requirements from problem statements and develop data-driven insights
- Perform data validation and code review to assure data accuracy and data quality/integrity across all systems
- Perform root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement

Qualifications:
- Bachelor's degree in Computer Science, Computer Information Systems, or a related field
- 3+ years of hands-on experience in data modeling, ETL processes, developing reporting systems, and data engineering (e.g., ETL, BigQuery, SQL, Python, or Alteryx)
- Advanced knowledge of SQL programming and database management
- 3+ years of solid experience with one or more business intelligence reporting tools such as Power BI, Qlik Sense, Looker, or Tableau
- Knowledge of data warehousing concepts and best practices
- Excellent problem-solving and analytical skills
- Detail-oriented with strong communication and collaboration skills
- Ability to work independently and as part of a team

Posted 2 weeks ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

Remote

About Us
We are an innovative AI SaaS venture that develops cutting-edge AI solutions and provides expert consulting services. Our mission is to empower businesses with state-of-the-art AI technologies and data-driven insights. We're seeking a talented Data Engineer to join our team and help drive our product development and consulting initiatives.

Job Overview
For our Q4 2025 and 2026+ ambitions, we are looking for a motivated Data Engineering Intern (Azure). You will assist in building and maintaining foundational data pipelines and architectures under the guidance of senior team members. This role focuses on learning Azure tools (ADF, Databricks, PySpark, Scala, Python), supporting data ingestion/transformation workflows, and contributing to scalable solutions for AI-driven projects.

Tasks:
- Develop basic data pipelines using Azure Data Factory, Azure Synapse Analytics, or Azure Databricks
- Assist in ingesting structured/semi-structured data from sources (e.g., APIs, databases, files) into Azure Data Lake Storage (ADLS)
- Write simple SQL queries and scripts for data transformation and validation
- Write simple PySpark, Scala, and Python code as required
- Monitor pipeline performance and troubleshoot basic issues
- Collaborate with AI/ML teams to prepare datasets for model training
- Document workflows and adhere to data governance standards

Preferred Qualifications:
- Basic knowledge of AI/ML concepts
- Bachelor's degree in any stream (Engineering, Science, or Commerce)
- Basic understanding of Azure services (Data Factory, Synapse, ADLS, SQL Database, Databricks, Azure ML)
- Familiarity with SQL, Python, PySpark, or Scala for scripting
- Exposure to data modeling and ETL/ELT processes
- Ability to work in Agile/Scrum teams

What We Offer:
- Cutting-edge technology: work on cutting-edge AI projects and shape the future of data visualization
- Rapid growth: be part of a high-growth startup with opportunities for career advancement
- Impactful work: see your contributions make a real difference in how businesses operate
- Collaborative culture: join a diverse team of brilliant minds from around the world
- Flexible work environment: enjoy remote work options and a healthy work-life balance
- Competitive compensation as per market

We're excited to welcome passionate, driven individuals who are eager to learn and grow with our team. If you're ready to gain hands-on experience, contribute to meaningful projects, and take the next step in your professional journey, we encourage you to apply. We look forward to exploring the possibility of having you onboard. Follow us for more updates: https://www.linkedin.com/company/ingeniusai/posts/

Posted 2 weeks ago

Apply

5.0 - 10.0 years

13 - 18 Lacs

Pune

Work from Office

Software Developers collaborate with Business and Quality Analysts, Designers, Project Managers, and more to design software solutions that will create meaningful change for our clients. They listen thoughtfully to understand the context of a business problem and write clean, iterative code to deliver a powerful end result, whilst consistently advocating for better engineering practices. By balancing strong opinions with a willingness to find the right answer, Software Developers bring integrity to technology, ensuring all voices are heard. For a team to thrive, it needs collaboration and room for healthy, respectful debate. Developers are the technologists who cultivate this environment while driving teams toward delivering on an aspirational tech vision and acting as mentors for more junior-level consultants. You will leverage deep technical knowledge to solve complex business problems and proactively assess your team's health, code quality, and nonfunctional requirements.

Job responsibilities:
- You will learn and adopt best practices like writing clean, reusable code using TDD, pair programming, and design patterns
- You will use and advocate for continuous delivery practices to deliver high-quality software, as well as value to end customers, as early as possible
- You will work in collaborative, value-driven teams to build innovative customer experiences for our clients
- You will create large-scale distributed systems out of microservices
- You will collaborate with a variety of teammates to build features, design concepts, and interactive prototypes, and ensure best practices and UX specifications are embedded along the way
- You will apply the latest technology thinking to solve client problems
- You will efficiently utilize DevSecOps tools and practices to build and deploy software, advocating DevOps culture and shifting security left in development
- You will oversee or take part in the entire cycle of software consulting and delivery, from ideation to deployment and everything in between
- You will act as a mentor for less-experienced peers through both your technical knowledge and leadership skills

Job qualifications

Technical Skills:
We are looking for an experienced Scala Developer with 5+ years of expertise in building scalable data processing solutions.
- Excellent Scala and Apache Spark development skills
- Experience with HDFS, Hive, and Impala
- Proficiency in OOP, design patterns, and coding best practices
- Experience building real-time analytics applications, microservices, and ETL pipelines
- You are comfortable with Agile methodologies, such as Extreme Programming (XP), Scrum, and/or Kanban
- You have a good awareness of TDD, continuous integration, and continuous delivery approaches/tools
- Bonus points if you have working knowledge of cloud technology such as AWS, Azure, Kubernetes, and Docker

Professional Skills:
- You enjoy influencing others and always advocate for technical excellence while being open to change when needed
- Presence in the external tech community: you willingly share your expertise with others via speaking engagements, contributions to open source, blogs, and more
- You're resilient in ambiguous situations and can approach challenges from multiple perspectives
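As a purely illustrative sketch of the Scala/Spark ETL work this posting names (not part of the listing; the table and column names are invented, and a cluster with Hive support is assumed), a small batch ETL step might look like:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object OrdersEtlSketch {
  def main(args: Array[String]): Unit = {
    // Assumes a Spark cluster configured with Hive support
    val spark = SparkSession.builder()
      .appName("orders-etl")
      .enableHiveSupport()
      .getOrCreate()

    // Extract: read raw orders from a Hive table (name is hypothetical)
    val raw = spark.table("raw.orders")

    // Transform: drop invalid rows, then aggregate revenue and order counts per day
    val daily = raw
      .filter(col("amount") > 0)
      .groupBy(to_date(col("created_at")).as("day"))
      .agg(sum("amount").as("revenue"), count("*").as("orders"))

    // Load: overwrite the reporting table with the fresh aggregates
    daily.write.mode("overwrite").saveAsTable("reporting.daily_revenue")

    spark.stop()
  }
}
```

In practice this kind of job would sit behind tests and a CI pipeline, in line with the TDD and continuous delivery practices the posting emphasizes.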

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Kochi

Remote

Senior Data Engineer (Databricks) – Remote
Location: Remote (Portugal)
Type: Contract
Experience: 5+ years
Language: Fluent English required

We are looking for a Senior Data Engineer to join our remote consulting team. In this role, you'll be responsible for designing, building, and optimizing large-scale data processing systems using Databricks and modern data engineering tools. You'll collaborate closely with data scientists, analysts, and technical teams to deliver scalable and reliable data platforms.

Key Responsibilities:
- Design, develop, and maintain robust data pipelines for processing structured/unstructured data
- Build and manage data lakes and data warehouses optimized for analytics
- Optimize data workflows for performance, scalability, and cost-efficiency
- Collaborate with stakeholders to gather data requirements and translate them into scalable solutions
- Implement data governance, data quality, and security best practices
- Migrate legacy data processes (e.g., from SAS) to modern platforms
- Document architecture, data models, and pipelines

Required Qualifications:
- 5+ years of experience in data engineering or related fields
- 3+ years of hands-on experience with Databricks
- Strong command of SQL and experience with PostgreSQL, MySQL, or NoSQL databases
- Programming experience in Python, Java, or Scala
- Experience with ETL processes, orchestration frameworks, and data pipeline automation
- Familiarity with Spark, Kafka, or similar big data tools
- Experience working on cloud platforms (AWS, Azure, or GCP)
- Prior experience migrating from SAS is a plus
- Excellent communication skills in English

Posted 2 weeks ago

Apply

5.0 - 9.0 years

13 - 18 Lacs

Bengaluru

Work from Office

Senior Software Developers collaborate with Business and Quality Analysts, Designers, Project Managers, and more to design software solutions that will create meaningful change for our clients. They listen thoughtfully to understand the context of a business problem and write clean, iterative code to deliver a powerful end result, whilst consistently advocating for better engineering practices. By balancing strong opinions with a willingness to find the right answer, Senior Software Developers bring integrity to technology, ensuring all voices are heard. For a team to thrive, it needs collaboration and room for healthy, respectful debate. Senior Developers are the technologists who cultivate this environment while driving teams toward delivering on an aspirational tech vision and acting as mentors for more junior-level consultants. You will leverage deep technical knowledge to solve complex business problems and proactively assess your team's health, code quality, and nonfunctional requirements.

Job responsibilities:
- You will learn and adopt best practices like writing clean, reusable code using TDD, pair programming, and design patterns
- You will use and advocate for continuous delivery practices to deliver high-quality software, as well as value to end customers, as early as possible
- You will work in collaborative, value-driven teams to build innovative customer experiences for our clients
- You will create large-scale distributed systems out of microservices
- You will collaborate with a variety of teammates to build features, design concepts, and interactive prototypes, and ensure best practices and UX specifications are embedded along the way
- You will apply the latest technology thinking to solve client problems
- You will efficiently utilize DevSecOps tools and practices to build and deploy software, advocating DevOps culture and shifting security left in development
- You will oversee or take part in the entire cycle of software consulting and delivery, from ideation to deployment and everything in between
- You will act as a mentor for less-experienced peers through both your technical knowledge and leadership skills

Job qualifications

Technical Skills:
- You have experience using Scala and one more programming language (Golang, C#.NET, Node.js, or Python), with experience in object-oriented programming
- You can skillfully write high-quality, well-tested code
- You are comfortable with Agile methodologies, such as Extreme Programming (XP), Scrum, and/or Kanban
- You have a good awareness of TDD, continuous integration, and continuous delivery approaches/tools
- Bonus points if you have working knowledge of cloud technology such as AWS, Azure, Kubernetes, and Docker

Professional Skills:
- You enjoy influencing others and always advocate for technical excellence while being open to change when needed
- Presence in the external tech community: you willingly share your expertise with others via speaking engagements, contributions to open source, blogs, and more
- You're resilient in ambiguous situations and can approach challenges from multiple perspectives.

Posted 2 weeks ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Pune

Work from Office

We are looking for a skilled and experienced Data Scientist to analyze large amounts of raw information to find patterns and insights. Your primary responsibility will be to use data to develop solutions that will help drive business decisions and strategies.

Responsibilities:
- Analyze large amounts of information to discover trends and patterns
- Build predictive models and machine-learning algorithms using frameworks such as TensorFlow, PyTorch, and scikit-learn
- Utilize Apache Spark for large-scale data processing and analytics
- Combine models through ensemble modeling
- Present information using data visualization techniques
- Propose solutions and strategies to business challenges
- Collaborate with engineering and product development teams

Requirements:
- Proven experience as a Data Scientist or Data Analyst
- Experience in data mining and large-scale data processing
- Understanding of machine learning and operations research
- Proficiency in programming languages such as Python and R, with experience in ML frameworks like TensorFlow, PyTorch, and scikit-learn
- Strong experience with Apache Spark
- Knowledge of SQL; familiarity with Scala or Java is an asset
- Experience using business intelligence tools (e.g., Tableau) and data frameworks (e.g., Spark)
- Analytical mind and business acumen
- Strong math skills (e.g., statistics, algebra)
- Excellent communication skills
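To illustrate the Spark-plus-ML stack this posting names (purely a sketch, not part of the listing; the data path, feature columns, and label are invented, and a Spark cluster with MLlib is assumed), a minimal model fit in Scala might look like:

```scala
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.feature.VectorAssembler
import org.apache.spark.sql.SparkSession

object ChurnModelSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("churn-sketch").getOrCreate()

    // Hypothetical training data: numeric feature columns plus a 0/1 label
    val df = spark.read.parquet("/data/churn_features")

    // MLlib expects features packed into a single vector column
    val assembler = new VectorAssembler()
      .setInputCols(Array("tenure", "monthly_spend"))
      .setOutputCol("features")

    // Fit a simple logistic regression on the assembled features
    val lr = new LogisticRegression()
      .setLabelCol("churned")
      .setFeaturesCol("features")
    val model = lr.fit(assembler.transform(df))

    println(s"coefficients: ${model.coefficients}")
    spark.stop()
  }
}
```

Ensembling, as mentioned in the responsibilities, would combine several such models; this sketch shows only a single fit.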

Posted 2 weeks ago

Apply

8.0 - 13.0 years

12 - 16 Lacs

Gurugram

Remote

The Data Engineer is responsible for managing and operating Tableau, Tableau Bridge server, Databricks, dbt, SSRS, SSIS, AWS DWS, AWS AppFlow, and Power BI. The engineer will work closely with the customer and team to manage and operate the cloud data platform.

Job Description
Provides Level 3 operational coverage:
- Troubleshooting incidents/problems: collecting logs, cross-checking against known issues, investigating common root causes (for example failed batches, or infra-related items such as connectivity to source and network issues)
- Knowledge management: create/update runbooks as needed / entitlements
- Governance: watch all configuration changes to batches and infrastructure (cloud platform), mapping them to proper documentation and aligning resources
- Communication: lead and act as a POC for the customer from off-site, handling communication and escalation, isolating issues, and coordinating off-site resources while level-setting expectations across stakeholders
- Change management: align resources for on-demand changes and coordinate with stakeholders as required
- Request management: handle user requests; if a request is not runbook-based, create a new KB article or update the runbook accordingly
- Incident and problem management: root cause analysis, devising preventive measures and recommendations such as enhanced monitoring or systemic changes as needed

Skills
- Good hands-on experience with Tableau, Tableau Bridge server, Databricks, dbt, SSRS, SSIS, AWS DWS, AWS AppFlow, and Power BI
- Ability to read and write SQL and stored procedures
- Good hands-on experience in configuring, managing, and troubleshooting, along with general analytical and problem-solving skills
- Excellent written and verbal communication skills; ability to communicate technical information and ideas so others will understand
- Ability to successfully work and promote inclusiveness in small groups

Job Complexity: This role requires extensive problem-solving skills and the ability to research an issue, determine the root cause, and implement the resolution; research of various sources such as Databricks/AWS/Tableau documentation may be required to identify and resolve issues. Must have the ability to prioritize issues and multi-task.

Experience/Education: Requires a Bachelor's degree in computer science or another related field, plus 8+ years of hands-on experience configuring and managing AWS/Tableau and Databricks solutions. Experience with Databricks and Tableau environments is desired.

Posted 2 weeks ago

Apply

7.0 - 12.0 years

14 - 18 Lacs

Gurugram, Bengaluru

Work from Office

Summary
The Data Engineer is responsible for managing and operating Databricks, dbt, SSRS, SSIS, AWS DWS, AWS AppFlow, and Power BI/Tableau. The engineer will work closely with the customer and team to manage and operate the cloud data platform.

Job Description
Leads Level 4 operational coverage:
- Resolving pipeline issues, proactive monitoring of sensitive batches, and RCA and retrospection of issues, including documenting defects
- Design, build, test, and deploy fixes to the non-production environment for customer testing
- Work with the customer to deploy fixes to production upon receiving customer acceptance of the fix
- Cost/performance optimization and audit/security, including any associated infrastructure changes
- Troubleshooting incidents/problems: collecting logs, cross-checking against known issues, investigating common root causes (for example failed batches, or infra-related items such as connectivity to source and network issues)
- Knowledge management: create/update runbooks as needed
- Governance: watch all configuration changes to batches and infrastructure (cloud platform), mapping them to proper documentation and aligning resources
- Communication: lead and act as a POC for the customer from off-site, handling communication and escalation, isolating issues, and coordinating off-site resources while level-setting expectations across stakeholders
- Change management: align resources for on-demand changes and coordinate with stakeholders as required
- Request management: handle user requests; if a request is not runbook-based, create a new KB article or update the runbook accordingly
- Incident and problem management: root cause analysis, devising preventive measures and recommendations such as enhanced monitoring or systemic changes as needed

Skills
- Good hands-on experience with Databricks, dbt, SSRS, SSIS, AWS DWS, AWS AppFlow, and Power BI/Tableau
- Ability to read and write SQL and stored procedures
- Good hands-on experience in configuring, managing, and troubleshooting, along with general analytical and problem-solving skills
- Excellent written and verbal communication skills; ability to communicate technical information and ideas so others will understand
- Ability to successfully work and promote inclusiveness in small groups

Experience/Education: Requires a Bachelor's degree in computer science or another related field, plus 10+ years of hands-on experience configuring and managing AWS/Tableau and Databricks solutions. Experience with Databricks and Tableau environments is desired.

Job Complexity: This role requires extensive problem-solving skills and the ability to research an issue, determine the root cause, and implement the resolution; research of various sources such as Databricks/AWS/Tableau documentation may be required to identify and resolve issues. Must have the ability to prioritize issues and multi-task.

Work Location: Remote (work from home)
Shift: Rotational shifts (24/7)

Posted 2 weeks ago

Apply

6.0 - 11.0 years

18 - 22 Lacs

Hyderabad, Gurugram, Bengaluru

Work from Office

Overview
We are seeking an experienced Data Architect with extensive expertise in designing and implementing modern data architectures. This role requires strong software engineering principles, hands-on coding abilities, and experience building data engineering frameworks. The ideal candidate will have a proven track record of implementing Databricks-based solutions in the healthcare industry, with expertise in data catalog implementation and governance frameworks.
About the Role
As a Data Architect, you will be responsible for designing and implementing scalable, secure, and efficient data architectures on the Databricks platform. You will lead the technical design of data migration initiatives from legacy systems to modern Lakehouse architecture, ensuring alignment with business requirements, industry best practices, and regulatory compliance.
Key Responsibilities
- Design and implement modern data architectures using the Databricks Lakehouse platform
- Lead the technical design of Data Warehouse/Data Lake migration initiatives from legacy systems
- Develop data engineering frameworks and reusable components to accelerate delivery
- Establish CI/CD pipelines and infrastructure-as-code practices for data solutions
- Implement data catalog solutions and governance frameworks
- Create technical specifications and architecture documentation
- Provide technical leadership to data engineering teams
- Collaborate with cross-functional teams to ensure alignment of data solutions
- Evaluate and recommend technologies, tools, and approaches for data initiatives
- Ensure data architectures meet security, compliance, and performance requirements
- Mentor junior team members on data architecture best practices
- Stay current with emerging technologies and industry trends
Qualifications
- Extensive experience in data architecture design and implementation
- Strong software engineering background with expertise in Python or Scala
- Proven experience building data engineering frameworks and reusable components
- Experience implementing CI/CD pipelines for data solutions
- Expertise in infrastructure-as-code and automation
- Experience implementing data catalog solutions and governance frameworks
- Deep understanding of the Databricks platform and Lakehouse architecture
- Experience migrating workloads from legacy systems to modern data platforms
- Strong knowledge of healthcare data requirements and regulations
- Experience with cloud platforms (AWS, Azure, GCP) and their data services
- Bachelor's degree in Computer Science, Information Systems, or a related field; advanced degree preferred
Technical Skills
- Programming languages: Python and/or Scala (required)
- Data processing frameworks: Apache Spark, Delta Lake
- CI/CD tools: Jenkins, GitHub Actions, Azure DevOps
- Infrastructure-as-code (optional): Terraform, CloudFormation, Pulumi
- Data catalog tools: Databricks Unity Catalog, Collibra, Alation
- Data governance frameworks and methodologies
- Data modeling and design patterns
- API design and development
- Cloud platforms: AWS, Azure, GCP
- Container technologies: Docker, Kubernetes
- Version control systems: Git
- SQL and NoSQL databases
- Data quality and testing frameworks
Optional - Healthcare Industry Knowledge
- Healthcare data standards (HL7, FHIR, etc.)
- Clinical and operational data models
- Healthcare interoperability requirements
- Healthcare analytics use cases
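The technical skills above list data quality and testing frameworks alongside governance. As a hypothetical sketch, not drawn from any specific framework (every function, rule, and field name here is invented for illustration), a minimal rule-based batch quality check might look like:

```python
# Hypothetical sketch of a rule-based data-quality check, the kind of
# component a data engineering framework might standardize. All names
# are illustrative, not from any specific library.

def check_batch(records, rules):
    """Apply named rules to each record; return failing record indexes per rule."""
    failures = {name: [] for name in rules}
    for i, rec in enumerate(records):
        for name, rule in rules.items():
            if not rule(rec):
                failures[name].append(i)
    # Keep only rules that actually failed somewhere.
    return {name: idxs for name, idxs in failures.items() if idxs}

rules = {
    "patient_id_present": lambda r: bool(r.get("patient_id")),
    "age_in_range": lambda r: 0 <= r.get("age", -1) <= 120,
}

batch = [
    {"patient_id": "p1", "age": 42},
    {"patient_id": "", "age": 35},
    {"patient_id": "p3", "age": 150},
]

print(check_batch(batch, rules))
# {'patient_id_present': [1], 'age_in_range': [2]}
```

The point of factoring rules out as named predicates is that the same check runner can be reused across pipelines, which is what "reusable components" in the posting typically amounts to in practice.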

Posted 2 weeks ago

Apply

5.0 - 10.0 years

19 - 25 Lacs

Hyderabad, Gurugram, Bengaluru

Work from Office

As a full-spectrum integrator, we assist hundreds of companies to realize the value, efficiency, and productivity of the cloud. We take customers on their journey to enable, operate, and innovate using cloud technologies, from migration strategy to operational excellence and immersive transformation. If you like a challenge, you'll love it here, because we're solving complex business problems every day, building and promoting great technology solutions that impact our customers' success. The best part is, we're committed to you and your growth, both professionally and personally. You will be part of a team designing, automating, and deploying services on behalf of our customers to the cloud in a way that allows these services to automatically heal themselves if things go south. We have deep experience applying cloud architecture techniques in virtually every industry. Every week is different and the problems you will be challenged to solve are constantly evolving. We build solutions using infrastructure-as-code so our customers can refine and reuse these processes again and again - all without having to come back to us for additional deployments.
Key Responsibilities: Create well-designed, documented, and tested software features that meet customer requirements. Identify and address product bugs, deficiencies, and performance bottlenecks. Participate in an agile delivery team, helping to ensure the technical quality of the features delivered across the team, including documentation, testing strategies, and code. Help determine technical feasibility and solutions for business requirements. Remain up-to-date on emerging technologies and architectures and propose ways to use them in current and upcoming projects. Leverage technical knowledge to cut scope while maintaining or achieving the overall goals of the product. Leverage technical knowledge to improve the quality and efficiency of product applications and tools.
Willingness to travel to client locations and deliver professional services.
Qualifications: Experience developing software in GCP, AWS, or Azure. 5+ years of experience developing applications in Java. 3+ years required with at least one other programming language such as Scala, Python, Go, C#, TypeScript, or Ruby. Experience with relational databases, including designing complex schemas and queries. Experience developing within distributed systems or a microservice-based architecture. Strong verbal and written communication skills for documenting workflows, tools, or complex areas of a codebase. Ability to thrive in a fast-paced environment and multi-task efficiently. Strong analytical and troubleshooting skills. 3+ years of experience as a technical specialist in customer-facing roles. Experience with Agile development methodologies. Experience with Continuous Integration and Continuous Delivery (CI/CD).
Preferred Qualifications: Experience with GCP. Building applications using container and serverless technologies. Cloud certifications. Good exposure to Agile software development and DevOps practices such as Infrastructure as Code (IaC), continuous integration, and automated deployment. Exposure to Continuous Integration (CI) tools (e.g. Jenkins). Strong practical application development experience on Linux- and Windows-based systems. Experience working directly with customers, partners, or third-party developers.
Location: Remote, Bangalore, Gurgaon, Hyderabad

Posted 2 weeks ago

Apply

6.0 - 10.0 years

13 - 17 Lacs

Chennai

Work from Office

Capgemini Invent
Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design, to help CxOs envision and build what's next for their businesses.
Your role: You act as a contact person for our customers and advise them on data-driven projects. You are responsible for architecture topics and solution scenarios in the areas of Cloud Data Analytics Platform, Data Engineering, Analytics and Reporting. Experience in cloud and big data architecture. Responsibility for designing viable architectures based on Microsoft Azure, AWS, Snowflake, Google (or similar) and implementing analytics. Experience in DevOps, Infrastructure as Code, DataOps, MLOps. Experience in business development (as well as your support in the proposal process). Data warehousing, data modelling and data integration for enterprise data environments. Experience in the design of large-scale ETL solutions integrating multiple/heterogeneous systems. Experience in data analysis, modelling (logical and physical data models) and design specific to a data warehouse/Business Intelligence environment (normalized and multi-dimensional modelling). Experience with ETL tools, primarily Talend and/or other data integration tools (open source/proprietary); extensive experience with SQL and SQL scripting (PL/SQL and SQL query tuning and optimization) for relational databases such as PostgreSQL, Oracle, Microsoft SQL Server and MySQL, and with NoSQL and document-based databases such as MongoDB. Must be detail-oriented, highly motivated and able to work independently with minimal direction. Excellent written, oral and interpersonal communication skills with the ability to communicate design solutions to both technical and non-technical audiences. Ideally, experience in agile methods such as SAFe, Scrum, etc.
Ideally, experience with programming languages like Python, JavaScript, Java/Scala, etc.
Your Profile: Provides data services for enterprise information strategy solutions. Works with business solutions leaders and teams to collect and translate information requirements into data to develop data-centric solutions. Design and develop modern enterprise data-centric solutions (e.g. DWH, Data Lake, Data Lakehouse). Responsible for designing data governance solutions.
What you will love about working here: We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.
About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong heritage of over 55 years, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.

Posted 2 weeks ago

Apply

6.0 - 8.0 years

22 - 25 Lacs

Chennai

Work from Office

As a Senior Software Engineer on our team, you will design, develop, and maintain features of the Ally product. You'll also communicate and partner cross-functionally with teams in product and software development. In this role, you will work on an ethical product, using Scala for the backend and JavaScript for the frontend. We run our applications in the AWS cloud and use Git for version control. You'll work on a distributed team, collaborating with colleagues around the globe.
Required skills/qualifications: 6-8 years of relevant experience. Frontend development in Angular/TypeScript. Backend development in Scala, Java, C#, or another object-oriented programming language. Willingness to break things and make them work again. Familiarity with the full-cycle development process. Experience developing, building, testing, deploying, and operating applications. Fluency in written and spoken English.
Preferred skills/qualifications: Familiarity with cloud technologies. Functional programming experience, such as Haskell or Scala.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Pune

Work from Office

Roles and Responsibilities: Design, develop, and implement scalable Kafka infrastructure solutions. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and maintain technical documentation for Kafka infrastructure projects. Troubleshoot and resolve complex issues related to Kafka infrastructure. Ensure compliance with industry standards and best practices for Kafka infrastructure. Participate in code reviews and contribute to the improvement of overall code quality.
Job Requirements: Strong understanding of Kafka architecture and design principles. Experience with Kafka tools such as Kafka Streams, KSQL, and SCADA. Proficiency in programming languages such as Java, Python, or Scala. Excellent problem-solving skills and attention to detail. Ability to work collaboratively in a team environment. Strong communication and interpersonal skills.
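The job requirements above ask for a strong understanding of Kafka architecture and design principles. One such principle is key-based partitioning: records with the same key are routed to the same partition of a topic, which is what preserves per-key ordering. The sketch below illustrates only the idea; real Kafka clients hash the serialized key with murmur2, and `partition_for` here is an invented stand-in that uses CRC-32 purely because it is deterministic and built in.

```python
import zlib

# Illustrative key-based partitioner: same key -> same partition,
# which is what gives Kafka its per-key ordering guarantee.
# (Real clients use murmur2 over the serialized key, not CRC-32.)

def partition_for(key: bytes, num_partitions: int) -> int:
    return zlib.crc32(key) % num_partitions

# Every record with key b"user-42" lands on one fixed partition...
p = partition_for(b"user-42", 6)
assert all(partition_for(b"user-42", 6) == p for _ in range(100))

# ...while different keys spread across the partition range.
keys = [f"user-{i}".encode() for i in range(1000)]
print(sorted({partition_for(k, 6) for k in keys}))
```

This is also why changing the partition count of a live topic breaks the key-to-partition mapping: the modulus changes, so existing keys start landing elsewhere.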

Posted 2 weeks ago

Apply

4.0 - 7.0 years

25 - 30 Lacs

Ahmedabad

Work from Office

ManekTech is looking for Data Engineer to join our dynamic team and embark on a rewarding career journey Liaising with coworkers and clients to elucidate the requirements for each task. Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed. Reformulating existing frameworks to optimize their functioning. Testing such structures to ensure that they are fit for use. Preparing raw data for manipulation by data scientists. Detecting and correcting errors in your work. Ensuring that your work remains backed up and readily accessible to relevant coworkers. Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs.

Posted 2 weeks ago

Apply

10.0 - 15.0 years

9 - 13 Lacs

Gurugram

Work from Office

About the Role: Grade Level (for internal use): 10
Role: Sr. React Full-stack Developer
The Team: C&RS (Credit & Risk Solutions) is part of the Market Intelligence group within S&P Global. Financial Risk Analytics (FRA) delivers information-centric capital markets and risk solutions for trading desks and their risk business partners, supporting risk and regulatory compliance. The UI products cover counterparty credit risk, xVA, and market risk for both buy-side and sell-side firms. We are currently investing in our technology and data platform to develop a number of new revenue-generating products, leveraging open-source, big data, and cloud technologies. This role is for a software developer within the FRA software engineering team, building React (TypeScript) UI applications and services and working with databases/cloud.
Responsibilities: Design and implement UI applications and services. Participate in system architecture and design decisions. Continuously improve development and testing best practices. Interpret and analyze business use cases and translate feature requests into technical designs and development tasks. Take ownership of development tasks and participate in regular design and code review meetings. Be delivery-focused and keen to participate in the successful implementation and evolution of technology products in close coordination with product managers and colleagues.
Basic Qualifications: Bachelor's degree in Computer Science, Applied Mathematics, Engineering, or a related discipline, or equivalent experience. 10+ years of strong software development experience. React, TypeScript/JS (ES6). Node.js (Express). Experience with SQL relational databases such as PostgreSQL. Demonstrable experience using RESTful APIs in a production setting. Test frameworks (e.g.
Jest, Jasmine, Playwright). Understanding of CI/CD pipelines. Linux/Unix, Git. Agile and XP (Scrum, Kanban, TDD).
Desirable: Highcharts, DevExtreme, TanStack React components, Bootstrap, HTML5. Understanding and implementation of security and data protection. GitLab, containerization platforms. AWS: CLI, CloudFront, Cognito, S3. Python, Java/Scala.
What's in it for you: You can effectively manage timelines and enjoy working within a team. You can follow relevant technology trends, actively evaluate new technologies, and use this information to improve the product. You get a lot of satisfaction from on-time delivery. Happy clients are important to you. You take pride in your work.
Competencies: You love to solve complex problems, whether that's making the user experience as responsive as possible or understanding complex client requirements. You can confidently present your own ideas and solutions, as well as guide technical discussions. Your welcoming attitude encourages people to approach you when they have a problem you can help them solve.
About S&P Global Market Intelligence: At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep, and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.
What's In It For You: Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow.
At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.
Our People: Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.
Health & Wellness: Health care coverage designed for the mind and body.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family-Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference. For more information on benefits by country, visit https://spgbenefits.com/benefit-summaries
Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domain, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training, or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.
----
Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

Posted 2 weeks ago

Apply

4.0 - 7.0 years

6 - 10 Lacs

Hyderabad, Gurugram, Ahmedabad

Work from Office

About the Role: Grade Level (for internal use): 10
The Team: Do you love to collaborate and provide solutions? This team comes together across eight different locations every single day to craft enterprise-grade applications that serve a large customer base with growing demand and usage. You will use a wide range of technologies and cultivate a collaborative environment with other internal teams.
The Impact: We focus primarily on developing, enhancing, and delivering required pieces of information and functionality to internal and external clients in all client-facing applications. You will have a highly visible role where even small changes have very wide impact.
What's in it for you? Opportunities for innovation and learning new state-of-the-art technologies, and working in pure Agile and Scrum methodology.
Responsibilities: Design and implement software-related projects. Perform analyses and articulate solutions. Design underlying engineering for use in multiple product offerings supporting a large volume of end-users. Develop project plans with task breakdowns and estimates. Manage and improve existing solutions. Solve a variety of complex problems and figure out possible solutions, weighing the costs and benefits.
What We're Looking For: Basic Qualifications: Bachelor's degree in Computer Science or equivalent. 7+ years of related experience. Passionate, smart, and articulate developer. Strong C#, .NET, and SQL skills. Experience implementing web services (with WCF, RESTful JSON, SOAP, TCP), Windows services, and unit tests. Dependency injection. Able to demonstrate strong OOP skills. Able to work well individually and with a team. Strong problem-solving skills. Good work ethic, self-starter, and results-oriented. Agile/Scrum experience a plus. Exposure to data engineering and big data technologies like Hadoop, big data processing engines/Scala, NiFi, and ETL is a plus. Experience with container platforms is a plus. Experience working in cloud computing environments like AWS, Azure, GCP, etc.

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Teamwork makes the stream work. Roku is changing how the world watches TV. Roku is the #1 TV streaming platform in the U.S., Canada, and Mexico, and we've set our sights on powering every television in the world. Roku pioneered streaming to the TV. Our mission is to be the TV streaming platform that connects the entire TV ecosystem. We connect consumers to the content they love, enable content publishers to build and monetize large audiences, and provide advertisers unique capabilities to engage consumers. From your first day at Roku, you'll make a valuable - and valued - contribution. We're a fast-growing public company where no one is a bystander. We offer you the opportunity to delight millions of TV streamers around the world while gaining meaningful experience across a variety of disciplines.
About the Team: Roku pioneered TV streaming and continues to innovate and lead the industry. Continued success relies on investing in the Roku Content Platform, so we deliver a high-quality streaming TV experience at a global scale. As part of our Content Platform team, you join a small group of highly skilled engineers who own significant responsibility in crafting, developing and maintaining our large-scale backend systems, data pipelines, storage, and processing services. We provide all insights regarding content on Roku devices.
About the Role: We are looking for a Senior Software Engineer with vast experience in backend development, data engineering and data analytics to focus on building a next-level content platform and data intelligence, which empowers Search, Recommendations, and many more critical systems across the Roku platform. This is an excellent role for a senior professional who enjoys a high level of visibility, thrives on having a critical business impact, is able to make critical decisions and is excited to work on a core data platform component which is crucial for many streaming components at Roku.
What You’ll Be Doing: Work closely with the product management team, content data platform services, and other internal consumer teams to contribute extensively to our content data platform and underlying architecture. Build low-latency, optimized streaming and batch data pipelines to enable downstream services. Build and support our microservices-based, event-driven backend systems and data platform. Design and build data pipelines for batch, near-real-time, and real-time processing. Participate in architecture discussions, influence the product roadmap, and take ownership and responsibility over new projects.
We’re excited if you have: 8+ years of professional experience as a Software Engineer. Proficiency in Java/Scala/Python. Deep understanding of backend technologies, architecture patterns, and best practices, including microservices, RESTful APIs, message queues, caching, and databases. Strong analytical and problem-solving skills, data structures and algorithms, with the ability to translate complex technical requirements into scalable and efficient solutions. Experience with microservice and event-driven architectures. Experience with Apache Spark and Apache Flink. Experience with big data frameworks and tools: MapReduce, Hive, Presto, HDFS, YARN, Kafka, etc. Experience with Apache Airflow or similar workflow orchestration tooling for ETL. Experience with cloud platforms: AWS (preferred), GCP, etc. Strong communication and presentation skills. BS in Computer Science; MS in Computer Science preferred. AI literacy and curiosity: you have either tried Gen AI in your previous work or outside of work, or are curious about Gen AI and have explored it.
Benefits: Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive benefits include global access to mental health and financial wellness support and resources.
Local benefits include statutory and voluntary benefits which may include healthcare (medical, dental, and vision), life, accident, disability, commuter, and retirement options (401(k)/pension). Our employees can take time off work for vacation and other personal reasons to balance their evolving work and life needs. It's important to note that not every benefit is available in all locations or for every role. For details specific to your location, please consult with your recruiter. The Roku Culture Roku is a great place for people who want to work in a fast-paced environment where everyone is focused on the company's success rather than their own. We try to surround ourselves with people who are great at their jobs, who are easy to work with, and who keep their egos in check. We appreciate a sense of humor. We believe a fewer number of very talented folks can do more for less cost than a larger number of less talented teams. We're independent thinkers with big ideas who act boldly, move fast and accomplish extraordinary things through collaboration and trust. In short, at Roku you'll be part of a company that's changing how the world watches TV. We have a unique culture that we are proud of. We think of ourselves primarily as problem-solvers, which itself is a two-part idea. We come up with the solution, but the solution isn't real until it is built and delivered to the customer. That penchant for action gives us a pragmatic approach to innovation, one that has served us well since 2002. To learn more about Roku, our global footprint, and how we've grown, visit https://www.weareroku.com/factsheet. By providing your information, you acknowledge that you have read our Applicant Privacy Notice and authorize Roku to process your data subject to those terms.
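The role above centers on batch, near-real-time, and real-time data pipelines. As a toy illustration of what a real-time aggregation computes (pure Python, using none of Spark's or Flink's actual APIs; the event names are invented), a tumbling-window count groups timestamped events into fixed, non-overlapping windows:

```python
from collections import defaultdict

# Toy tumbling-window aggregation: count events per (window, key).
# Real engines (Spark Structured Streaming, Flink) add incremental
# state, watermarks, and late-data handling; this shows only the
# windowing arithmetic itself.

def tumbling_counts(events, window_sec):
    """events: iterable of (epoch_seconds, key) pairs."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_sec)  # align to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

events = [
    (100, "play"), (105, "play"), (119, "pause"),
    (120, "play"), (131, "play"),
]
print(tumbling_counts(events, window_sec=60))
# {(60, 'play'): 2, (60, 'pause'): 1, (120, 'play'): 2}
```

The same computation run over a bounded file is the batch case; run over an unbounded stream with periodic window emission, it is the near-real-time case, which is why the posting treats the two as facets of one pipeline skill.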

Posted 2 weeks ago

Apply

1.0 - 4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Looking for candidates with 1 to 4 years of experience in analytics, preferably from a retail, banking, insurance, or investment background.

What we’re looking for:
- 1 to 4 years of experience in analytics, preferably in retail, banking, insurance, or investment management.
- Proficiency in SQL for large-scale data processing and analysis (mandatory).
- Proficiency in Python/Scala with experience in API development (mandatory).
- Expertise with big data technologies, including Spark, Data Lake, and Delta Lake, is a plus.
- Strong quantitative and problem-solving skills, with the ability to translate complex data into actionable insights and to convey technical concepts effectively to non-technical audiences.
- Ability to work independently and collaboratively in a fast-paced, dynamic environment.
- Professional certifications such as CFA, FRM, or CPA are a plus.
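The role above treats SQL for large-scale aggregation as mandatory. A minimal, self-contained illustration of the query shape involved, using Python's built-in sqlite3 module and a hypothetical trades table (the same GROUP BY pattern scales to billions of rows on a warehouse or Spark SQL engine):

```python
import sqlite3

# Toy portfolio table; the schema is made up for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (desk TEXT, notional REAL)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?)",
    [("rates", 100.0), ("rates", 250.0), ("fx", 75.0)],
)

# Aggregate notional per desk, largest first.
rows = conn.execute(
    "SELECT desk, SUM(notional) AS total "
    "FROM trades GROUP BY desk ORDER BY total DESC"
).fetchall()
print(rows)  # -> [('rates', 350.0), ('fx', 75.0)]
```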

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Morgan Stanley Model Validation - Vice President

Profile Description
We’re seeking someone to join our team as a Vice President for model validation in XVA pricing and/or IMM capital.

Firm Risk Management
In the Firm Risk Management division, we advise businesses across the Firm on risk mitigation strategies, develop tools to analyze and monitor risks, and lead key regulatory initiatives.

Company Profile
Morgan Stanley is an industry leader in financial services, known for mobilizing capital to help governments, corporations, institutions, and individuals around the world achieve their financial goals. At Morgan Stanley India, we support the Firm’s global businesses, with critical presence across Institutional Securities, Wealth Management, and Investment Management, as well as in the Firm’s infrastructure functions of Technology, Operations, Finance, Risk Management, Legal, and Corporate & Enterprise Services. Morgan Stanley has been rooted in India since 1993, with campuses in both Mumbai and Bengaluru. We empower our multi-faceted and talented teams to advance their careers and make a global impact on the business. For those who show passion and grit in their work, there’s ample opportunity to move across the businesses. Interested in joining a team that’s eager to create, innovate, and make an impact on the world?
Read on…

Primary Responsibilities
What you’ll do in the role:
- Provide independent review and validation of XVA pricing and/or IMM capital models, compliant with MRM policies and procedures, regulatory guidance, and industry-leading practices, including evaluating conceptual soundness, quality of model methodology, model limitations, data quality, and ongoing monitoring of model performance.
- Take initiative and responsibility for end-to-end delivery of a stream of model validation and related risk management deliverables.
- Write model review findings in validation documents that can be used for presentations internally (to model developers, business unit managers, Audit, and various global committees).
- Verbally communicate results and debate issues, challenges, and methodologies with internal audiences, including senior management.
- Represent the MRM team in interactions with regulatory and audit agencies as and when required.
- Follow financial markets and business trends on a frequent basis to enhance the quality of model validation and related risk management deliverables.

What You’ll Bring To The Role
Qualifications and skills required (essential / preferred):
- Master’s or doctorate degree in a quantitative discipline such as Statistics, Mathematics, Physics, Computer Science, or Engineering (essential).
- Experience in a quant role in model validation, model development, or a technical role in a financial institution, e.g. developer (essential).
- Strong written and verbal communication skills, including debating different viewpoints and making formal presentations of complex topics to a wider audience (preferred).
- 5+ years of relevant work experience in a model validation role at a bank or financial institution.
- Proficient programmer in Python; knowledge of other programming languages like R, Scala, MATLAB, etc. (preferred).
- Willingness to learn new and complex topics and to adapt oneself (continuous learning) (preferred).
- Working knowledge of statistical techniques, quantitative finance, and programming (essential); good understanding of various complex financial instruments (preferred).
- Knowledge of popular machine learning techniques (preferred).
- Relevant professional certifications such as CQF or CFA, or progress made towards them (preferred).
- Desire to work in a dynamic, team-oriented, fast-paced environment focused on challenging tasks mixing fundamental, quantitative, and market-oriented knowledge and skills (essential).

What You Can Expect From Morgan Stanley
We are committed to maintaining the first-class service and high standard of excellence that have defined Morgan Stanley for over 89 years. Our values - putting clients first, doing the right thing, leading with exceptional ideas, committing to diversity and inclusion, and giving back - aren’t just beliefs; they guide the decisions we make every day to do what's best for our clients, communities, and more than 80,000 employees in 1,200 offices across 42 countries. At Morgan Stanley, you’ll find an opportunity to work alongside the best and the brightest, in an environment where you are supported and empowered. Our teams are relentless collaborators and creative thinkers, fueled by their diverse backgrounds and experiences. We are proud to support our employees and their families at every point along their work-life journey, offering some of the most attractive and comprehensive employee benefits and perks in the industry. There’s also ample opportunity to move about the business for those who show passion and grit in their work. To learn more about our offices across the globe, please visit https://www.morganstanley.com/about-us/global-offices. Morgan Stanley is an equal opportunities employer.
We work to provide a supportive and inclusive environment where all individuals can maximize their full potential. Our skilled and creative workforce is comprised of individuals drawn from a broad cross section of the global communities in which we operate and who reflect a variety of backgrounds, talents, perspectives, and experiences. Our strong commitment to a culture of inclusion is evident through our constant focus on recruiting, developing, and advancing individuals based on their skills and talents.
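The XVA and IMM-capital models referenced in the role above rest on simulated exposure profiles. A toy sketch of that idea in plain Python, assuming geometric Brownian motion and a hypothetical forward-like payoff; real IMM/XVA engines add netting, collateral, calibrated dynamics, and far more paths.

```python
import math
import random

def expected_exposure(s0, mu, sigma, strike, horizon, steps, paths, seed=7):
    """Monte Carlo expected positive exposure, E[max(S_t - K, 0)],
    at each time step -- a toy version of the simulation step
    inside counterparty-exposure models."""
    random.seed(seed)
    dt = horizon / steps
    profile = [0.0] * steps
    for _ in range(paths):
        s = s0
        for t in range(steps):
            # Evolve the underlying one step under GBM.
            z = random.gauss(0.0, 1.0)
            s *= math.exp((mu - 0.5 * sigma ** 2) * dt
                          + sigma * math.sqrt(dt) * z)
            # Accumulate the positive exposure, averaged over paths.
            profile[t] += max(s - strike, 0.0) / paths
    return profile

# Hypothetical parameters: at-the-money, 20% vol, monthly steps over 1y.
ee = expected_exposure(100.0, 0.01, 0.2, 100.0, 1.0, 12, 2000)
```

A validator would then probe exactly the dimensions the listing names: conceptual soundness of the dynamics, convergence in `paths` and `steps`, and behavior in limiting cases.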

Posted 2 weeks ago

Apply

8.0 - 13.0 years

16 - 22 Lacs

Hyderabad

Work from Office

Looking for a Data Engineer with 8+ years of experience to build scalable data pipelines on AWS/Azure, work with big data tools (Spark, Kafka), and support analytics teams. Must have strong coding skills in Python/Java and experience with SQL/NoSQL and cloud platforms.

Required candidate profile:
- Strong experience in Java/Scala/Python.
- Worked with big data tech: Spark, Kafka, Flink, etc.
- Built real-time and batch data pipelines.
- Cloud: AWS, Azure, or GCP.
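The real-time side of pipelines like those above follows a producer/consumer pattern over a durable log. A minimal in-process stand-in using Python's queue module, with made-up events; Kafka plus a Spark or Flink job provides the durable, distributed version of the same pattern.

```python
import queue
import threading

# In-memory stand-in for a Kafka topic: producers append events,
# a consumer thread aggregates them into running totals.
topic = queue.Queue()
totals = {}
SENTINEL = None  # end-of-stream marker

def consumer():
    while True:
        event = topic.get()
        if event is SENTINEL:
            break
        user, amount = event
        totals[user] = totals.get(user, 0) + amount

worker = threading.Thread(target=consumer)
worker.start()

# Produce a few hypothetical (user, amount) events.
for event in [("a", 5), ("b", 3), ("a", 2)]:
    topic.put(event)
topic.put(SENTINEL)
worker.join()
print(totals)  # -> {'a': 7, 'b': 3}
```

The batch half of such a pipeline typically replays the same aggregation over historical data, which is why the listing asks for both real-time and batch experience.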

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies