
6107 Scala Jobs - Page 41

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Skills: Microservices, Spring Boot, Multithreading, Core Java, SQL, Messaging Queue, Kafka, Coding

Position 1: Java Developer
Experience: 6+ years
Locations: Noida, Gurgaon, Bangalore
Notice Period: Immediate to 15 days

Requirements:
- 6+ years of hands-on experience in Java.
- Experience in building order and execution management or trading systems is required; financial experience and exposure to trading is expected.
- In-depth understanding of concurrent programming and experience in designing high-throughput, highly available, fault-tolerant distributed applications is required.
- Experience in building distributed applications using NoSQL technologies like Cassandra, coordination services like ZooKeeper, and caching technologies like Apache Ignite and Redis is strongly preferred.
- Experience in building a microservices architecture / SOA is required.
- Experience with message-oriented streaming middleware is preferred (Kafka, MQ, NATS, AMPS).
- Experience with orchestration, containerization, and building cloud-native applications (AWS, Azure) is a plus.
- Experience with modern web technologies such as Angular, React, and TypeScript is a plus.
- Strong analytical and software architecture design skills with an emphasis on test-driven development.
- Experience in programming languages such as Scala or Python would be a plus.
- Experience using project management methodologies such as Agile/Scrum.
- Effective communication and presentation skills (written and verbal) are required.
- Bachelor's or Master's degree in Computer Science or Engineering.
- Good communication skills.

Kindly share the skill matrix below for the Java Developer role along with the profile: Core Java (mandatory), Multithreading, Spring, Messaging Queue, Microservices, SQL (mandatory), Coding.
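For orientation, a minimal Scala sketch of the kind of Kafka messaging code this role involves, using the standard kafka-clients producer API; the broker address, topic name and payload are illustrative placeholders, not details from the posting:

```scala
import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerConfig, ProducerRecord}
import org.apache.kafka.common.serialization.StringSerializer

object OrderEventProducer {
  def main(args: Array[String]): Unit = {
    val props = new Properties()
    // Broker address is a placeholder for illustration.
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, classOf[StringSerializer].getName)
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, classOf[StringSerializer].getName)
    // acks=all trades a little latency for durability, a common choice in trading pipelines.
    props.put(ProducerConfig.ACKS_CONFIG, "all")

    val producer = new KafkaProducer[String, String](props)
    try {
      // Hypothetical topic and message for the example.
      val record = new ProducerRecord[String, String]("orders", "order-42", """{"symbol":"INFY","qty":100}""")
      // send() is asynchronous; the callback reports the partition/offset or an error.
      producer.send(record, (metadata, exception) => {
        if (exception != null) exception.printStackTrace()
        else println(s"Sent to ${metadata.topic}-${metadata.partition}@${metadata.offset}")
      })
    } finally {
      producer.flush()
      producer.close()
    }
  }
}
```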

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

As a skilled Data Engineer with 7-10 years of experience, you will be a valuable addition to our dynamic team in India. Your primary focus will involve designing and optimizing data pipelines to efficiently handle large datasets and extract valuable business insights.

Your responsibilities will include designing, building, and maintaining scalable data pipelines and architecture. You will be expected to develop and enhance ETL processes for data ingestion and transformation, collaborating closely with data scientists and analysts to meet data requirements and deliver effective solutions. Monitoring data integrity through data quality checks and ensuring compliance with data governance and security policies will also be part of your role. Leveraging cloud-based data technologies and services for storage and processing will be crucial to your success in this position.

To excel in this role, you should hold a Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Proficiency in SQL and practical experience with databases such as MySQL, PostgreSQL, or Oracle is essential. Your expertise in programming languages like Python, Java, or Scala will be highly valuable, along with hands-on experience in big data technologies like Hadoop, Spark, or Kafka. Familiarity with cloud platforms such as AWS, Azure, or Google Cloud is preferred. Understanding data warehousing concepts and tools such as Redshift and Snowflake, coupled with experience in data modeling and architecture design, will further strengthen your candidacy.
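As an illustration of the pipeline work described above, a minimal batch ETL sketch in Spark with Scala; the bucket paths and column names are assumptions for the example, not the employer's actual schema:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object SalesEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("sales-etl")
      .getOrCreate()

    // Ingest: paths and schema are illustrative placeholders.
    val raw = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("s3a://raw-bucket/sales/*.csv")

    // Transform: drop malformed rows and derive a partition column.
    val cleaned = raw
      .filter(col("order_id").isNotNull && col("amount") > 0)
      .withColumn("order_date", to_date(col("order_ts")))

    // Load: write partitioned Parquet for downstream analysts.
    cleaned.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3a://curated-bucket/sales/")

    spark.stop()
  }
}
```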

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

The ideal candidate, ready to join immediately, can share their details via email to nitin.patil@ust.com for quick processing. Act fast for immediate consideration!

With over 5 years of experience, you will be responsible for designing, developing, and maintaining scalable data pipelines using Spark (PySpark or Spark with Scala). You will also build data ingestion and transformation frameworks for structured and unstructured data sources. Collaboration with data analysts, data scientists, and business stakeholders to understand requirements and deliver reliable data solutions will be a key aspect of the role. Working with large volumes of data, ensuring quality, integrity, and consistency, and optimizing data workflows for performance, scalability, and cost efficiency on cloud platforms (AWS, Azure, or GCP) are essential responsibilities. Additionally, implementing data quality checks and automation for ETL/ELT pipelines, monitoring and troubleshooting data issues in production, and performing root cause analysis will be part of your duties; a sketch of such checks follows below. You will also be expected to document technical processes, system designs, and operational procedures.

Must-Have Skills:
- Minimum 3 years of experience as a Data Engineer or in a similar role.
- Hands-on experience with PySpark or Spark using Scala.
- Strong knowledge of SQL for data querying and transformation.
- Experience working with any cloud platform (AWS, Azure, or GCP).
- Solid understanding of data warehousing concepts and big data architecture.
- Familiarity with version control systems like Git.

Good-to-Have Skills:
- Experience with data orchestration tools such as Apache Airflow, Databricks Workflows, or similar.
- Knowledge of Delta Lake, HDFS, or Kafka.
- Familiarity with containerization tools like Docker/Kubernetes.
- Exposure to CI/CD practices and DevOps principles.
- Understanding of data governance, security, and compliance standards.
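The data-quality checks mentioned above might look like the following minimal Spark/Scala sketch; the column names, key column and input path are hypothetical:

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._

object QualityChecks {
  // Fails fast when a required column has nulls or a key column has duplicates.
  // Column names here are illustrative; a real pipeline would read them from config.
  def assertQuality(df: DataFrame, keyCol: String, requiredCols: Seq[String]): Unit = {
    requiredCols.foreach { c =>
      val nulls = df.filter(col(c).isNull).count()
      require(nulls == 0, s"Column $c has $nulls null values")
    }
    val dupes = df.groupBy(col(keyCol)).count().filter(col("count") > 1).count()
    require(dupes == 0, s"Key $keyCol has $dupes duplicated values")
  }

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("dq-checks").getOrCreate()
    val orders = spark.read.parquet("s3a://curated-bucket/orders/") // placeholder path
    assertQuality(orders, keyCol = "order_id", requiredCols = Seq("order_id", "customer_id"))
    spark.stop()
  }
}
```

In practice a check like this runs as a gate between pipeline stages, so bad batches never reach the curated layer.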

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

The ideal candidate for this position should have at least 5 years of experience and must be ready to join immediately.

In this role, you will be responsible for designing, developing, and maintaining scalable data pipelines using Spark, specifically PySpark or Spark with Scala. You will also build data ingestion and transformation frameworks for structured and unstructured data sources. Collaboration with data analysts, data scientists, and business stakeholders to understand requirements and deliver reliable data solutions is a key aspect of this role. Working with large volumes of data to ensure quality, integrity, and consistency is crucial. Additionally, optimizing data workflows for performance, scalability, and cost efficiency on cloud platforms such as AWS, Azure, or GCP is a significant part of the responsibilities. Implementing data quality checks and automation for ETL/ELT pipelines, monitoring and troubleshooting data issues in production, and performing root cause analysis are also essential tasks, as is documentation of technical processes, system designs, and operational procedures.

Must-Have Skills:
- At least 3 years of experience as a Data Engineer or in a similar role
- Hands-on experience with PySpark or Spark using Scala
- Strong knowledge of SQL for data querying and transformation
- Experience working with any cloud platform (AWS, Azure, or GCP)
- A solid understanding of data warehousing concepts and big data architecture
- Experience with version control systems like Git

Good-to-Have Skills:
- Experience with data orchestration tools like Apache Airflow, Databricks Workflows, or similar
- Knowledge of Delta Lake, HDFS, or Kafka
- Familiarity with containerization tools such as Docker/Kubernetes
- Exposure to CI/CD practices and DevOps principles
- An understanding of data governance, security, and compliance standards

If you meet the qualifications and are interested in this exciting opportunity, please share your details via email at nitin.patil@ust.com for quick processing. Act fast for immediate consideration!

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Faridabad, Haryana

On-site

The project is a dynamic solution empowering companies to optimize promotional activities for maximum impact. It collects and validates data, analyzes promotion effectiveness, plans calendars, and integrates seamlessly with existing systems. The tool enhances vendor collaboration, supports negotiating better deals, and employs machine learning to optimize promotional plans, enabling companies to make informed decisions and maximize return on investment.

Technology stack: Scala, Go, Docker, Kubernetes, Databricks, with Python as an optional skill.
Working time zone: EU. Specialty: Data Science. Experience: 5+ years. English: Upper-Intermediate.

Key soft skills:
- A problem-solving style valued over raw experience
- Ability to clarify requirements with the customer
- Willingness to pair with other engineers when solving complex issues
- Good communication skills

Essential hard skills:
- Experience in Scala and/or Go for designing and building scalable, high-performing applications
- Containerization and microservices orchestration using Docker and Kubernetes
- Building data pipelines and ETL solutions using Databricks
- Data storage and retrieval with PostgreSQL and Elasticsearch
- Deploying and maintaining solutions in the Azure cloud environment
- Experience in Python is a nice-to-have

Responsibilities and tasks:
- Develop and maintain distributed systems using Scala and/or Go
- Work with Docker and Kubernetes for containerization and microservices orchestration
- Build data pipelines and ETL solutions using Databricks
- Work with PostgreSQL and Elasticsearch for data storage and retrieval
- Deploy and maintain solutions in the Azure cloud environment

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Carmatec is looking for passionate DevOps Engineers to be a part of our InstaCarma team. Not only will you have the chance to make your mark as an established DevOps Engineer, but you will also get to work and interact with seasoned professionals deeply committed to revolutionizing the Cloud scenario.

Job Responsibilities:
- Work on infrastructure provisioning/configuration management tools. We use Packer, Terraform and Chef.
- Develop automation tools/scripts. We use Bash/Python/Ruby.
- Be responsible for continuous integration and artifact management. We use Jenkins and Artifactory.
- Set up automated deployment pipelines for microservices running as Docker containers.
- Set up monitoring, alerting and metrics scraping for Java/Scala/Play applications using Prometheus and Graylog2, integrated with PagerDuty and HipChat for alerting, reporting and monitoring.
- Do on-call production support and related incident management, reporting and postmortems.
- Create runbooks and wikis for incidents, troubleshooting performed, etc.
- Be a proactive member of your team by sharing knowledge.
- Resource scheduling and orchestration using Mesos/Marathon.
- Work closely with development teams to ensure that platforms are designed with operability in mind.
- Function well in a fast-paced, rapidly changing environment.

Required Skills:
- A basic understanding of DevOps tools and automation frameworks.
- Outstanding organization, documentation, and communication skills.
- Must be skilled in Linux system administration (Ubuntu/CentOS).
- Knowledge of AWS is a must (EC2, EBS, S3, Route53, CloudFront, SG, IAM, RDS, etc.).
- Strong foundation in Docker internals and troubleshooting.
- Should know at least one configuration management tool: Chef/Ansible/Puppet.
- Good to have experience in at least one scripting language: Bash/Python/Ruby.
- Experience in at least one NoSQL database system is a plus: Elasticsearch/MongoDB/Redis/Cassandra.
- Experience in a CI tool like Jenkins is preferred.
- Good understanding of how a 3-tier architecture works.
- Basic knowledge of revision control tools like Git/Subversion.
- Experience working with monitoring tools like Nagios, New Relic, etc.
- Proficiency in log management using tools like rsyslog, Logstash, etc.
- Working knowledge of the following: cron, HAProxy/nginx, LVM, MySQL, BIND (DNS), iptables.
- Experience with Atlassian tools (Jira, HipChat, Confluence) will be a plus.

Experience: 5+ years
Location: Bangalore

If the above description is of interest, please send your updated resume to teamhr@carmatec.com.

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About Us
VE3 is at the forefront of delivering cloud-native data solutions to premier clients across finance, retail and healthcare. As a rapidly growing UK-based consultancy, we pride ourselves on fostering a collaborative, inclusive environment where every voice is heard and every idea can become tomorrow's breakthrough.

Role: Database Designer / Senior Data Engineer

What You'll Do
- Architect & Design: Lead the design of modern, scalable data platforms on AWS and/or Azure, using best practices for security, cost-optimisation and performance. Develop detailed data models (conceptual, logical, physical) and document data dictionaries and lineage.
- Build & Optimize: Implement robust ETL/ELT pipelines using Python, SQL, Scala (as appropriate), leveraging services such as AWS Glue, Azure Data Factory, and open-source frameworks (Spark, Airflow). Tune data stores (RDS, SQL Data Warehouse, NoSQL like Redis) for throughput, concurrency and cost. Establish real-time data streaming solutions via AWS Kinesis, Azure Event Hubs or Kafka (see the sketch after this section).
- Collaborate & Deliver: Work closely with data analysts, BI teams and stakeholders to translate business requirements into data solutions and dashboards. Partner with DevOps/Cloud Ops to automate CI/CD for data code and infrastructure (Terraform, CloudFormation).
- Governance & Quality: Define and enforce data governance, security and compliance standards (GDPR, ISO 27001). Implement monitoring, alerting and data quality frameworks (Great Expectations, AWS CloudWatch).
- Mentor & Innovate: Act as a technical mentor for junior engineers; run brown-bag sessions on new cloud services or data-engineering patterns. Proactively research emerging big-data and streaming technologies to keep our toolset cutting-edge.

Who You Are
- Academic background: Bachelor's (or higher) in Computer Science, Engineering, IT or similar.
- Experience: 3+ years in a hands-on Database Designer / Data Engineer role, ideally within a cloud environment.
- Technical skills:
  - Languages: Expert in SQL; strong Python or Scala proficiency.
  - Cloud services: At least one of AWS (Glue, S3, Kinesis, RDS) or Azure (Data Factory, Data Lake Storage, SQL Database).
  - Data modelling: Solid understanding of OLTP vs OLAP, star/snowflake schemas, and normalization/denormalization trade-offs.
  - Pipeline tools: Familiarity with Apache Spark, Kafka, Airflow or equivalent.
- Soft skills: Excellent communicator, able to present complex technical designs in clear, non-technical terms. Strong analytical mindset; thrives on solving performance bottlenecks and scaling challenges. Team player with a collaborative attitude in agile/scrum settings.

Nice to Have
- Certifications: AWS Certified Data Analytics – Specialty, Azure Data Engineer Associate/Expert.
- Exposure to data-science workflows (Jupyter, ML pipelines).
- Experience with containerized workloads (Docker, Kubernetes) for data processing.
- Familiarity with DataOps practices and tools (dbt, Great Expectations, Terraform).

Our Commitment to Diversity
We're an equal-opportunity employer committed to inclusive hiring. All qualified applicants, regardless of ethnicity, gender identity, sexual orientation, neurodiversity, disability status or veteran status, are encouraged to apply.
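As a sketch of the real-time streaming work described above, a minimal Spark Structured Streaming job in Scala that consumes from Kafka; the broker, topic, sink and checkpoint path are placeholders, not details from the posting:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object ClickstreamStream {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("clickstream").getOrCreate()

    // Kafka broker and topic are placeholders for illustration.
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "clickstream")
      .load()
      .selectExpr("CAST(value AS STRING) AS json", "timestamp")

    // Count events per minute; the watermark bounds state kept for late data.
    val counts = events
      .withWatermark("timestamp", "5 minutes")
      .groupBy(window(col("timestamp"), "1 minute"))
      .count()

    counts.writeStream
      .outputMode("update")
      .format("console") // swap for a Delta/Parquet sink in a real pipeline
      .option("checkpointLocation", "/tmp/chk/clickstream")
      .start()
      .awaitTermination()
  }
}
```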

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

As an Akka & Scala Engineer based in Pune, you will be an integral part of our client's backend engineering team. With 4-8 years of experience, you will play a key role in designing, developing, and implementing distributed, resilient, and reactive systems using Akka HTTP, Akka Streams, and the Akka Actor Model.

Your responsibilities will include designing and developing backend services and APIs with Scala and Akka, architecting actor-based systems, and leveraging Akka Streams for real-time data processing. You will collaborate with cross-functional teams, participate in design discussions, and ensure the high availability and performance of the system.

To excel in this role, you must possess strong experience in Scala, a deep understanding of the Akka toolkit, and proficiency in building REST APIs using Akka HTTP. Experience with reactive programming, microservices architecture, SQL/NoSQL databases, and tools like Docker, Kubernetes, and CI/CD pipelines is essential. Additionally, familiarity with Akka Persistence, Kafka, monitoring tools, and cloud platforms is a plus.

Apart from technical skills, we value soft skills such as strong communication, problem-solving abilities, and a willingness to learn and adapt to new technologies. By joining us, you will have the opportunity to work with top-tier product and engineering teams, tackle complex technical challenges, and be part of a collaborative work culture. If you are passionate about leveraging your expertise in Akka and Scala to build cutting-edge systems and thrive in a dynamic environment, we encourage you to apply for this rewarding opportunity.
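A minimal sketch of the kind of Akka HTTP REST endpoint this role calls for, assuming the current Akka HTTP server API (newServerAt); the route, port and payload are illustrative only:

```scala
import akka.actor.typed.ActorSystem
import akka.actor.typed.scaladsl.Behaviors
import akka.http.scaladsl.Http
import akka.http.scaladsl.server.Directives._

object UserService {
  def main(args: Array[String]): Unit = {
    // A typed ActorSystem also provides the classic system Akka HTTP needs.
    implicit val system: ActorSystem[Nothing] = ActorSystem(Behaviors.empty, "user-service")

    // Route shape only; a real service would back this with actors or a repository.
    val route =
      pathPrefix("users" / Segment) { userId =>
        get {
          complete(s"""{"id":"$userId","status":"active"}""")
        }
      }

    Http().newServerAt("0.0.0.0", 8080).bind(route)
    println("Server online at http://localhost:8080/")
  }
}
```

In an actor-based design the route would typically `ask` a typed actor and complete the request from its reply, keeping the HTTP layer thin.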

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Engineer, you will be responsible for designing, developing, and maintaining scalable data pipelines using Spark (PySpark or Spark with Scala). Your role will involve building data ingestion and transformation frameworks for structured and unstructured data sources. Collaborating with data analysts, data scientists, and business stakeholders to understand requirements and deliver reliable data solutions will be a key aspect of your responsibilities. Additionally, you will work with large volumes of data to ensure quality, integrity, and consistency, optimizing data workflows for performance, scalability, and cost efficiency on cloud platforms such as AWS, Azure, or GCP. Implementing data quality checks and automation for ETL/ELT pipelines, monitoring and troubleshooting data issues in production, and documenting technical processes, system designs, and operational procedures are also part of your duties.

To excel in this role, you should have:
- At least 3 years of experience as a Data Engineer or in a similar role
- Hands-on experience with PySpark or Spark using Scala
- Strong knowledge of SQL for data querying and transformation
- Experience working with any cloud platform (AWS, Azure, or GCP)
- A solid understanding of data warehousing concepts and big data architecture
- Familiarity with version control systems like Git

While not mandatory, it would be beneficial to have:
- Experience with data orchestration tools like Apache Airflow, Databricks Workflows, or similar
- Knowledge of Delta Lake, HDFS, or Kafka
- Familiarity with containerization tools such as Docker or Kubernetes
- Exposure to CI/CD practices and DevOps principles
- An understanding of data governance, security, and compliance standards

If you are ready to join immediately and possess the required skills and experience, please share your details via email at nitin.patil@ust.com. Act fast for immediate consideration!

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Greater Kolkata Area

On-site

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-Have Skills: Databricks Unified Data Analytics Platform, Microsoft Azure Databricks, PySpark
Good-to-Have Skills: NA
Minimum Experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will engage in problem-solving discussions, contribute to the overall project strategy, and adapt to evolving requirements while maintaining a focus on delivering high-quality applications that align with business objectives.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in continuous learning to stay updated with the latest technologies and methodologies.

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform, Microsoft Azure Databricks, PySpark.
- Strong understanding of data integration techniques and ETL processes.
- Experience with cloud-based data storage solutions and data management.
- Familiarity with programming languages such as Python or Scala.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 3 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Kolkata office.
- 15 years of full-time education is required.
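Since the role centres on Databricks, here is a minimal Delta Lake upsert sketch in Scala (the posting's stack also lists PySpark; Scala is used here for consistency with this page). The paths and join key are hypothetical:

```scala
import io.delta.tables.DeltaTable
import org.apache.spark.sql.SparkSession

object DeltaUpsert {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("delta-upsert").getOrCreate()

    // Paths and the join key are placeholders for illustration.
    val updates = spark.read.parquet("/mnt/landing/customers/")
    val target = DeltaTable.forPath(spark, "/mnt/curated/customers/")

    // MERGE keeps the curated table idempotent when the same batch is replayed.
    target.as("t")
      .merge(updates.as("s"), "t.customer_id = s.customer_id")
      .whenMatched().updateAll()
      .whenNotMatched().insertAll()
      .execute()
  }
}
```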

Posted 2 weeks ago

Apply

6.0 - 11.0 years

9 - 19 Lacs

Bengaluru

Hybrid

Lead: 6-8 years

- Focus on production cost for the techniques and features
- Mentor the team on benchmarking costs and performance KPIs
- Guard the team's focus on objectives
- Advanced proficiency in Python and/or Scala for data engineering tasks
- Proficiency in PySpark and Scala Spark for distributed data processing, with hands-on experience in Azure Databricks
- Expertise in Azure Databricks for data engineering, including Delta Lake, MLflow, and cluster management
- Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and their big data and data warehousing services (e.g., Azure Data Factory, AWS Redshift)
- Expertise in data warehousing platforms such as Snowflake, Azure Synapse Analytics, or Redshift, including schema design, ETL/ELT processes, and query optimization
- Experience with the Hadoop ecosystem (HDFS, Hive, HBase, etc.) and Apache Airflow for workflow orchestration and scheduling
- Advanced knowledge of SQL for data warehousing and analytics, with experience in NoSQL databases (e.g., MongoDB) as a plus
- Experience with version control systems (e.g., Git) and CI/CD pipelines
- Familiarity with Java or other programming languages is a plus

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra, India

On-site

The Applications Development Senior Manager is a senior management level position responsible for accomplishing results through the management of a team or department, in an effort to establish and implement new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to drive applications systems analysis and programming activities.

Responsibilities:
- Manage one or more Applications Development teams to accomplish established goals, and conduct personnel duties for the team (e.g. performance evaluations, hiring and disciplinary actions)
- Utilize in-depth knowledge and skills across multiple Applications Development areas to provide technical oversight across systems and applications
- Review and analyze proposed technical solutions for projects
- Contribute to the formulation of strategies for applications development and other functional areas
- Develop comprehensive knowledge of how areas of business integrate to accomplish business goals
- Provide evaluative judgment based on analysis of factual data in complicated and unique situations
- Impact the Applications Development area through monitoring delivery of end results, participating in budget management, and handling day-to-day staff management issues, including resource management and allocation of work within the team/project
- Ensure essential procedures are followed and contribute to defining standards, negotiating with external parties when necessary
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency, as well as effectively supervising the activity of others and creating accountability with those who fail to maintain these standards

Core Responsibilities:
This role is for a Data Engineering Tech Lead to work on the Vanguard Big Data Platform. The team is responsible for the design, architecture, development, and maintenance/support of leading Big Data initiatives and use cases providing business value.
- Interface with product teams to understand their requirements to build the ingestion pipelines and conformance layer for consumption by business.
- Work closely with the data ingestion team to track the requirements and drive the build-out of the canonical models.
- Provide guidance to the data conformance team for implementing requirements, changes, and enhancements to the conformance model.
- Do hands-on technical development as part of the conformance team to deliver the business requirements.
- Manage the workload of the team and the scrum process to align it with the objectives and priorities of the product owners.
- Participate in data management activities related to Risk and Regulatory requirements as needed.

Core Skills:
The Data Engineering Lead will be working very closely with, and managing the work of, a team of data engineers working on our Big Data Platform. The lead will need the below core skills:
- Strong, solid understanding of the Big Data architecture and the ability to troubleshoot performance and/or development issues on Hadoop (preferably Cloudera)
- Hands-on experience working with Hive, Impala, Kafka, HBase, and Spark for data curation/conformance related work
- Strong proficiency in Spark for development work related to curation/conformance; strong Scala development (with previous Java background) preferred
- Experience with Spark/Kafka or equivalent streaming/batch processing and event-based messaging
- Strong data analysis skills and the ability to slice and dice the data as needed for business reporting
- Experience working in an agile environment with fast-paced, changing requirements
- Excellent planning and organizational skills
- Strong communication skills

Additional Requirements (nice to have):
- Cloudera/Hortonworks/AWS EMR, Couchbase, S3 experience a plus
- Experience with cloud integration on AWS, Snowflake, Couchbase, or GCP tech stack components
- Relational SQL (Oracle, SQL Server) and NoSQL (MongoDB) database integration and data distribution principles experience
- Experience with API development and use of JSON/XML/Hypermedia data formats
- Analysis and development across lines of business products/functions, including Payments, Digital Channels, Liquidities, Trade, Sales, Pricing, and Client Experience, with cross-training and functional and/or technical knowledge
- Alignment to Engineering Excellence development principles and standards

Qualifications:
- 8-12 years of relevant experience in the Financial Services industry
- Experience as an Applications Development Manager
- Experience at a senior level in an Applications Development role
- Stakeholder and people management experience
- Demonstrated leadership skills
- Proven project management skills
- Basic knowledge of industry practices and standards
- Consistently demonstrates clear and concise written and verbal communication

Education:
- Bachelor's degree/University degree or equivalent experience
- Master's degree preferred

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

Job Family Group: Technology
Job Family: Applications Development
Time Type: Full time
Most Relevant Skills: Please see the requirements listed above.
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law.

If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
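As a rough illustration of the curation/conformance work described, a minimal Spark-on-Hive sketch in Scala; the table and column names are invented for the example and are not the platform's actual data model:

```scala
import org.apache.spark.sql.SparkSession

object ConformanceJob {
  def main(args: Array[String]): Unit = {
    // Hive support lets Spark read the warehouse tables the ingestion layer lands.
    val spark = SparkSession.builder()
      .appName("conformance")
      .enableHiveSupport()
      .getOrCreate()

    // Hypothetical raw table; a conformance step normalizes types and codes.
    val trades = spark.table("raw.trades")
    val conformed = trades.selectExpr(
      "trade_id",
      "upper(currency) AS currency_code",
      "cast(notional AS decimal(18,2)) AS notional"
    )

    conformed.write.mode("overwrite").saveAsTable("conformed.trades")
    spark.stop()
  }
}
```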

Posted 2 weeks ago

Apply

5.0 - 8.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Role Purpose
The purpose of this role is to design, test and maintain software programs for operating systems or applications to be deployed at a client end, and to ensure they meet 100% quality assurance parameters.

Responsibilities:

1. Be instrumental in understanding the requirements and design of the product/software:
- Develop software solutions by studying information needs, systems flow, data usage and work processes
- Investigate problem areas throughout the software development life cycle
- Facilitate root cause analysis of system issues and problem statements
- Identify ideas to improve system performance and impact availability
- Analyze client requirements and convert them into feasible designs
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements
- Confer with project managers to obtain information on software capabilities

2. Perform coding and ensure optimal software/module development:
- Determine operational feasibility by evaluating analysis, problem definition, requirements, software development and proposed software
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing them
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces
- Analyze information to recommend and plan the installation of new systems or modifications of an existing system
- Ensure that code is error-free, with no bugs or test failures
- Prepare reports on programming project specifications, activities and status
- Ensure all issues are raised as per the norms defined for the project/program/account, with clear descriptions and replication patterns
- Compile timely, comprehensive and accurate documentation and reports as requested
- Coordinate with the team on daily project status and progress, and document it
- Provide feedback on usability and serviceability, trace results to quality risk, and report to concerned stakeholders

3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution:
- Capture all requirements and clarifications from the client for better-quality work
- Take feedback on a regular basis to ensure smooth and on-time delivery
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements
- Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code
- Document necessary details and reports formally for proper understanding of the software, from client proposal to implementation
- Ensure good quality of interaction with the customer w.r.t. email content, fault report tracking, voice calls, business etiquette, etc.
- Respond to customer requests in a timely manner, with no instances of complaints either internally or externally

Mandatory Skills: Scala programming
Experience: 5-8 years

Posted 2 weeks ago

Apply

4.0 - 6.0 years

6 - 10 Lacs

Chennai

Work from Office

As a Senior Cloud Data Platform (AWS) Specialist at Incedo, you will be responsible for designing, deploying and maintaining cloud-based data platforms on AWS. You will work with data engineers, data scientists and business analysts to understand business requirements and design scalable, reliable and cost-effective solutions that meet those requirements.

Roles & Responsibilities:
- Designing, developing and deploying cloud-based data platforms using Amazon Web Services (AWS)
- Integrating and processing large amounts of structured and unstructured data from various sources
- Implementing and optimizing ETL processes and data pipelines
- Developing and maintaining security and access controls
- Collaborating with other teams to ensure the consistency and integrity of data
- Troubleshooting and resolving data platform issues

Technical Skills Requirements:
- In-depth knowledge of AWS services and tools such as AWS Glue, AWS Redshift, and AWS Lambda
- Experience in building scalable and reliable data pipelines using AWS services, Apache Spark, and related big data technologies
- Familiarity with cloud-based infrastructure and deployment, specifically on AWS
- Strong knowledge of programming languages such as Python, Java, and SQL
- Excellent communication skills; able to communicate complex technical information to non-technical stakeholders in a clear and concise manner
- Understanding of, and alignment with, the company's long-term vision
- Ability to provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team

Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred

Posted 2 weeks ago

Apply

5.0 - 8.0 years

3 - 7 Lacs

Hyderabad

Work from Office

Long Description
- Experience and expertise in at least one of the following languages: Java, Scala, Python
- Experience and expertise in Spark architecture
- Experience in the range of 6-10+ years
- Good problem-solving and analytical skills
- Ability to comprehend business requirements and translate them into technical requirements
- Good communication and collaborative skills with the team and across vendors
- Familiarity with the development life cycle, including CI/CD pipelines
- Proven experience in, and interest in supporting, existing strategic applications
- Familiarity working with agile methodology

Mandatory Skills: Scala programming
Experience: 5-8 years

Posted 2 weeks ago

Apply

5.0 - 8.0 years

6 - 10 Lacs

Pune

Hybrid

Mandatory Skills: Cloud-PaaS-GCP-Google Cloud Platform
Position: Cloud Data Engineer
Experience Required: 5-8 years (additional openings: 8-13 years)
Work Location: Wipro, PAN India
Work Arrangement: Hybrid model with 3 days in a Wipro office

Job Description:
- Strong expertise in SQL
- Proficient in Python
- Excellent knowledge of any cloud technology (AWS, Azure, GCP, etc.), with a preference for GCP
- Familiarity with PySpark is preferred

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Senior Software Engineer - Data

We're seeking a Senior Software Engineer or a Lead Software Engineer to join one of our Data Layer teams. As the name implies, the Data Layer is at the core of all things data at Zeta. Our responsibilities include:
- Developing and maintaining the Zeta Identity Graph platform, which collects billions of behavioural, demographic, location and transactional signals to power people-based marketing.
- Ingesting vast amounts of identity and event data from our customers and partners.
- Facilitating data transfers across systems.
- Ensuring the integrity and health of our datasets.
- And much more.

As a member of this team, the data engineer will be responsible for designing and expanding our existing data infrastructure, enabling easy access to data, supporting complex data analyses, and automating optimization workflows for business and marketing operations.

Essential Responsibilities:
- Building, refining, tuning, and maintaining our real-time and batch data infrastructure
- Daily use of technologies such as Spark, Airflow, Snowflake, Hive, Scylla, Django, FastAPI, etc.
- Maintaining data quality and accuracy across production data systems
- Working with Data Engineers to optimize data models and workflows
- Working with Data Analysts to develop ETL processes for analysis and reporting
- Working with Product Managers to design and build data products
- Working with our DevOps team to scale and optimize our data infrastructure
- Participating in architecture discussions, influencing the road map, and taking ownership and responsibility over new projects
- Participating in the on-call rotation in their respective time zone (be available by phone or email in case something goes wrong)

Desired Characteristics:
- Minimum 5-10 years of software engineering experience
- Proven long-term experience with, and enthusiasm for, distributed data processing at scale; eagerness to learn new things
- Expertise in designing and architecting distributed, low-latency, scalable solutions in either cloud or on-premises environments
- Exposure to the whole software development lifecycle, from inception to production and monitoring
- Fluency in Python, or solid experience in Scala or Java
- Proficiency with relational databases and advanced SQL
- Expert in the usage of services like Spark and Hive
- Experience with web frameworks such as Flask or Django
- Experience with a scheduler such as Apache Airflow, Apache Luigi, Chronos, etc.
- Experience with Kafka or other stream message processing solutions
- Experience using cloud services (AWS) at scale
- Experience with agile software development processes
- Excellent interpersonal and communication skills

Nice to have:
- Experience with large-scale / multi-tenant distributed systems
- Experience with columnar / NoSQL databases: Vertica, Snowflake, HBase, Scylla, Couchbase
- Experience with real-time streaming frameworks: Flink, Storm
- Experience with open table formats such as Iceberg, Hudi or Delta Lake

Posted 2 weeks ago

Apply

12.0 - 18.0 years

40 - 75 Lacs

Bengaluru

Hybrid

- Backend applications using Java/J2EE, RESTful web services, HTTP and JSON
- 5+ years in a techno-managerial role
- Expertise in Python and Java, with a deep understanding of their ecosystems and frameworks
- Expertise with Node.js / JavaScript / Scala

Posted 2 weeks ago

Apply

5.0 - 10.0 years

30 - 32 Lacs

Pune

Hybrid

Let me tell you about the role
We are looking for an Information Security Engineering Specialist with great knowledge of security fundamentals who is eager to apply them in complex environments. In this role, you will assist in implementing security controls, executing vulnerability assessments, and supporting automation initiatives. This position will have an emphasis on one or more of the following areas: cloud security, infrastructure security, and/or data security. You will have an opportunity to learn and grow under the mentorship of senior engineers, while also contributing to critical security tasks that keep our organization safe.

What you will deliver:
- Define security policies that can be used to improve our cloud, infrastructure or data security posture.
- Integrate our vulnerability assessment tooling into our environments to provide continuous scans, uncovering vulnerabilities, misconfigurations or potential security gaps.
- Work with engineering teams to support the remediation and validation of vulnerability mitigations and fixes.
- Integrate security validations into continuous integration/continuous delivery (CI/CD) pipelines and develop scripts to automate security tasks.
- Maintain clear, detailed documentation of security procedures and policies, including how to embed and measure security in our cloud, infrastructure or data environments.

What you will need to be successful (experience and qualifications):
- Seasoned security professional with 3+ years delivering security engineering services and/or building security solutions within a complex organization.
- Practical experience designing, planning, productizing, maintaining and documenting reliable and scalable data, infrastructure, cloud and/or platform solutions in complex environments.
- Firm foundation in information and cyber security principles and standard processes.
- Professional and technical security certifications such as CISSP, CISM, GEVA, CEH, OSCP or equivalent are a plus.
- Development experience in one or more object-oriented programming languages (e.g., Python, Scala, Java, C#) and/or cloud environments (including AWS, Azure, Alibaba, etc.).
- Exposure to, or experience with, full-stack development.
- Experience with security tooling (vulnerability scanners, CNAPP, endpoint and/or DLP) and with automation and scripting for security tasks (e.g., CI/CD integration).
- Familiarity with basic security frameworks such as NIST CSF, NIST 800-53, ISO 27001, etc.
- Foundational knowledge of security standards, industry laws, and regulations such as the Payment Card Industry Data Security Standard (PCI-DSS), General Data Protection Regulation (GDPR), California Consumer Privacy Act (CCPA) and Sarbanes-Oxley (SOX).
- A continuous learning and improvement approach.

This position is a hybrid of office/remote working.

Posted 2 weeks ago

Apply

7.0 - 12.0 years

9 - 12 Lacs

Bengaluru

Work from Office

Responsibilities:
* Design, develop, test and maintain Scala applications using Spark.
* Collaborate with cross-functional teams on project delivery.
* Optimize application performance through data analysis.

Posted 2 weeks ago

Apply

2.0 - 7.0 years

15 - 30 Lacs

Bengaluru

Hybrid

Role & responsibilities
- Design, deliver, and maintain significant features in data pipelines, ML processing, and/or service infrastructure
- Optimize software performance to achieve the required throughput and/or latency
- Work with your manager, peers, and Product Managers to scope projects and features
- Come up with a sound technical strategy, taking into consideration the project goals, timelines, and expected impact
- Take point on some cross-team efforts, taking ownership of a business problem and ensuring the different teams are in sync and working towards a coherent technical solution
- Take an active part in knowledge sharing across the organization, both teaching and learning from others

Requirements:
- 2+ years of software design and development experience, tackling non-trivial problems in backend services and/or data pipelines
- A solid foundation in data structures, algorithms, object-oriented programming, software design, and core statistics
- Experience in production-grade coding in Java, and Python/Scala
- Experience in the close examination of data and computation of statistics
- Experience in using and operating big data processing pipelines, such as Hadoop and Spark
- Good verbal and written communication and collaboration skills

Posted 2 weeks ago

Apply

10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About The Role
Grade Level (for internal use): 10
Role: Sr. React Fullstack Developer

The Team: C&RS (Credit & Risk Solutions) is part of the Market Intelligence group within S&P Global. Financial Risk Analytics (FRA) delivers information-centric capital markets and risk solutions for trading desks and their risk business partners, supporting risk regulatory compliance. The UI products cover counterparty credit risk, xVA and market risk for both buy- and sell-side firms. We are currently investing in our technology and data platform to develop a number of new revenue-generating products, leveraging open-source, big data and cloud technologies. This role is for a software developer within the FRA software engineering team, building React (TypeScript) UI applications and services and working with databases/cloud.

Responsibilities:
- Design and implement UI applications and services.
- Participate in system architecture and design decisions.
- Continuously improve development and testing best practices.
- Interpret and analyse business use cases and translate feature requests into technical designs and development tasks.
- Take ownership of development tasks and participate in regular design and code review meetings.
- Be delivery-focused and keen to participate in the successful implementation and evolution of technology products, in close coordination with product managers and colleagues.

Basic Qualifications:
- Bachelor's degree in Computer Science, Applied Mathematics, Engineering, or a related discipline, or equivalent experience
- 10+ years of strong software development experience
- React, TypeScript/JS (ES6)
- Node.js (Express)
- Experience with SQL relational databases such as PostgreSQL
- Demonstrable experience using RESTful APIs in a production setting
- Test frameworks (e.g. Jest, Jasmine, Playwright)
- Understanding of CI/CD pipelines
- Linux/Unix, Git
- Agile and XP (Scrum, Kanban, TDD)

Desirable:
- Highcharts, DevExtreme, TanStack React components, Bootstrap, HTML5
- Understanding and implementation of security and data protection
- GitLab, containerization platforms
- AWS: CLI, CloudFront, Cognito, S3
- Python, Java/Scala

What's In It For You:
- You can effectively manage timelines and enjoy working within a team
- You can follow relevant technology trends, actively evaluate new technologies, and use this information to improve the product
- You get a lot of satisfaction from on-time delivery
- Happy clients are important to you
- You take pride in your work

Competencies:
- You love to solve complex problems, whether that's making the user experience as responsive as possible or understanding complex client requirements
- You can confidently present your own ideas and solutions, as well as guide technical discussions
- Your welcoming attitude encourages people to approach you when they have a problem you can help them solve

About S&P Global Market Intelligence
At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

Our Purpose
Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People
We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits
We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global. Our benefits include:
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert
If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, "pre-employment training" or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.

Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning)

Job ID: 284397
Posted On: 2025-07-18
Location: Gurgaon, India

Posted 2 weeks ago

Apply

5.0 years

15 - 18 Lacs

Goregaon, Maharashtra, India

Remote

Business Intelligence Developer – Mumbai (Goregaon East) 27165
Work Mode: Hybrid (4 days office, 1 day WFH)
Shift Timings: 12:30 PM – 9:30 PM
Location: Goregaon East, Nesco (max 1-hour commute preferred)
Interview: 2 rounds, in person

Responsibilities:
- Design and develop ETL pipelines integrating diverse data sources into BI environments.
- Develop dashboards and reports using Microsoft Power BI, SSRS, and other BI tools.
- Ensure data quality, maintain the data catalog/dictionary, and support data marts/lakes.
- Collaborate with business partners to understand needs and translate them into BI solutions.
- Lead development and maintenance of complex BI dashboards and reports.
- Provide user training and support adoption of BI tools.
- Proactively identify opportunities for business growth, risk mitigation, and efficiency.
- Support Microsoft BI platform technologies and innovate solutions for scalability and reuse.

Must-Have Skills & Experience:
- 5-7+ years working with the Microsoft BI platform: SQL Server DB, SSIS, SSRS, SSAS, Power BI, Azure cloud services.
- Strong experience building and maintaining large-scale data integration and ETL processes.
- Proficiency in data warehouse architecture, data modeling, and dashboard/report development.
- Expertise in optimizing data integration routines and database design.
- Excellent communication and documentation skills.
- Ability to work independently in a fast-paced environment.

Nice-to-Haves:
- Experience with other BI tools like QlikView, Tableau, MicroStrategy, or open-source reporting.
- Cloud-based data platforms (Azure, AWS, Snowflake).
- DevOps experience and CI/CD deployment knowledge.
- Experience with data lakes and Power BI Report Server administration.
- Knowledge of analytics tools like R, Python, Scala, SAS.

Posted 2 weeks ago

Apply

10.0 - 15.0 years

35 - 40 Lacs

Bengaluru

Work from Office

About AQR Capital Management
AQR is a global investment management firm built at the intersection of financial theory and practical application. We strive to deliver superior, long-term results for our clients by seeking to filter out market noise to identify and isolate what matters most, and by developing ideas that stand up to rigorous testing. Underpinning this philosophy is an unrelenting commitment to excellence in the technology powering our insights and analysis. This unique combination has made us leaders in alternative and traditional strategies, with more than $125B in assets under management.

The Team
Our Bengaluru office is a key component of our global engineering strategy. Our software engineers work in research, portfolio implementation, trading, and enterprise engineering teams. The Quantitative Research Development (QRD) team partners closely with business teams to build the quant models, infrastructure, applications and tools that power our quantitative research and quantitative investment process. The Portfolio Implementation team is part of the QRD team.

Your Role
As a Tech Lead in the Portfolio Implementation Engineering team, you will design and develop:
- A global asset risk estimation system incorporating large amounts of data
- A high-performance historical simulation engine
- Portfolio construction systems for our quantitative investment strategies
- Portfolio optimization systems to incorporate real-world constraints on research strategies
- Solutions to implement business processes that rebalance the portfolios based on quantitative models and interface with trading systems to generate orders

You will partner with local and global teams of engineers and researchers for successful product delivery. You will be expected to lead initiatives in both technology transformation and business-driven projects, make significant individual contributions, and guide and mentor junior team members.

What You'll Bring
- Bachelor's/Master's/PhD in Computer Science, Engineering, or a related discipline
- 10+ years of software development experience
- Expertise in the Java programming language
- Outstanding coding, debugging, and analytical skills
- Experience in design and architecture, including object-oriented design, distributed systems, cloud-native applications and microservices
- Ability to lead technology initiatives through the development lifecycle
- Ability to manage multiple workstreams with task allocation, execution and monitoring
- Ability to manage teams and guide team members
- Experience working with cloud technologies and containers would be a plus
- Knowledge of other programming languages (Python, C++, Go, Scala) would be a plus
- Knowledge and experience of finance is desirable
- Excellent communication skills, both verbal and written
- Willingness to learn and work on new technologies and domain concepts

Who You Are
- Mature, thoughtful, and a natural fit for a collaborative, team-oriented culture
- Hard-working and eager to learn in a fast-paced, innovative environment
- Committed to intellectual integrity, transparency, and openness
- Motivated by the transformational effects of technology at scale

Posted 2 weeks ago

Apply

6.0 - 11.0 years

8 - 14 Lacs

Bengaluru

Work from Office

GCP cloud architecture. Knowledge of the model deployment lifecycle, including creating training and serving pipelines. Familiarity with at least one workflow orchestrator: Kubeflow, Airflow, MLflow, Argo, etc. Strong in Python; adequate SQL skills.

Must-have skills: Python, SQL, ML Engineering (model deployment/MLOps), ML pipelines (Kubeflow, Airflow, MLflow, Argo, etc.)
Preferred skills: PyTorch, TensorFlow, experience with a hyperscaler/cloud service, deep learning frameworks

Posted 2 weeks ago

Apply