8.0 - 12.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
The Applications Development Senior Programmer Analyst position is an intermediate-level role in which you will participate in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. Your main objective will be to contribute to applications systems analysis and programming activities.

Your responsibilities will include conducting feasibility studies, time and cost estimates, IT planning, risk technology, applications development, and model development, and establishing and implementing new or revised applications systems and programs to meet specific business needs or user areas. You will also monitor and control all phases of the development process, provide user and operational support on applications to business users, and recommend and develop security measures in post-implementation analysis.

As the Applications Development Senior Programmer Analyst, you will use in-depth specialty knowledge of applications development to analyze complex problems and issues, evaluate business and system processes, recommend advanced programming solutions, and ensure that essential procedures are followed. Additionally, you will serve as an advisor or coach to new or lower-level analysts, operate with a limited level of direct supervision, and act as a subject matter expert to senior stakeholders and other team members.

To qualify for this role, you should have 8-12 years of relevant experience in systems analysis and programming of software applications, experience managing and implementing successful projects, and working knowledge of consulting/project management techniques and methods. You should also be able to work under pressure, manage deadlines, and adapt to unexpected changes in expectations or requirements. A Bachelor's degree or equivalent experience is required for this position.
In addition to the general job description, the ideal candidate should have 8 to 12 years of application development experience through the full lifecycle, with expertise in UI architecture patterns such as Micro Frontend and NX. Proficiency in Core Java/J2EE applications, data structures, algorithms, Hadoop, the MapReduce framework, Spark, YARN, and other relevant technologies is essential. Experience with the Big Data Spark ecosystem, ETL, BI tools, agile environments, test-driven development, and optimizing software solutions for performance and stability is also preferred. This job description provides an overview of the responsibilities and qualifications for the Applications Development Senior Programmer Analyst role; other job-related duties may be assigned as required.
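The posting above lists the MapReduce framework among its required technologies. As a minimal illustration of the programming model, the classic word-count job can be sketched in plain Python (no Hadoop required); the three functions mirror the map, shuffle, and reduce phases, and all names here are illustrative:

```python
from collections import defaultdict

def map_phase(docs):
    # Emit (word, 1) pairs, as a Hadoop mapper would.
    for doc in docs:
        for word in doc.lower().split():
            yield word, 1

def shuffle_phase(pairs):
    # Group values by key, as the framework's shuffle/sort step does.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Sum the counts for each word, as a reducer would.
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["big data big ideas", "data pipelines"]
counts = reduce_phase(shuffle_phase(map_phase(docs)))
print(counts["big"], counts["data"])  # 2 2
```

In a real Hadoop or Spark job the shuffle is handled by the framework across machines; the sketch only shows the shape of the computation.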
Posted 1 day ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate,

We are looking for a Big Data Developer to build and maintain scalable data processing systems. The ideal candidate will have experience handling large datasets and working with distributed computing frameworks.

Key Responsibilities:
- Design and develop data pipelines using Hadoop, Spark, or Flink.
- Optimize big data applications for performance and reliability.
- Integrate various structured and unstructured data sources.
- Work with data scientists and analysts to prepare datasets.
- Ensure data quality, security, and lineage across platforms.

Required Skills & Qualifications:
- Experience with the Hadoop ecosystem (HDFS, Hive, Pig) and Apache Spark.
- Proficiency in Java, Scala, or Python.
- Familiarity with data ingestion tools (Kafka, Sqoop, NiFi).
- Strong understanding of distributed computing principles.
- Knowledge of cloud-based big data services (e.g., EMR, Dataproc, HDInsight).

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies
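The responsibilities above (ingest varied sources, transform, enforce data quality) can be sketched framework-agnostically in plain Python. The stage names and the quality rule below are illustrative assumptions, not part of the posting; a Spark or Flink pipeline would express the same stages as distributed transformations:

```python
def ingest(records):
    # Accept rows with inconsistent field names and normalize them.
    for r in records:
        yield {"id": r.get("id") or r.get("ID"), "amount": r.get("amount")}

def transform(rows):
    # Cast amounts to float; drop rows with no usable id.
    for row in rows:
        if row["id"] is not None:
            row["amount"] = float(row["amount"] or 0)
            yield row

def quality_check(rows):
    # A simple data-quality gate: reject negative amounts.
    return [r for r in rows if r["amount"] >= 0]

raw = [{"ID": 1, "amount": "10.5"}, {"id": 2, "amount": "-3"}, {"amount": "7"}]
clean = quality_check(transform(ingest(raw)))
print(clean)  # [{'id': 1, 'amount': 10.5}]
```

Chaining generators keeps each stage independently testable, which is the same property a well-factored pipeline job aims for.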
Posted 1 week ago
6.0 - 11.0 years
15 - 19 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Project description

During the 2008 financial crisis, many big banks failed or struggled because of liquidity problems. A lack of liquidity can kill a financial institution overnight, which is why it is so critical to constantly monitor liquidity risk and properly maintain collateral. We are looking for talented developers to join our team in Pune, which is building a liquidity risk and collateral management platform for one of the biggest investment banks in the world. The platform is a set of front-end tools and back-end engines. It helps the bank increase efficiency and scalability, reduce operational risk, and eliminate the majority of manual interventions in processing margin calls.

Responsibilities

The candidate will develop new functionality for the Liquidity Risk platform, working closely with other teams around the globe.

Skills

Must have:
- Big Data experience (6+ years)
- Java/Python, J2EE, Spark, Hive
- SQL databases
- UNIX shell
- Strong experience in Apache Hadoop, Spark, Hive, Impala, YARN, Talend, Hue
- Big Data reporting, querying, and analysis

Nice to have:
- Spark calculators based on business logic/rules
- Basic performance tuning and troubleshooting knowledge
- Experience with all aspects of the SDLC
- Experience with complex deployment infrastructures
- Knowledge of software architecture, design, and testing
- Data flow automation (Apache NiFi, Airflow, etc.)
- Understanding of the difference between OOP and functional design approaches
- Understanding of event-driven architecture
- Spring, Maven, Git, uDeploy
Posted 3 weeks ago
4.0 - 6.0 years
6 - 8 Lacs
Bengaluru
Work from Office
Job Title: ESG Data / Sustainability Business Analyst
Corporate Title: Associate
Location: Bangalore, India

Role Description

The Sustainability Data and Technology Program is a bank-wide program to deliver a strategic solution for Environmental, Social and Governance (ESG) data across Deutsche Bank. The Program is part of the Sustainability Strategy Key Deliverable. As a Business Analyst, you will be part of the Data Team. You will be responsible for reviewing business use cases from stakeholders, gathering and documenting requirements, defining high-level implementation steps, and creating business user stories. You will work closely with the Product Owner and development teams, bringing business and functional analysis skills into the development team to ensure that the implementation of requirements aligns with our business needs and technical quality standards.

What we'll offer you
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
- Working with business and technology stakeholders to define, agree, and socialise requirements for ESG data sourcing and transformation, needed for the consumer base within the bank.
- Working with architects and engineers to ensure that both functional and non-functional requirements can be realised in the design and delivery in a way that respects the architecture strategy.
- Analysing complex datasets to derive insights that support requirement definition by completing data profiling of vendor data.
- Defining and documenting business requirements for review by senior stakeholders, in JIRA and other documentation tools such as Confluence and Draw.io.
- Defining acceptance criteria with stakeholders and supporting user acceptance testing to ensure quality product delivery, including defect management.
- Reviewing User Stories along with test cases based on appropriate interpretation of business requirements.
- Liaising with business teams and development teams in Agile ceremonies such as Product Backlog Refinement to review User Stories and prioritise the Product Backlog, supporting requirements on their path to release in the production environment.
- Acting as a point of contact for the development teams for any business requirement clarifications.
- Supporting the Functional Analysts within the development teams to produce analysis artifacts.
- Designing and specifying data mappings to transform source system data into a format that can be consumed by other business areas within the bank.
- Supporting the design and conceptualization of new business solution options and articulating identified impacts and risks.
- Monitoring and tracking issues, risks, and dependencies on analysis and requirements work.

Your skills and experience

Mandatory skills:
- 4+ years of business analyst experience in the banking industry across the full project life cycle, with broad domain knowledge and an understanding of core business processes, systems, and data flows
- Experience specifying ETL processes within data projects
- Experience of a large system implementation project across multiple business units and multiple geographies; it is essential to be aware of the sort of issues that may arise with a central implementation across different locations
- Strong knowledge of business analysis methods (e.g. best practices in Management and UAT)
- The maturity and persuasiveness required to engage in business dialogue and support stakeholders
- Excellent analysis and problem-solving skills
- Ability to communicate and interpret stakeholders' needs and requirements
- An understanding of systems delivery lifecycles and Agile delivery methodologies
- A good appreciation of systems and data architectures
- Strong discipline in data reconciliation, data integrity, controls, and documentation
- Understanding of controls around software development to manage business requirements
- Ability to work in virtual teams and matrixed organizations
- Good team player, facilitator, negotiator, and networker, able to lead senior managers towards common goals and build consensus across a diverse group
- Ability to share information and transfer knowledge and expertise to team members
- Ability to commit to and prioritise work duties and tasks
- Ability to work in a fast-paced environment with competing and ever-changing priorities, while maintaining a constant focus on delivery
- Willingness to chip in and cover multiple roles when required, such as covering for Project Managers, assisting architecture, performing testing, and writing up meeting minutes
- Expertise in Microsoft Office applications (Word, Excel, Visio, PowerPoint)
- Proficiency in querying large datasets (e.g. SQL, Hue, Impala, Python) to test/analyse content and perform data profiling

Desirable skills:
- In-depth understanding of ESG reporting
- Knowledge of ESG data vendors
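The posting asks for the ability to query large datasets for data profiling. A minimal sketch of that kind of profiling query, using Python's built-in sqlite3 in place of Hue/Impala (the table and column names are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE esg_scores (entity TEXT, score REAL)")
conn.executemany(
    "INSERT INTO esg_scores VALUES (?, ?)",
    [("AcmeCo", 71.5), ("AcmeCo", 68.0), ("Globex", None)],
)

# Basic profiling of vendor data: row count, null count, distinct entities.
row_count, null_scores, distinct_entities = conn.execute(
    """SELECT COUNT(*),
              SUM(CASE WHEN score IS NULL THEN 1 ELSE 0 END),
              COUNT(DISTINCT entity)
       FROM esg_scores"""
).fetchone()
print(row_count, null_scores, distinct_entities)  # 3 1 2
```

The same COUNT/SUM-CASE/COUNT DISTINCT pattern works unchanged in Impala or Hive SQL against much larger tables.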
Posted 1 month ago
5.0 - 9.0 years
15 - 19 Lacs
Bengaluru
Work from Office
Project description

During the 2008 financial crisis, many big banks failed or struggled because of liquidity problems. A lack of liquidity can kill a financial institution overnight, which is why it is so critical to constantly monitor liquidity risk and properly maintain collateral. We are looking for talented developers to join our team in Pune, which is building a liquidity risk and collateral management platform for one of the biggest investment banks in the world. The platform is a set of front-end tools and back-end engines. It helps the bank increase efficiency and scalability, reduce operational risk, and eliminate the majority of manual interventions in processing margin calls.

Responsibilities

The candidate will develop new functionality for the Liquidity Risk platform, working closely with other teams around the globe.

Skills

Must have:
- Big Data experience (6+ years)
- Java/Python, J2EE, Spark, Hive
- SQL databases
- UNIX shell
- Strong experience in Apache Hadoop, Spark, Hive, Impala, YARN, Talend, Hue
- Big Data reporting, querying, and analysis

Nice to have:
- Spark calculators based on business logic/rules
- Basic performance tuning and troubleshooting knowledge
- Experience with all aspects of the SDLC
- Experience with complex deployment infrastructures
- Knowledge of software architecture, design, and testing
- Data flow automation (Apache NiFi, Airflow, etc.)
- Understanding of the difference between OOP and functional design approaches
- Understanding of event-driven architecture
- Spring, Maven, Git, uDeploy

Other languages: English B2 (Upper Intermediate)

Seniority: Senior
Posted 1 month ago
8.0 - 11.0 years
45 - 50 Lacs
Noida, Kolkata, Chennai
Work from Office
Dear Candidate,

We are hiring a Julia Developer to build computational and scientific applications requiring speed and mathematical accuracy. Ideal for domains like finance, engineering, or AI research.

Key Responsibilities:
- Develop applications and models using the Julia programming language.
- Optimize for performance, parallelism, and numerical accuracy.
- Integrate with Python or C++ libraries where needed.
- Collaborate with data scientists and engineers on simulations and modeling.
- Maintain well-documented and reusable codebases.

Required Skills & Qualifications:
- Proficiency in Julia, with knowledge of multiple dispatch and the type system
- Experience in numerical computing or scientific research
- Familiarity with Plots.jl, Flux.jl, or DataFrames.jl
- Understanding of Python, R, or MATLAB is a plus

Soft Skills:
- Strong troubleshooting and problem-solving skills.
- Ability to work independently and in a team.
- Excellent communication and documentation skills.

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies
Posted 1 month ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate,

We are hiring a Data Engineer to build and maintain data pipelines for our analytics platform. Perfect for engineers focused on data processing and scalability.

Key Responsibilities:
- Design and implement ETL processes
- Manage data warehouses and ensure data quality
- Collaborate with data scientists to provide necessary data
- Optimize data workflows for performance

Required Skills & Qualifications:
- Proficiency in SQL and Python
- Experience with data pipeline tools like Apache Airflow
- Familiarity with big data technologies (Spark, Hadoop)
- Bonus: knowledge of cloud data services (AWS Redshift, Google BigQuery)

Soft Skills:
- Strong troubleshooting and problem-solving skills.
- Ability to work independently and in a team.
- Excellent communication and documentation skills.

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies
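Tools like Apache Airflow, mentioned above, model a pipeline as a DAG of tasks and run them in dependency order. A minimal, library-free sketch of the same idea using the standard-library `graphlib` module (the task names are hypothetical, and this is not Airflow's API):

```python
from graphlib import TopologicalSorter

# Hypothetical ETL tasks mapped to their upstream dependencies.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}

# A scheduler runs tasks in an order where every dependency comes first.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'quality_check', 'load']
```

Airflow adds scheduling, retries, and distributed execution on top of this core ordering idea.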
Posted 1 month ago
6.0 - 10.0 years
11 - 15 Lacs
Pune
Work from Office
We at Onix Datametica Solutions Private Limited are looking for a Big Data Lead with a passion for cloud and knowledge of on-premise and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, and Hortonworks. Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.

Job Description
- 6+ years of overall experience in developing, testing, and implementing Big Data projects using Hadoop, Spark, and Hive
- Hands-on experience playing a lead role in Big Data projects: responsible for implementing one or more tracks within projects, identifying and assigning tasks within the team, and providing technical guidance to team members
- Experience in setting up Hadoop services and implementing ETL/ELT pipelines, working with terabytes of data ingestion and processing from varied systems
- Experience working in an onshore/offshore model, leading technical discussions with customers, mentoring and guiding teams on technology, and preparing HLD and LLD documents

Required Skills and Abilities:
- Mandatory skills: Spark, Scala/PySpark, the Hadoop ecosystem including Hive, Sqoop, Impala, Oozie, Hue, Java, Python, SQL, Flume, bash (shell scripting)
- Experience implementing CI/CD pipelines and working experience with SCM tools such as Git, Bitbucket, etc.
- Hands-on experience writing data ingestion and data processing pipelines using Spark and SQL; experience implementing SCD type 1 & 2, auditing, and exception handling mechanisms
- Data warehousing project implementation with either a Scala or Hadoop programming background
- Proficient with various development methodologies like waterfall and agile/scrum
- Exceptional communication, organisation, and time management skills
- Collaborative approach to decision-making
- Strong analytical skills
- Good to have: certifications in any of GCP, AWS, Azure, or Cloudera
- Ability to work on multiple projects simultaneously, prioritising appropriately
Posted 2 months ago
3.0 - 5.0 years
12 - 13 Lacs
Thane, Navi Mumbai, Pune
Work from Office
We at Acxiom Technologies are hiring a PySpark Developer for our Mumbai location.

Relevant Experience: 1 to 4 years
Location: Mumbai
Mode of Work: Work from office
Notice Period: Up to 20 days

Job Description:
- Proven experience as a PySpark Developer.
- Hands-on expertise with AWS Redshift.
- Strong proficiency in PySpark, Spark, Python, and Hive.
- Solid experience with SQL.
- Excellent communication skills.

Benefits of working at Acxiom:
- Statutory benefits
- Paid leaves
- Phenomenal career growth
- Exposure to the banking domain

About Acxiom Technologies: Acxiom Technologies is a leading software solutions and services company that provides consulting services to global firms and has established itself as one of the most sought-after consulting organizations in the field of Data Management and Business Intelligence. Our website, https://www.acxtech.co.in/, gives a detailed overview of our company.

Interested candidates can share their resumes on 7977418669. Thank you.
Posted 2 months ago
6.0 - 10.0 years
10 - 16 Lacs
Mumbai
Work from Office
Responsibilities
- Design and implement Big Data solutions, complex ETL pipelines, and data modernization projects.

Required Past Experience:
- 6+ years of overall experience in developing, testing, and implementing Big Data projects using Hadoop, Spark, Hive, and Sqoop.
- Hands-on experience playing a lead role in Big Data projects: responsible for implementing one or more tracks within projects, identifying and assigning tasks within the team, and providing technical guidance to team members.
- Experience in setting up Hadoop services and implementing Extract-Transform-Load / Extract-Load-Transform (ETL/ELT) pipelines, working with terabytes/petabytes of data ingestion and processing from varied systems.
- Experience working in an onshore/offshore model, leading technical discussions with customers, mentoring and guiding teams on technology, and preparing High-Level Design and Low-Level Design (HLD & LLD) documents.

Required Skills and Abilities:
- Mandatory skills: Spark, Scala/PySpark, the Hadoop ecosystem including Hive, Sqoop, Impala, Oozie, Hue, Java, Python, SQL, Flume, bash (shell scripting)
- Secondary skills: Apache Kafka, Storm, distributed systems, a good understanding of networking and of security (platform & data) concepts, Kerberos, Kubernetes
- Understanding of data governance concepts and experience implementing metadata capture, lineage capture, and a business glossary
- Experience implementing continuous integration / continuous delivery (CI/CD) pipelines and working experience with source code management (SCM) tools such as Git, Bitbucket, etc.
- Ability to assign and manage tasks for team members, provide technical guidance, and work with architects on High-Level Design, Low-Level Design (HLD & LLD), and proofs of concept.
- Hands-on experience writing data ingestion and data processing pipelines using Spark and SQL; experience implementing slowly changing dimension (SCD) types 1 & 2, auditing, and exception handling mechanisms
- Data warehousing project implementation with either a Java- or Scala-based Hadoop programming background
- Proficient with various development methodologies like waterfall and agile/scrum
- Exceptional communication, organization, and time management skills
- Collaborative approach to decision-making and strong analytical skills
- Good to have: certifications in any of GCP, AWS, Azure, or Cloudera
- Ability to work on multiple projects simultaneously, prioritizing appropriately
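The slowly changing dimension handling named in this posting can be illustrated with a minimal Type 2 merge in plain Python: the current version of a changed key is expired and a new version appended with fresh effective dates. The field names (`key`, `city`, `current`, etc.) are illustrative assumptions; in practice this runs as a Spark/SQL MERGE against the dimension table:

```python
from datetime import date

def scd2_apply(dim_rows, change, today):
    """Close the current version of the changed key and append a new version."""
    for row in dim_rows:
        if row["key"] == change["key"] and row["current"]:
            row["current"] = False      # expire the old version...
            row["end_date"] = today     # ...as of the change date
    dim_rows.append({**change, "start_date": today,
                     "end_date": None, "current": True})
    return dim_rows

dim = [{"key": "C1", "city": "Pune", "start_date": date(2020, 1, 1),
        "end_date": None, "current": True}]
dim = scd2_apply(dim, {"key": "C1", "city": "Mumbai"}, date(2024, 6, 1))
print(len(dim), dim[0]["current"], dim[1]["city"])  # 2 False Mumbai
```

A Type 1 change, by contrast, would simply overwrite `city` in place, keeping no history.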
Posted 2 months ago