
8236 Hadoop Jobs - Page 9

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

You should have proficiency in Core Java and object-oriented design, along with knowledge and experience in developing data-centric, web-based applications using technologies such as JSF, JSP, Java, JavaScript, Node.js, AJAX, HTML, CSS, Graph DB Titan/Janus, Elasticsearch, and Tomcat/JBoss. Experience in building REST APIs and web services, along with working knowledge of Agile software development, is required. You should also have experience with automated testing using JUnit and code versioning tools such as SVN/Git. An understanding of design patterns and the ability to build easily configurable, deployable, and secure solutions are essential.

Your responsibilities will include planning product iterations, releasing iterations on schedule, writing reusable and efficient code, and implementing low-latency, high-availability, high-performance applications. You will also be responsible for implementing security and data protection, analyzing problems and recommending solutions, and participating in system design, development, testing, debugging, documentation, and support. You should be able to translate complex functional and technical requirements into detailed designs.

Desired skills for this role include 1-5 years of experience in Core Java, JSF, JSP, or Python, as well as experience in ETL and Big Data/Hadoop. Being highly tech-savvy with hands-on experience in building products from scratch is preferred, and familiarity with databases such as Oracle, PostgreSQL, Cassandra, HBase, and MongoDB is beneficial. You should be analytical, algorithmic, and logic-driven, with in-depth knowledge of technology and development processes. Experience in product development in an agile environment and familiarity with API development using Node.js are advantageous.

In terms of technical skills, you should be proficient in Core Java, JavaScript, Sigma.js, D3.js, Node.js, JSON, AJAX, CSS, HTML, Elasticsearch, Graph DB Titan/Janus, Cassandra, HBase, Apache Tomcat, JBoss, JUnit, and version control tools such as SVN/Git. The required educational qualification is a B.E/B.Tech/MCA/M.Sc./B.Sc degree, and the ideal candidate should have 3-5 years of relevant experience.

Posted 3 days ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

As an Assistant Manager - Analytics, you will be responsible for overseeing data-driven projects, designing intricate data solutions, offering valuable insights to stakeholders, and contributing to the advancement of our Ads product and business metrics. Your role will involve delving into deep insights on the Ads core product, conducting large-scale experimentation on Adtech innovation, and forecasting demand and supply to drive growth within our Ads product and a multifaceted Ads Entertainment business.

The Central Analytics team, situated within various business and product teams in a matrix structure, serves as a thought partner by providing comprehensive data insights to steer strategic decisions. This team acts as a strategic enabler for JioHotstar Ads business and product functions. By analyzing consumer experience, consumer supply, advertiser demand, and our ad-serving capabilities, we aim to achieve goals (KPIs) across the Ads product, advertiser objectives, and Entertainment business planning. We implement experiments, leverage GenAI for innovative problem-solving, and construct analytical frameworks to guide key decisions.

Reporting to the Manager - Product Analytics, your key responsibilities will include applying analytics knowledge to problem-solving; generating and delivering quality data insights through reports, dashboards, and structured documentation using tools like Power BI and Tableau; developing a profound understanding of the data platform and technology stack; utilizing statistical techniques for validation; effectively communicating complex data concepts to diverse audiences; partnering with stakeholders to identify opportunities and support strategic decisions; managing projects end-to-end; contributing data-driven insights to experiments; and fostering a culture of innovation, collaboration, and continuous improvement.

To excel in this role, you should demonstrate expertise in predictive analysis with proficiency in R, SQL, Python, and PySpark. Familiarity with big data platforms and tools like Hadoop, Spark, and Hive is preferred. Experience in dashboard building and data visualization using tools such as Tableau and Power BI is advantageous. You should possess advanced technical skills and the ability to collect, organize, and disseminate information accurately. A background in digital analytics and clickstream data, a passion for the entertainment industry, and an understanding of online video streaming platforms are desirable. Experience with Adtech and OTT platforms is also preferred. Ideally, you should hold a Bachelor's or Master's degree in Engineering, Mathematics, Operational Research, Statistics, Physics, or a related technical discipline, coupled with 4-6 years of experience in Business/Product Analytics, preferably from consumer technology companies.

JioStar is an equal opportunity employer that values diversity and aims to create an inclusive workplace free from discrimination.
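The large-scale experimentation and statistical validation mentioned above typically boil down to comparing metrics between experiment arms. As a rough, illustrative sketch only (a plain two-proportion z-test in pure Python; all counts are invented and nothing here reflects the team's actual tooling):

```python
# Two-proportion z-test, the kind of statistical validation an Ads
# experimentation workflow might apply to an A/B test on conversion rates.
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return the z statistic comparing conversion rates of arms A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both arms convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Invented example counts: arm B converts slightly more often than arm A
z = two_proportion_z(conv_a=450, n_a=10000, conv_b=520, n_b=10000)
print(round(z, 2))  # positive z suggests arm B outperforms arm A
```

A |z| above roughly 1.96 corresponds to significance at the conventional 5% level; production experimentation platforms add guardrails such as sample-size planning and multiple-testing corrections on top of this basic test.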

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Data Engineering Specialist at BT, you will play a crucial role in shaping the data landscape for the future. Your responsibilities will include building, designing, and maintaining data platforms to ensure efficiency, scalability, and reliability. You will be tasked with turning complex business needs into data and AI solutions that support the Networks data strategy. Your primary tasks will involve data engineering; managing and scaling data pipelines; ensuring data quality and integrity; designing observability solutions; collaborating with data scientists to integrate ML models; and working on data governance. Additionally, you will drive the adoption of data visualization and support data storytelling, stay updated on emerging technologies, and coach and mentor junior engineers.

To be successful in this role, you should have strong proficiency in data engineering concepts, tools, and technologies such as AWS, GCP, Hadoop, Spark, Scala, Python, SQL, Kafka, and cloud-native services. Experience working with large streaming data sets, data governance, data modeling, observability, DevOps, AI, and ML, together with domain expertise related to BT Networks products, will be essential. You are expected to have a proven ability to develop and support data solutions at scale, a drive to push forward engineering standards, experience with DevOps practices, an understanding of observability tools and cloud platforms, knowledge of data governance, and stakeholder management skills. While not mandatory, experience in bringing ML solutions to production and supporting in-life maintenance is preferred.

Joining BT means being part of a purpose-driven organization with a long history of using communication to make a better world. You will have the opportunity to work in a diverse and inclusive environment where personal, simple, and brilliant values are embraced. If you are passionate about making a real difference through digital transformation and are excited about this role, we encourage you to apply even if you do not meet every single requirement listed in the job description. Your unique background and experiences could make you the perfect candidate for this role or other opportunities within our team.

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The Applications Development Sr Programmer Analyst is responsible for establishing and implementing new or revised application systems and programs in coordination with the Technology team. Your main objective will be to lead applications systems analysis and programming activities. As a key member of the team, you will partner with multiple management teams to ensure appropriate integration of functions to meet goals, and you will identify and define necessary system enhancements to deploy new products and process improvements. Your role will involve resolving a variety of high-impact problems/projects through in-depth evaluation of complex business processes, system processes, and industry standards. You will provide expertise in the area and advanced knowledge of applications programming, ensuring that application design adheres to the overall architecture blueprint. Utilizing advanced knowledge of system flow, you will develop standards for coding, testing, debugging, and implementation.

You will play a vital role in developing and maintaining systems that power near real-time/batch, information retrieval (IR), and grounding (RAG) services. Your responsibilities will include designing and building highly scalable web crawling and data extraction systems capable of acquiring structured and unstructured data across Citi systems efficiently. You will continuously optimize the data extraction architecture to ensure peak performance, scalability, resilience, and cost-effectiveness. Developing robust systems to process vast volumes of data and extract meaningful insights through advanced data processing techniques will also be part of your role. You will design and implement high-throughput data pipelines and infrastructure to handle petabyte-scale datasets seamlessly, continuously exploring and implementing innovative solutions to complex technical challenges and pushing the boundaries of what's possible in data storage and processing.
You will drive the adoption of capabilities across regions by partnering with regional business and technology teams. Additionally, you will develop approaches to integrate data into the DataHub (EAP) and prototype methods of developing insights on that data with our business analytics teams and digital product teams.

Must-Have Skills:
- Hadoop
- Spark
- Oracle
- CI/CD
- Lightspeed
- OpenShift
- Java
- Microservices

Qualifications:
- 8+ years of relevant experience in applications development or systems analysis
- 5+ years of programming experience with at least one software programming language
- 5+ years of experience leading design or architecture (design patterns, reliability, and scaling) of new and existing systems
- 5+ years in technology/engineering roles within medium to large financial enterprises
- Subject matter expert (SME) in at least one area of applications development
- Consistently demonstrates clear and concise written and verbal communication
- Experience with Python, analytic methods, and frameworks is a plus
- Experience with real-time streaming and stream processing frameworks is a plus
- Prior experience in the financial industry is a must
- Proven ability to help attract, develop, engage, and sustain world-class engineering/technical talent, and to foster a culture of innovation and excellence

Education:
- Bachelor's degree/University degree or equivalent experience
- Master's degree preferred

If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.

Posted 3 days ago

Apply

200.0 years

0 Lacs

Haryana, India

On-site

Control Automation Data Science Sr Analyst (C11) - DS

About Citi: Citi's mission is to serve as a trusted partner to our clients by responsibly providing financial services that enable growth and economic progress. We have 200+ years of experience helping our clients meet the world's toughest challenges and embrace its greatest opportunities.

About AIM: Analytics and Information Management (AIM) is a global community that is driving data-driven transformation across Citi in multiple functions, with the objective of creating actionable intelligence for our business leaders. We are a fast-growing organization working with Citi businesses and functions across the world.

What do we do: We have one formula for managing risk across the firm: our Enterprise Risk Management Framework. This consistency enables us to take an end-to-end view of how we identify, measure, build, control, and report risks. We are also working to simplify, streamline, and automate our manual controls, strengthening our ability to prevent issues, not just to detect them after they happen. When issues do appear, we strive to understand the root cause; we can then apply those lessons learned horizontally across the organization to prevent similar issues from arising elsewhere.

Expertise required:

Data Analysis and Modelling: Use statistical modelling, machine learning, and data mining techniques to extract actionable insights from large and complex datasets. Develop models to forecast trends, identify patterns, and solve various business challenges. Conduct exploratory data analysis to uncover hidden relationships and opportunities for optimization.

Algorithm Development: Design, develop, and deploy advanced algorithms to solve complex business problems. Optimize algorithms for scalability, performance, and accuracy. Stay abreast of the latest advancements in machine learning and artificial intelligence, and continuously improve modelling techniques.

Collaboration and Communication: Collaborate with Model Risk Management, data engineers, business analysts, and product managers to define project requirements and deliver solutions. Communicate findings and recommendations to technical and non-technical stakeholders. Act as a subject matter expert on data science methodology and best practices.

Data Management and Governance: Work with data engineers to ensure the availability, quality, and integrity of datasets. Develop and implement data governance procedures to ensure compliance with regulatory requirements and industry standards.

Domain Skills: Good understanding of the banking domain (Wealth, Cards, Deposits, Loans & Insurance, etc.)

Functional Skills: Business risk, controls, compliance, and data management. Nice to have: knowledge of finance regulations and an understanding of the audit process.

Soft Skills: Good communication and interpersonal skills; mentoring junior members of the team; the ability to thrive in a dynamic and fast-paced environment; contributing to organizational initiatives in wide-ranging areas including competency development, training, and organizational building activities; a proactive approach to solving problems with an eye for detail; a strong team player.

Qualifications:
- Bachelor's degree in a quantitative field (computer science, engineering, mathematics, machine learning, statistics)
- 5-8 years of experience in a data science role
- Proficiency in programming languages like Python, R, or SAS
- Experience in Python and relevant libraries (NumPy, Pandas, scikit-learn, etc.)
- Coursework related to machine learning, deep learning (NLP/OCR), and programming
- At least an introductory understanding of LLMs and prompt engineering
- Hands-on application experience using common ML frameworks such as TensorFlow, PyTorch, OpenAI, and LangChain
- Experience with big data platforms (e.g., Hadoop, Spark) and SQL databases
- Demonstrated ability to write high-quality code and develop machine learning models
- Excellent verbal and written communication skills
- Problem-solving and storytelling skills to provide recommendations and generate actionable business insights

------------------------------------------------------
Job Family Group: Decision Management
------------------------------------------------------
Job Family: Specialized Analytics (Data Science/Computational Statistics)
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills: Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.

Posted 3 days ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience required: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving discussions, contribute to the overall project strategy, and continuously refine your skills to enhance application performance and user experience.

Roles & Responsibilities: The Offshore Data Engineer plays a critical role in designing, building, and maintaining scalable data pipelines and infrastructure to support business intelligence, analytics, and machine learning initiatives. Working closely with onshore data architects and analysts, this role ensures high data quality, performance, and reliability across distributed systems. The engineer is expected to demonstrate technical proficiency, proactive problem-solving, and strong collaboration in a remote environment.
- Design and develop robust ETL/ELT pipelines to ingest, transform, and load data from diverse sources.
- Collaborate with onshore teams to understand business requirements and translate them into scalable data solutions.
- Optimize data workflows through automation, parallel processing, and performance tuning.
- Maintain and enhance data infrastructure, including data lakes, data warehouses, and cloud platforms (AWS, Azure, GCP).
- Ensure data integrity and consistency through validation, monitoring, and exception handling.
- Contribute to data modeling efforts for both transactional and analytical use cases.
- Deliver clean, well-documented datasets for reporting, analytics, and machine learning.
- Proactively identify opportunities for cost optimization, governance, and process automation.

Professional & Technical Skills:
- Programming & Scripting: Proficiency in Databricks with SQL and Python for data manipulation and pipeline development.
- Big Data Technologies: Experience with Spark, Hadoop, or similar distributed processing frameworks.
- Workflow Orchestration: Hands-on experience with Airflow or equivalent scheduling tools.
- Cloud Platforms: Strong working knowledge of cloud-native services (AWS Glue, Azure Data Factory, GCP Dataflow).
- Data Modeling: Ability to design normalized and denormalized schemas for various use cases.
- ETL/ELT Development: Proven experience in building scalable and maintainable data pipelines.
- Monitoring & Validation: Familiarity with data quality frameworks and exception handling mechanisms.

Good-to-Have Skills:
- DevOps & CI/CD: Exposure to containerization (Docker), version control (Git), and deployment pipelines.
- Data Governance: Understanding of metadata management, lineage tracking, and compliance standards.
- Visualization Tools: Basic knowledge of BI tools like Power BI, Tableau, or Looker.
- Machine Learning Support: Experience preparing datasets for ML models and feature engineering.

Additional Information:
- The candidate should have a minimum of 3 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Chennai office.
- 15 years of full-time education is required.
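The ETL/ELT responsibilities above follow the classic extract-transform-load pattern. A minimal sketch of that pattern, assuming an in-memory "warehouse" and invented record fields (plain Python standing in for the Databricks/Spark stack the listing actually names):

```python
# Minimal extract-transform-load sketch: ingest raw records, validate and
# normalize them, and "load" the clean rows into an in-memory store. Real
# pipelines of the kind described above would run as Spark/Databricks jobs
# writing to a data lake or warehouse instead.

RAW_RECORDS = [  # stand-in for a diverse source (API, file, database)
    {"id": "1", "amount": "120.50", "region": "south"},
    {"id": "2", "amount": "not-a-number", "region": "north"},  # bad row
    {"id": "3", "amount": "75.00", "region": "south"},
]

def extract():
    """Yield raw records from the source."""
    yield from RAW_RECORDS

def transform(record):
    """Validate and normalize one record; return None for rejects."""
    try:
        amount = float(record["amount"])
    except (KeyError, ValueError):
        return None  # exception handling: bad rows go to a reject queue
    return {"id": int(record["id"]), "amount": amount,
            "region": record["region"].upper()}

def load(records, warehouse):
    """Append cleaned rows to the target table."""
    warehouse.setdefault("sales", []).extend(records)

def run_pipeline(warehouse):
    """Run extract → transform → load; return (loaded, rejected) counts."""
    cleaned, rejected = [], []
    for raw in extract():
        row = transform(raw)
        (cleaned if row is not None else rejected).append(row or raw)
    load(cleaned, warehouse)
    return len(cleaned), len(rejected)

warehouse = {}
print(run_pipeline(warehouse))  # (2, 1): two rows loaded, one rejected
```

The separation into extract/transform/load functions is what orchestration tools like Airflow then schedule and monitor as distinct pipeline tasks.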

Posted 3 days ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Our Purpose

Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary: Consultant, Advisors & Consulting Services, Performance Analytics

Services within Mastercard is responsible for acquiring, engaging, and retaining customers by managing fraud and risk, enhancing cybersecurity, and improving the digital payments experience. We provide value-added services and leverage expertise, data-driven insights, and execution. Our Advisors & Consulting Services team combines traditional management consulting with Mastercard’s rich data assets, proprietary platforms, and technologies to provide clients with powerful strategic insights and recommendations. Our teams work with a diverse global customer base across industries, from banking and payments to retail and restaurants.

The Advisors & Consulting Services group has five specializations: Strategy & Transformation, Performance Analytics, Business Experimentation, Marketing, and Program Management. Our Performance Analytics consultants translate data into insights by leveraging Mastercard and customer data to design, implement, and scale analytical solutions for customers. They use qualitative and quantitative analytical techniques and enterprise applications to synthesize analyses into clear recommendations and impactful narratives. Positions for different specializations and levels are available in separate job postings.
Please review our consulting specializations to learn more about all opportunities and apply for the position that is best suited to your background and experience: https://careers.mastercard.com/us/en/consulting-specializations-at-mastercard

Roles and Responsibilities

Client Impact:
- Provide creative input on projects across a range of industries and problem statements
- Contribute to the development of analytics strategies and programs for regional and global clients by leveraging data and technology solutions to unlock client value
- Collaborate with the Mastercard team to understand clients’ needs, agenda, and risks
- Develop working relationships with client analysts/managers, and act as a trusted and reliable partner

Team Collaboration & Culture:
- Collaborate with senior project delivery consultants to identify key findings, prepare effective presentations, and deliver recommendations to clients
- Independently identify trends, patterns, issues, and anomalies in a defined area of analysis, and structure and synthesize your own analysis to highlight relevant findings
- Lead internal and client meetings, and contribute to project management
- Contribute to the firm's intellectual capital
- Receive mentorship from Performance Analytics leaders for professional growth and development

Qualifications

Basic qualifications:
- Undergraduate degree with data and analytics experience in business intelligence and/or descriptive, predictive, or prescriptive analytics
- Experience managing clients or internal stakeholders
- Ability to analyze large datasets and synthesize key findings
- Proficiency using data analytics software (e.g., Python, R, SQL, SAS)
- Advanced Word, Excel, and PowerPoint skills
- Ability to perform multiple tasks with multiple clients in a fast-paced, deadline-driven environment
- Ability to communicate effectively in English and the local office language (if applicable)
- Eligibility to work in the country where you are applying, as well as to apply for travel visas as required by travel needs

Preferred qualifications:
- Additional data and analytics experience in building, managing, and maintaining database structures, working with data visualization tools (e.g., Tableau, Power BI), or working with the Hadoop framework and coding using Impala, Hive, or PySpark
- Ability to analyze large datasets and synthesize key findings to provide recommendations via descriptive analytics and business intelligence
- Experience managing tasks or workstreams in a collaborative team environment
- Ability to identify problems, brainstorm and analyze answers, and implement the best solutions
- Relevant industry expertise

Corporate Security Responsibility

All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard’s security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.

Posted 3 days ago

Apply

2.0 years

0 Lacs

India

On-site

At H1, we believe access to the best healthcare information is a basic human right. Our mission is to provide a platform that can optimally inform every doctor interaction globally. This promotes health equity and builds needed trust in healthcare systems. To accomplish this, our teams harness the power of data and AI technology to unlock groundbreaking medical insights, and convert those insights into actions that result in optimal patient outcomes and accelerate an equitable and inclusive drug development lifecycle. Visit h1.co to learn more about us.

As a Software Engineer on the Search Engineering team, you will support and develop the company's search infrastructure. This involves working with terabytes of data, and the indexing, ranking, and retrieval of medical data that support search in the backend infrastructure.

What You'll Do at H1

The Search Engineering team is responsible for developing and maintaining the company's core search infrastructure. Our objective is to enable fast, accurate, and scalable search across terabytes of medical data. This involves building systems for efficient data ingestion, indexing, ranking, and retrieval that power key product features and user experiences. As a Software Engineer on the Search Engineering team, your day typically includes:
- Working with our search infrastructure: writing and maintaining code that ingests large-scale data into Elasticsearch.
- Designing and implementing high-performance APIs that serve search use cases with low latency.
- Building and maintaining end-to-end features using Node.js and GraphQL, ensuring scalability and maintainability.
- Collaborating with cross-functional teams, including product managers and data engineers, to align on technical direction and deliver impactful features to our users.
- Taking ownership of the search codebase: proactively debugging, troubleshooting, and resolving issues quickly to ensure stability and performance.
- Consistently producing simple, elegant designs and writing high-quality, maintainable code that can be easily understood and reused by teammates.
- Demonstrating a strong focus on performance optimization, ensuring systems are fast, efficient, and scalable.
- Communicating effectively and collaborating across teams in a fast-paced, dynamic environment.
- Staying up to date with the latest advancements in AI and search technologies, identifying opportunities to integrate cutting-edge capabilities into our platforms.

About You

You bring strong hands-on technical skills and experience in building robust backend APIs. You thrive on solving complex challenges with innovative, scalable solutions and take pride in maintaining high code quality through thorough testing. You are able to align your work with broader organizational goals and actively contribute to strategic initiatives. You proactively identify risks and propose solutions early in the project lifecycle to avoid downstream issues. You are curious, eager to learn, and excited to grow in a collaborative, high-performing engineering team environment.

Requirements
- 1-2 years of professional experience
- Strong programming skills in TypeScript, Node.js, and Python (mandatory)
- Practical experience with Docker and Kubernetes
- Good to have: Big Data technologies (e.g., Scala, Hadoop, PySpark), Golang, GraphQL, Elasticsearch, and LLMs

Not meeting all the requirements but still feel like you'd be a great fit? Tell us how you can contribute to our team in a cover letter!

H1 Offers
- Full suite of health insurance options, in addition to generous paid time off
- Pre-planned company-wide wellness holidays
- Retirement options
- Health & charitable donation stipends
- Impactful Business Resource Groups
- Flexible work hours & the opportunity to work from anywhere
- The opportunity to work with leading biotech and life sciences companies in an innovative industry with a mission to improve healthcare around the globe
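The indexing and retrieval work described in this listing rests on the classic inverted-index idea. The team's actual stack is Elasticsearch behind Node.js/GraphQL APIs; the pure-Python sketch below, with invented document data, only illustrates the underlying concept:

```python
# Toy inverted index: maps each term to the set of document IDs containing it.
# This mapping is the core structure behind search engines like Elasticsearch,
# which add analyzers, relevance ranking, and distributed storage on top.
from collections import defaultdict

DOCS = {  # invented stand-ins for indexed medical documents
    1: "cardiology clinical trial outcomes",
    2: "oncology clinical research data",
    3: "cardiology imaging research",
}

def build_index(docs):
    """Ingest documents: tokenize and record which docs contain each term."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index, query):
    """AND-query: return IDs of documents containing every query term."""
    terms = query.lower().split()
    if not terms:
        return set()
    result = index.get(terms[0], set()).copy()
    for term in terms[1:]:
        result &= index.get(term, set())  # intersect posting sets
    return result

index = build_index(DOCS)
print(sorted(search(index, "cardiology research")))  # [3]
```

Because lookups touch only the posting sets for the query terms rather than scanning every document, this structure is what makes low-latency search over terabytes of data feasible.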

Posted 3 days ago

Apply

175.0 years

0 Lacs

Gurugram, Haryana, India

On-site

hackajob is collaborating with American Express to connect them with exceptional tech professionals for this role. At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express. Join Team Amex and let's lead the way together. From building next-generation apps and microservices in Kotlin to using AI to help protect our franchise and customers from fraud, you could be doing entrepreneurial work that brings our iconic, global brand into the future. As a part of our tech team, we could work together to bring ground-breaking and diverse ideas to life that power our digital systems, services, products and platforms. If you love to work with APIs, contribute to open source, or use the latest technologies, we’ll support you with an open environment and learning culture. Function Description American Express is looking for energetic, successful and highly skilled Engineers to help shape our technology and product roadmap. Our Software Engineers not only understand how technology works, but how that technology intersects with the people who count on it every day. Today, innovative ideas, insight and new points of view are at the core of how we create a more powerful, personal and fulfilling experience for our customers and colleagues, with batch/real-time analytical solutions using ground-breaking technologies to deliver innovative solutions across multiple business units. 
This Engineering role is based in our Global Risk and Compliance Technology organization and will have a keen focus on platform modernization, bringing to life the latest technology stacks to support the ongoing needs of the business as well as compliance against global regulatory requirements. Qualifications Support the Compliance and Operations Risk data delivery team in India to lead and assist in the design and actual development of applications. Responsible for specific functional areas within the team, this involves project management and taking business specifications. The individual should be able to independently run projects/tasks delegated to them. Technology Skills Bachelor degree in Engineering or Computer Science or equivalent 2 to 5 years experience is required GCP professional certification - Data Engineer Expert in Google BigQuery tool for data warehousing needs. Experience on Big Data (Spark Core and Hive) preferred Familiar with GCP offerings, experience building data pipelines on GCP a plus Hadoop Architecture, having knowledge on Hadoop, Map Reduce, Hbase. UNIX shell scripting experience is good to have Creative problem solving (Innovative) Benefits We back you with benefits that support your holistic well-being so you can be and deliver your best. 
This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally: Competitive base salaries Bonus incentives Support for financial-well-being and retirement Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location) Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need Generous paid parental leave policies (depending on your location) Free access to global on-site wellness centers staffed with nurses and doctors (depending on location) Free and confidential counseling support through our Healthy Minds program Career development and training opportunities American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
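The Hadoop/MapReduce knowledge named in the qualifications above can be illustrated with a minimal, framework-free sketch of the MapReduce programming model in Python. This is an illustration of the model only, not Hadoop itself; the input lines and function names are hypothetical:

```python
from collections import defaultdict

def map_phase(records):
    # Map: emit (word, 1) pairs from each input line
    for line in records:
        for word in line.split():
            yield word.lower(), 1

def shuffle(pairs):
    # Shuffle: group values by key, as the framework would between phases
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts for each word
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["big data on big clusters", "data pipelines on GCP"]
counts = reduce_phase(shuffle(map_phase(lines)))
```

On a real cluster the shuffle is distributed across nodes, but the map/group/reduce contract is the same.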

Posted 3 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Our Purpose Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title And Summary Director, BizOps Overview BizOps team is looking for a dynamic and inspiring People Manager to lead and grow a high-performing team of engineers focused on Site Reliability Engineering (SRE), DevOps, and platform resilience. This role is ideal for a passionate leader who thrives on challenging the status quo, motivating teams, and driving operational excellence at scale. Experience in working across development, operations, and product teams to prioritize needs and to build relationships is a must. Role The role of business operations is to be the production readiness steward for the platform. This is accomplished by closely partnering with developers to design, build, implement, and support technology services. A business operations engineer will ensure operational criteria like system availability, capacity, performance, monitoring, self-healing, and deployment automation are implemented throughout the delivery process. Business Operations plays a key role in leading the production support and DevOps transformation at Mastercard through our tooling and by being an advocate for change and standards throughout the development, quality, release, and product organizations. We accomplish this transformation through supporting daily operations with a hyper focus on triage and then root cause by understanding the business impact of our products. 
The goal of every biz ops team is to shift left to be more proactive and upfront in the development process, and to proactively manage production and change activities to maximize customer experience, and increase the overall value of supported applications. Biz Ops teams also focus on risk management by tying all our activities together with an overarching responsibility for compliance and risk mitigation across all our environments. A biz ops focus is also on streamlining and standardizing traditional application specific support activities and centralizing points of interaction for both internal and external partners by communicating effectively with all key stakeholders. Ultimately, the role of biz ops is to align Product and Customer Focused priorities with Operational needs. We regularly review our run state not only from an internal perspective, but also understanding and providing the feedback loop to our development partners on how we can improve the customer experience of our applications. All About You As a People Manager, you will be responsible for guiding and enabling a talented technical team who are accountable for the following areas: Technical Leadership Lead and support the full lifecycle of service development—from concept and design to deployment, operations, and continuous improvement. Drive system reliability and operational excellence by analyzing ITSM activities and identifying feedback loops for development teams. Ensure systems are well-designed for reliability, including participation in capacity planning and launch readiness reviews. Maintain production health by monitoring availability, latency, and performance, with a focus on sustainable scalability through automation. Champion CI/CD best practices, owning and evolving the pipeline used to promote application code across environments. Lead incident management processes with a focus on sustainable, blameless postmortems and holistic problem-solving. 
Promote a culture of continuous learning, proactive monitoring, and operational readiness. DevOps & Automation For team members working within the CI/CD and automation stream: Design and enhance deployment automation using Chef, including writing cookbooks and recipes. Build and maintain robust Jenkins pipelines, integrated with tools such as SonarQube, Maven, Artifactory, and more. Manage code deployments across multiple environments with a strong emphasis on automation and reliability. Design Git-based code management strategies that support parallel environment releases, including branch, version, and promotion automation. People Management Inspire, mentor, and develop engineers at all levels, fostering a high-trust, high-performance culture. Provide technical and career guidance to ensure both individual growth and team success. Collaborate with global stakeholders across time zones and tech hubs to align on delivery priorities and platform improvements. Advocate for best practices in DevOps, observability, and SRE, aligned with Mastercard's strategic goals. Support knowledge sharing, onboarding, and capability uplift across the team. Qualifications BS degree in Computer Science or a related technical field involving coding (e.g., physics or mathematics), or equivalent practical experience. Experience with algorithms, data structures, scripting, pipeline management, and software design. Systematic problem-solving approach, coupled with strong communication skills and a sense of ownership and drive. Ability to help debug and optimize code and automate routine tasks. We support many different stakeholders, so experience in dealing with difficult situations and making decisions with a sense of urgency is needed. Experience in one or more of the following is preferred: Oracle DB or Hadoop; .Net or Java; Python, Go, Perl, or Ruby. Basic understanding of a cloud platform (AWS/Azure/PCF). Interest in designing, analyzing and troubleshooting large-scale distributed systems.
We need team members with an appetite for change and for pushing the boundaries of what can be done with automation. Experience in working across development, operations, and product teams to prioritize needs and to build relationships is a must. For work on our DevOps team, we need an engineer with experience in industry-standard CI/CD tools like Git/Bitbucket, Jenkins, Maven, Artifactory, and Chef. Experience designing and implementing an effective and efficient CI/CD flow that gets code from dev to prod with high quality and minimal manual effort is required. Corporate Security Responsibility All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: Abide by Mastercard's security policies and practices; Ensure the confidentiality and integrity of the information being accessed; Report any suspected information security violation or breach; and Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.
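The availability and latency monitoring this SRE role owns can be sketched in a few lines of Python — computing an uptime percentage and a p95 latency from health-check samples. The samples and fields here are synthetic stand-ins; a real setup would pull these numbers from a monitoring stack rather than an in-memory list:

```python
import statistics

def availability(samples):
    # Fraction of health checks that succeeded, as a percentage
    ok = sum(1 for s in samples if s["up"])
    return 100.0 * ok / len(samples)

def p95_latency(samples):
    # 95th-percentile latency (ms) across successful checks,
    # using the nearest-rank method on the sorted values
    latencies = sorted(s["latency_ms"] for s in samples if s["up"])
    index = max(0, int(round(0.95 * len(latencies))) - 1)
    return latencies[index]

# Synthetic health-check samples for a hypothetical service:
# 19 successes with rising latency, plus one failed check
samples = [{"up": True, "latency_ms": 40 + i} for i in range(19)]
samples.append({"up": False, "latency_ms": 0})

uptime = availability(samples)
p95 = p95_latency(samples)
```

Percentiles rather than averages are the usual SRE choice because tail latency, not the mean, is what users notice.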

Posted 3 days ago

Apply

1.0 - 4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description Some careers have more impact than others. If you’re looking for a career where you can make a real impression, join HSBC and discover how valued you’ll be. HSBC is one of the largest banking and financial services organizations in the world, with operations in 62 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Decision Science Junior Analyst. Principal Responsibilities Support the business by providing vital input for strategic planning by senior management, enabling effective decision making and helping address unforeseen challenges. The team leverages the best of data and analytics capabilities to enable smarter decisions and drive profitable growth. The team supports various domains ranging from Regulatory, Operations, Procurement, and Human Resources to Financial Crime Risk. It provides support to various business groups, and the job involves data analysis, model and strategy development and implementation, business intelligence, reporting, and data management. The team addresses a range of business problems covering business growth, improving customer experience, limiting risk exposure, capital quantification, enhancing internal business processes, etc. Proactively identify key emerging compliance risks across all RC categories and interface appropriately with other RC teams and senior management. Provide a greater understanding of the potential impact and associated consequences/failings of significant new or emerging risks, and provide innovative and effective solutions based on SME knowledge that assist the business/function. Propose, manage, and track the resolution of subsequent risk management actions.
Lead cross-functional projects using advanced data modelling and analysis techniques to discover insights that will guide strategic decisions and uncover optimization opportunities. Against this period of considerable regulatory change and development, and as regulators develop their own understanding of compliance risk management, the role holder must maintain a strong knowledge and understanding of regulatory developments and the evolution of the compliance risk framework, risk appetite, and risk assessment methodology. Deliver repeatable and scalable analytics through the semi-automation of L1 Financial Crime Risk and Regulatory Compliance Risk Assurance controls testing. Here, Compliance Assurance will develop and run analytics on data sets that contain personal information such as customer and employee data. Requirements Bachelor’s degree from a reputed university in statistics, economics, or another quantitative field. Freshers with an educational background relevant to Data Science, or certification in Data Science courses 1-4 years of experience in the field of automation and analytics Worked on a proof of concept or case study solving complex business problems using data Strong analytical skills with business analysis experience or equivalent Basic knowledge and understanding of financial services/banking operations is good to have Delivery focused, demonstrating an ability to work under pressure and within tight deadlines Basic knowledge of working in Python and other data science tools, and in visualization tools such as QlikSense Experience in SQL/ETL tools is an added advantage.
Understanding of big data tools (Teradata, Hadoop, etc.) and experience adopting cloud technologies like GCP/AWS/Azure is good to have Experience in data science and other machine learning algorithms (e.g., regression, classification) is an added advantage Basic knowledge of data engineering skills - building data pipelines using modern tools/libraries (Spark or similar). You’ll achieve more at HSBC HSBC is an equal opportunity employer committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working, and opportunities to grow within an inclusive and diverse environment. We encourage applications from all suitably qualified persons irrespective of, but not limited to, their gender or genetic information, sexual orientation, ethnicity, religion, social status, medical care leave requirements, political affiliation, people with disabilities, color, national origin, veteran status, etc. We consider all applications based on merit and suitability to the role. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by HSBC Electronic Data Processing (India) Private Ltd
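The regression example named in the requirements above can be made concrete with a minimal ordinary-least-squares fit in pure Python. This is illustrative only — in practice this team would reach for a library such as scikit-learn — and the data is synthetic:

```python
def fit_line(xs, ys):
    # Ordinary least squares for y = slope * x + intercept
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Perfectly linear synthetic data: y = 2x + 1
xs = [1, 2, 3, 4]
ys = [3, 5, 7, 9]
slope, intercept = fit_line(xs, ys)
```

Classification follows the same shape — fit parameters that minimize a loss — just with a different loss function and a thresholded output.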

Posted 3 days ago

Apply

6.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Roles and responsibilities: Design and implement data pipelines for supply chain data (e.g., inventory, shipping, procurement). Develop and maintain data warehouses and data lakes. Ensure data quality, integrity, and security. Collaborate with supply chain stakeholders to identify analytics requirements. Develop data models and algorithms for predictive analytics (e.g., demand forecasting, supply chain optimization). Implement data visualization tools (e.g., Tableau, Power BI). Integrate data from various sources (e.g., ERP, PLM, other data sources). Develop APIs for data exchange. Work with cross-functional teams (e.g., supply chain, logistics, IT). Communicate technical concepts to non-technical stakeholders. Experience with machine learning algorithms and concepts. Knowledge of data governance and compliance. Strong problem-solving and analytical skills. Excellent communication and collaboration skills. Ability to work in a fast-paced environment. Technical Skills: Bachelor's degree in Computer Science, Information Technology, or a related field. 6-8 years of experience in data engineering. Proficiency in: Programming languages - Python, Java, SQL, Spark SQL. Data technologies - Hadoop, PySpark, NoSQL databases. Data visualization tools - QlikSense, Tableau, Power BI. Cloud platforms - Azure Data Factory, Azure Databricks, AWS.
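The demand forecasting mentioned in the responsibilities above can be sketched as a simple moving-average baseline in Python. This is a deliberately naive illustration with hypothetical demand figures; production forecasting would use richer models and real inventory history:

```python
def moving_average_forecast(history, window=3):
    # Forecast next-period demand as the mean of the last `window` periods
    if len(history) < window:
        raise ValueError("not enough history for the chosen window")
    return sum(history[-window:]) / window

# Hypothetical monthly unit demand for one SKU
demand = [120, 135, 128, 140, 150, 146]
forecast = moving_average_forecast(demand, window=3)  # mean of 140, 150, 146
```

Even a baseline this simple is useful in practice: more sophisticated models (exponential smoothing, gradient boosting) are usually judged by how much they beat it.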

Posted 3 days ago

Apply

5.0 years

0 Lacs

India

On-site

🌟 Distinguished Tech Innovator: 3Pillar warmly extends an invitation for you to join an elite team of visionaries. Beyond software development, we are dedicated to engineering solutions that challenge conventional norms. Envision yourself steering projects that redefine urban living, establish new media channels for enterprise companies, or drive innovation in healthcare. Your invaluable expertise will serve as the cornerstone in shaping the future direction of our endeavors. This role transcends the ordinary realms of coding; it's about orchestrating technological marvels that disrupt industries. Seize this extraordinary opportunity to lead a team that is actively shaping the tech landscape for our clients and setting global standards along the way. 🌍🔥 Minimum Qualifications 5+ years of experience in IT Experience: 4+ years of experience in data engineering, with strong experience in Snowflake. Programming & Scripting: Strong programming skills in Python and Linux Bash for automation and data workflows. Framework Proficiency: Hands-on experience with Luigi for orchestrating complex data workflows. Data Processing & Storage: Expertise in Hadoop ecosystem tools and managing SQL databases for data storage and query optimization. AWS Cloud Services: In-depth knowledge of AWS EC2, S3, RDS, and EMR to deploy and manage data solutions. Monitoring & Alerting Tools: Familiarity with monitoring solutions for real-time tracking and troubleshooting of data pipelines. Communication & Leadership: Proven ability to lead projects, communicate with stakeholders, and guide junior team members. Additional Experience Data Architecture: Experience designing or optimizing data lake solutions. Security Practices: Understanding of data security practices, data governance, and compliance for secure data processing. Automation & CI/CD: Familiarity with CI/CD tools to support automation of deployment and testing.
Big Data Technologies: Knowledge of big data processing tools like Spark, Hive, or related AWS services. Advanced Analytics: Background in analytics or data science to contribute to more data-driven decision-making. Cross-Functional Collaboration: Experience collaborating with non-technical teams on business goals and technical solutions.
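The Luigi-style workflow orchestration named in the qualifications above boils down to running tasks in dependency order, each exactly once. A framework-free sketch in Python (Luigi itself models this with `Task` classes and a `requires()` method; the task names and actions here are hypothetical stand-ins):

```python
def run_pipeline(tasks, target):
    # Resolve dependencies depth-first, running each task exactly once
    done, order = set(), []

    def run(name):
        if name in done:
            return
        for dep in tasks[name]["requires"]:
            run(dep)
        tasks[name]["action"]()
        done.add(name)
        order.append(name)

    run(target)
    return order

log = []
tasks = {
    "extract":   {"requires": [],                    "action": lambda: log.append("extract")},
    "transform": {"requires": ["extract"],           "action": lambda: log.append("transform")},
    "load":      {"requires": ["transform"],         "action": lambda: log.append("load")},
    "report":    {"requires": ["load", "transform"], "action": lambda: log.append("report")},
}
order = run_pipeline(tasks, "report")
```

Note that `transform` runs only once even though two downstream tasks require it — the same idempotency guarantee Luigi provides via completed-target checks.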

Posted 3 days ago

Apply

10.0 years

0 Lacs

Vijayawada, Andhra Pradesh, India

On-site

Role Overview: We are seeking a highly motivated and experienced US IT Recruitment Manager to join our team. This role involves managing the entire recruitment process for our US-based IT positions, ensuring that we attract, screen, and onboard the best talent. The ideal candidate will have a strong background in technical recruiting, excellent communication skills, and the ability to work effectively in a fast-paced environment. Key Responsibilities: Full Life Cycle Recruitment: Manage the end-to-end recruitment process, including sourcing, pre-screening, interviewing, shortlisting, and onboarding candidates. Obtain required information and documents, arrange client interviews, and handle pre- and post-interview follow-ups. Prepare and deliver offers to selected candidates. Team Management: Oversee the daily operations of the recruitment team. Distribute workload evenly among recruiters and ensure high levels of motivation and performance. Candidate Screening: Screen candidates for availability, interest level, eligibility to work in the US, salary expectations, relocation needs, technical skills, and core competencies. Utilize various job portals like Monster, Career Builder, Dice, Techfetch, LinkedIn, and internal ATS systems. Client and Candidate Relations: Develop and maintain strong working relationships with internal team members, staffing vendors, professional organizations, and reputable independent consultants. Build rapport with candidates throughout the interview process to ensure a positive experience. Coordinate with Account Managers to assess candidates' competencies and extend job offers. Recruitment Expertise: Recruit for both full-time and contract positions for direct and indirect clients. Negotiate salaries and hourly rates for various terms (W2, 1099, Corp 2 Corp). Verify work authorization documents (H1B, GC-EAD, TN, etc.). 
Technical Recruitment: Work on requisitions for positions such as Performance Engineer, DevOps Engineer, Technical Project Manager, Java, .Net, Front End, Back End, UX, Full Stack, ETL, Oracle, Business Intelligence, Hadoop, Informatica, Network, Cyber Security, etc. Innovation and Compliance: Regularly update the internal database with new resumes. Adhere to established processes and procedures for a smooth workflow. Bring innovative recruiting techniques to the table. Educational and Professional Experience: Proven work experience as a Technical Recruiter (10+ years). Hands-on experience with various interview formats. Technical expertise with the ability to understand and explain job requirements for IT roles. Familiarity with Applicant Tracking Systems and resume databases. Solid knowledge of sourcing techniques (e.g., social media recruiting and Boolean search). Excellent verbal and written communication skills. Solid understanding of HR practices and labor legislation. MBA in Human Resources Management with a technical background. Required Skills: Minimum 10-15 years of experience in US IT Recruitment. Excellent sourcing skills, headhunting, networking, and social media recruitment. Expertise in recruiting Citizens, Green Card holders, EADs, and visa consultants. Well-versed in US terms and employment laws. Exceptional team player with excellent communication, negotiation, problem-solving, and client interaction skills. Willingness to work night shifts to align with US time zones.

Posted 3 days ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description: Overview (Bank overview, GBS India overview, Function Overview)* Bank of America is one of the world’s leading financial institutions, serving individual consumers, small and middle-market businesses and large corporations with a full range of banking, investing, asset management and other financial and risk management products and services. We are committed to attracting and retaining top talent across the globe to ensure our continued success. Along with taking care of our customers, we want to be the best place for people to work and aim at creating a work environment where all employees have the opportunity to achieve their goals. We are a part of the Global Business Services which delivers technology and operations capabilities to all Bank of America lines of business (LOB) and enterprise functions. Our employees help our customers and clients at every stage of their financial lives, helping them connect to what matters most. This purpose defines and unites us. Every day, we are focused on delivering value, convenience, expertise and innovation for individuals, businesses and institutional investors we serve worldwide. BA Continuum is a nonbank subsidiary of Bank of America, part of Global Business Services in the bank. Process Overview* Global Markets Technology & Operations provides end-to-end technology solutions for Global Markets businesses including Equities, Prime Brokerage, Interest Rates, Currencies, Commodities, Derivatives and Structured Products. Across all these products, solutions include architecture, design, development, change management, implementation and support using various enterprise technologies. In addition, GMT&O provides Sales, Electronic Trading, Trade Work Flow, Pricing, and Market Risk, Middle office, Collateral Management, Credit Risk, Post-trade confirmation, Settlement and Client service processes for Trading, Capital Markets, and Wealth Management businesses. 
Job Description The Post Trade Processing space is a dynamic, cross-functional organization with business analysts, developers, systems engineers, testing professionals, and business/technical support professionals. We are seeking a senior developer to deliver some of the bank-wide initiatives, regulatory reporting, and platform stability initiatives in the post trade processing space. These are large, distributed enterprise applications, and experience with comparable systems is a must. The role involves driving end-to-end business deliveries by working closely with global counterparts. Responsibilities Design and programming experience with Python or Java; strong in OOP and design patterns Messaging with JMS, TIBCO, or another framework Experience developing high-transaction/volume processing applications Capital markets business knowledge, including derivatives Experience in building support tools for production support and providing assistance as needed Ability to work in a fast-paced development environment and quickly adapt to change Experience/knowledge of Agile development methodology and Test Driven Development Experience or knowledge of object-oriented databases A proactive approach to problem solving and the ability to think innovatively Primary Skills Required - Python, React JS, data analysis, object-oriented programming Desired Skills - Python, Hadoop, Impala, shell scripting, any RDBMS or object-oriented database Preferred Qualifications: MCA/B.E./B.Tech/M.E./M.Tech Interpersonal Skills: Superior verbal and written communication skills are a must. Should be proactive, have a sense of ownership, and have the ability to work independently. Experience: 7-10 years of relevant experience.
Work Timings: 11 AM to 8 PM Job Location: Chennai/Mumbai Role Type: Individual Contributor Job Description: This job is responsible for developing and delivering complex requirements to accomplish business goals. Key responsibilities include ensuring that software is developed to meet functional, non-functional, and compliance requirements; coding solutions; unit testing; and ensuring the solution can be integrated successfully into the overall application/system with clear, robust, and well-tested interfaces. Job expectations include an awareness of development and testing practices in the industry. Responsibilities: Code solutions and unit tests to deliver a requirement/story per the defined acceptance criteria and compliance requirements Utilize multiple architectural components (across data, application, and business) in the design and development of client requirements Perform Continuous Integration and Continuous Delivery (CI-CD) activities Contribute to story refinement and definition of requirements Participate in estimating the work necessary to realize a story/requirement through the delivery lifecycle Contribute to existing test suites (integration, regression, performance), analyze test reports, identify test issues/errors, and triage the underlying cause Perform spikes/proofs of concept as necessary to mitigate risk or implement new ideas Skills: Application Development Automation Collaboration DevOps Practices Solution Design Agile Practices Architecture Result Orientation Solution Delivery Process User Experience Design Analytical Thinking Data Management Risk Management Technical Strategy Development Test Engineering
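The JMS/TIBCO messaging experience asked for above follows a producer/consumer pattern that can be sketched in pure Python with `queue.Queue`. This is an in-process stand-in only — real JMS brokers add durability, topics, and transactions — and the trade messages are hypothetical:

```python
import queue
import threading

def producer(q, trades):
    # Publish each trade message, then a sentinel to signal end-of-stream
    for trade in trades:
        q.put(trade)
    q.put(None)

def consumer(q, processed):
    # Consume until the sentinel arrives, acknowledging each message
    while True:
        message = q.get()
        if message is None:
            break
        processed.append(message.upper())  # stand-in for real trade processing
        q.task_done()

q = queue.Queue()
processed = []
trades = ["buy aapl", "sell msft"]
t1 = threading.Thread(target=producer, args=(q, trades))
t2 = threading.Thread(target=consumer, args=(q, processed))
t1.start(); t2.start()
t1.join(); t2.join()
```

The decoupling shown here — producers and consumers sharing only a queue — is what lets high-volume post-trade systems absorb bursts without dropping messages.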

Posted 3 days ago

Apply

2.0 - 4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Summary As a Data Analyst, you will be responsible for designing, developing, and maintaining efficient and scalable data pipelines for data ingestion, transformation, and storage. About The Role Location - Hyderabad (hybrid) Key Responsibilities: Design, develop, and maintain efficient and scalable data pipelines for data ingestion, transformation, and storage. Collaborate with cross-functional teams, including data analysts, business analysts, and BI, to understand data requirements and design appropriate solutions. Build and maintain data infrastructure in the cloud, ensuring high availability, scalability, and security. Write clean, efficient, and reusable code in scripting languages such as Python or Scala to automate data workflows and ETL processes. Implement real-time and batch data processing solutions using streaming technologies like Apache Kafka, Apache Flink, or Apache Spark. Perform data quality checks and ensure data integrity across different data sources and systems. Optimize data pipelines for performance and efficiency, identifying and resolving bottlenecks and performance issues. Collaborate with DevOps teams to deploy, automate, and maintain data platforms and tools. Stay up to date with industry trends, best practices, and emerging technologies in data engineering, scripting, streaming data, and cloud technologies. Essential Requirements: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field, with overall experience of 2-4 years. Proven experience as a Data Engineer or in a similar role, with a focus on scripting, streaming data pipelines, and cloud technologies like AWS, GCP, or Azure. Strong programming and scripting skills in languages like Python, Scala, or SQL.
Experience with cloud-based data technologies, such as AWS, Azure, or Google Cloud Platform. Hands-on experience with streaming technologies, such as AWS Streamsets, Apache Kafka, Apache Flink, or Apache Spark Streaming. Strong experience with Snowflake (Required) Proficiency in working with big data frameworks and tools, such as Hadoop, Hive, or HBase. Knowledge of SQL and experience with relational and NoSQL databases. Familiarity with data modelling and schema design principles. Strong problem-solving skills and the ability to work in a fast-paced, collaborative environment. Excellent communication and teamwork skills. Commitment To Diversity And Inclusion: Novartis is committed to building an outstanding, inclusive work environment and diverse teams' representative of the patients and communities we serve. Accessibility And Accommodation: Novartis is committed to working with and providing reasonable accommodation to individuals with disabilities. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the recruitment process, or in order to perform the essential functions of a position, please send an e-mail to diversityandincl.india@novartis.com and let us know the nature of your request and your contact information. Please include the job requisition number in your message Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients’ lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture Join our Novartis Network: Not the right Novartis role for you? 
Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network Benefits and Rewards: Read our handbook to learn about all the ways we’ll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards
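The data quality checks described in the responsibilities above can be sketched as row-level validation in Python. The schema and rules here are hypothetical; a production pipeline would typically express them in a framework such as Great Expectations or in Spark jobs:

```python
def validate_rows(rows, required=("id", "amount")):
    # Split rows into valid and rejected, recording why each row failed
    valid, rejected = [], []
    for row in rows:
        missing = [field for field in required if row.get(field) is None]
        if missing:
            rejected.append((row, f"missing fields: {missing}"))
        elif row["amount"] < 0:
            rejected.append((row, "negative amount"))
        else:
            valid.append(row)
    return valid, rejected

rows = [
    {"id": 1, "amount": 100.0},
    {"id": 2, "amount": -5.0},     # fails the non-negative rule
    {"id": None, "amount": 10.0},  # fails the required-field rule
]
valid, rejected = validate_rows(rows)
```

Keeping rejected rows with their failure reasons, rather than silently dropping them, is what makes data integrity auditable downstream.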

Posted 3 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Experience: 5-8 years Responsible for building agentic workflows using modern LLM orchestration frameworks to automate and optimize complex business processes in the Travel domain. Individual contributor (IC) role, owning end-to-end development of intelligent agents and services that power customer experiences, recommendations, and backend automation. Design and implement agentic and autonomous workflows using frameworks such as LangGraph, LangChain, and CrewAI. Translate business problems in the Travel domain into intelligent LLM-powered workflows. Own at least two AI use case implementations from design to production deployment. Build and expose RESTful and GraphQL APIs to support internal and external consumers. Develop and maintain robust Python-based microservices using FastAPI or Django. Collaborate with product managers, data engineers, and backend teams to design seamless AI-driven user experiences. Deploy and maintain workflows and APIs on AWS with best practices in scalability and security. Nice to have Experience with Big Data technologies (Hadoop, Teradata, Snowflake, Spark, Redshift, Kafka, etc.) for data processing. Experience with data management processes on AWS is a huge plus. AWS certification Hands-on experience building applications with LangGraph, LangChain, and CrewAI. Experience working with AWS services - Lambda, API Gateway, S3, ECS, DynamoDB A proven track record of implementing at least two AI/LLM-based use cases in production. Strong problem-solving skills with the ability to deconstruct complex problems into actionable AI workflows Experience building scalable, production-grade APIs using FastAPI or Django Strong command of Python and software engineering best practices Solid understanding of multithreading, IO operations, and scalability patterns in backend systems
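The agentic workflow idea above — chaining LLM-powered steps while threading shared state between them — can be sketched without any framework. LangGraph, LangChain, and CrewAI are the tools this role names; the node functions below are hypothetical stand-ins for real LLM calls:

```python
def parse_request(state):
    # Stand-in for an LLM node that extracts intent from the user query
    state["intent"] = "book_flight" if "flight" in state["query"] else "unknown"
    return state

def plan_itinerary(state):
    # Stand-in for an LLM node that drafts a plan from the parsed intent
    state["plan"] = f"plan for {state['intent']}"
    return state

def run_workflow(query, nodes):
    # Execute nodes in order, threading a shared state dict through them
    state = {"query": query}
    for node in nodes:
        state = node(state)
    return state

result = run_workflow("find me a flight to Pune", [parse_request, plan_itinerary])
```

Frameworks like LangGraph add conditional edges, retries, and persistence on top of this basic shape, but the core contract — nodes transforming a shared state — is the same.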

Posted 3 days ago

Apply

10.0 years

0 Lacs

Borivali, Maharashtra, India

On-site

Job Description: Overview (Bank overview, GBS India overview, Function Overview)* Bank of America is one of the world’s leading financial institutions, serving individual consumers, small and middle-market businesses and large corporations with a full range of banking, investing, asset management and other financial and risk management products and services. We are committed to attracting and retaining top talent across the globe to ensure our continued success. Along with taking care of our customers, we want to be the best place for people to work and aim at creating a work environment where all employees have the opportunity to achieve their goals. We are a part of the Global Business Services which delivers technology and operations capabilities to all Bank of America lines of business (LOB) and enterprise functions. Our employees help our customers and clients at every stage of their financial lives, helping them connect to what matters most. This purpose defines and unites us. Every day, we are focused on delivering value, convenience, expertise and innovation for individuals, businesses and institutional investors we serve worldwide. BA Continuum is a nonbank subsidiary of Bank of America, part of Global Business Services in the bank. Process Overview* Global Markets Technology & Operations provides end-to-end technology solutions for Global Markets businesses including Equities, Prime Brokerage, Interest Rates, Currencies, Commodities, Derivatives and Structured Products. Across all these products, solutions include architecture, design, development, change management, implementation and support using various enterprise technologies. In addition, GMT&O provides Sales, Electronic Trading, Trade Work Flow, Pricing, and Market Risk, Middle office, Collateral Management, Credit Risk, Post-trade confirmation, Settlement and Client service processes for Trading, Capital Markets, and Wealth Management businesses. 
Job Description The Post Trade Processing space is a dynamic, cross-functional organization with business analysts, developers, systems engineers, testing professionals and business/technical support professionals. We are seeking a senior developer to deliver bank-wide initiatives, regulatory reporting and platform stability initiatives in the post trade processing space. These are large, distributed enterprise applications, and experience with comparable systems is a must. The role is expected to drive end-to-end business deliveries by working closely with global counterparts. Responsibilities Design and programming experience with Python or Java. Strong in OOP and design patterns Messaging with JMS, TIBCO or other frameworks. Experience developing high-transaction/volume processing applications. Capital markets business knowledge, including derivatives. Experience in building support tools for production support and providing assistance as needed. Ability to work in a fast development environment and quickly adapt to changes. Experience/knowledge in Agile development methodology Test Driven Development Experience or knowledge of Object Oriented Databases A proactive approach to problem solving and the ability to think innovatively Primary Skills Required - Python, React JS, Data analysis, Object oriented programming Desired Skills - Python, Hadoop, Impala, Shell scripting, Any RDBMS or Object Oriented Database Preferred Qualifications: MCA/B.E./B.Tech/M.E./M.Tech Interpersonal Skills: Superior verbal and written communication skills are a must. Should be proactive, have a sense of ownership and the ability to work independently. Experience: 7-10 years of relevant experience. 
Work Timings: 11 AM to 8 PM Job Location: Chennai/Mumbai Job Description: This job is responsible for developing and delivering complex requirements to accomplish business goals. Key responsibilities of the job include ensuring that software is developed to meet functional, non-functional and compliance requirements, coding solutions, unit testing, and ensuring the solution can be integrated successfully into the overall application/system with clear, robust, and well-tested interfaces. Job expectations include an awareness of development and testing practices in the industry. Responsibilities: Codes solutions and unit tests to deliver a requirement/story per the defined acceptance criteria and compliance requirements Utilizes multiple architectural components (across data, application, business) in the design and development of client requirements Performs Continuous Integration and Continuous Delivery (CI/CD) activities Contributes to story refinement and definition of requirements Participates in estimating work necessary to realize a story/requirement through the delivery lifecycle Contributes to existing test suites (integration, regression, performance), analyzes test reports, identifies any test issues/errors, and triages the underlying cause Performs spikes/proofs of concept as necessary to mitigate risk or implement new ideas Skills: Application Development Automation Collaboration DevOps Practices Solution Design Agile Practices Architecture Result Orientation Solution Delivery Process User Experience Design Analytical Thinking Data Management Risk Management Technical Strategy Development Test Engineering

Posted 3 days ago

Apply

0 years

0 Lacs

India

On-site

Job Summary: We are looking for a skilled Senior Data Engineer with strong expertise in Spark and Scala on the AWS platform. The ideal candidate should possess excellent problem-solving skills and hands-on experience in Spark-based data processing within a cloud-based ecosystem. This role offers the opportunity to independently execute diverse and complex engineering tasks, demonstrate a solid understanding of the end-to-end software development lifecycle, and collaborate effectively with stakeholders to deliver high-quality technical solutions. Key Responsibilities: Develop, analyze, debug, and enhance Spark-Scala programs. Work on Spark batch processing jobs, with the ability to analyze/debug using Spark UI and logs. Optimize performance of Spark applications and ensure scalability and reliability. Manage data processing tasks using AWS S3, AWS EMR clusters, and other AWS services. Leverage Hadoop ecosystem tools including HDFS, HBase, Hive, and MapReduce. Write efficient and optimized SQL queries; experience with PostgreSQL and Couchbase or similar databases is preferred. Utilize orchestration tools such as Kafka, NiFi, and Oozie. Work with monitoring tools like Dynatrace and CloudWatch. Contribute to the creation of High-Level Design (HLD) and Low-Level Design (LLD) documents and participate in reviews with architects. Support development and lower environments setup, including local IDE configuration. Follow defined coding standards, best practices, and quality processes. Collaborate using Agile methodologies for development, review, and delivery. Use supplementary programming languages like Python as needed. Required Skills (Mandatory): Apache Spark, Scala, Big Data, Hadoop Ecosystem, Spark SQL. Additional Preferred Skills: Spring Core Framework; Core Java, Hibernate, Multithreading; AWS EMR, S3, CloudWatch; HDFS, HBase, Hive, MapReduce; PostgreSQL, Couchbase; Kafka, NiFi, Oozie; Dynatrace or other monitoring tools; Python (as a supplementary language); Agile Methodology.

Posted 3 days ago

Apply

5.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title Data Scientist - Deputy Manager Job Description Job title: Data Scientist - Deputy Manager Your role: Implements solutions to problems using data analysis, data mining, optimization tools, machine learning techniques and statistics Builds data-science and technology-based algorithmic solutions to address business needs Designs large-scale models using Regression, the Linear Models family, and time-series models Drives the collection of new data and the refinement of existing data sources Analyzes and interprets the results of analytics experiments Applies a global approach to analytical solutions, both within a business area and across the enterprise Ability to use data for exploratory, descriptive, inferential, prescriptive, and advanced analytics Ability to share dashboards, reports, and analytical insights from data Experience with visualization on large datasets is preferred and an added advantage Technical Knowledge and Skills required: Experience solving analytical problems using quantitative approaches Passion for empirical research and for answering hard questions with data Ability to manipulate and analyze complex, high-volume, high-dimensionality data from varying sources Ability to apply a flexible analytic approach that allows for results at varying levels of precision Ability to communicate complex quantitative analysis in a clear, precise, and actionable manner Expert knowledge of an analysis tool such as PySpark or Python Experience working with large data sets; experience working with distributed computing tools is a plus (Map/Reduce, Hadoop, Hive, etc.) 
Familiarity with relational databases and SQL You're the right fit if: 5-8 years of experience with engineering or an equivalent background Experience with solving analytical problems using quantitative approaches Ability to manipulate and analyze complex, high-volume, high-dimensionality data from varying sources Ability to apply a flexible analytic approach that allows for results at varying levels of precision Ability to communicate complex quantitative analysis in a clear, precise, and actionable manner Expert knowledge of an analysis tool such as R or Python Experience working with large data sets; experience working with distributed computing tools is a plus (Map/Reduce, Hadoop, Hive, etc.) Familiarity with relational databases and SQL How We Work Together We believe that we are better together than apart. For our office-based teams, this means working in person at least 3 days per week. Onsite roles require full-time presence in the company’s facilities. Field roles are most effectively done outside of the company’s main facilities, generally at customers’ or suppliers’ locations. About Philips We are a health technology company. We built our entire company around the belief that every human matters, and we won't stop until everybody everywhere has access to the quality healthcare that we all deserve. Do the work of your life to help the lives of others. Learn more about our business. Discover our rich and exciting history. Learn more about our purpose. If you’re interested in this role and have many, but not all, of the experiences needed, we encourage you to apply. You may still be the right candidate for this or other opportunities at Philips. Learn more about our culture of impact with care here.

Posted 3 days ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Summary Data Ops Requirement: 7+ years of experience working in a software product development organization building modern and scalable big data applications. Expertise with functional programming and Python. Programming languages: Python (Django, HTML, CSS, JavaScript). AI and ML services: machine learning models. Data modeling & engineering (knowledge of PSQL, Hadoop). Big data analysis (big data fundamentals with PySpark).

Posted 3 days ago

Apply

3.0 - 4.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field. 3-4 years of hands-on experience in data engineering, with a strong focus on AWS cloud services. Proficiency in Python for data manipulation, scripting, and automation. Strong command of SQL for data querying, transformation, and database management. Demonstrable Experience With AWS Data Services, Including Amazon S3: Data Lake storage and management. AWS Glue: ETL service for data preparation. Amazon Redshift: Cloud data warehousing. AWS Lambda: Serverless computing for data processing. Amazon EMR: Managed Hadoop framework for big data processing (Spark/PySpark experience highly preferred). AWS Kinesis (or Kafka): Real-time data streaming. Strong analytical, problem-solving, and debugging skills. Excellent communication and collaboration abilities, with the capacity to work effectively in an agile team environment. Responsibilities Troubleshoot and resolve data-related issues and performance bottlenecks in existing pipelines. Develop and maintain data quality checks, monitoring, and alerting mechanisms to ensure data pipeline reliability. Participate in code reviews, contribute to architectural discussions, and promote best practices in data engineering.

Posted 3 days ago

Apply

2.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what’s next. Let’s define tomorrow, together. Description United's Digital Technology team designs, develops, and maintains massively scaling technology solutions brought to life with innovative architectures, data analytics, and digital solutions. Find your future at United! We’re reinventing what our industry looks like, and what an airline can be – from the planes we fly to the people who fly them. When you join us, you’re joining a global team of 100,000+ connected by a shared passion with a wide spectrum of experience and skills to lead the way forward. Achieving our ambitions starts with supporting yours. Evolve your career and find your next opportunity. Get the care you need with industry-leading health plans and best-in-class programs to support your emotional, physical, and financial wellness. Expand your horizons with travel across the world’s biggest route network. Connect outside your team through employee-led Business Resource Groups. Create what’s next with us. Let’s define tomorrow together. Job Overview And Responsibilities Data Engineering organization is responsible for driving data driven insights & innovation to support the data needs for commercial and operational projects with a digital focus. 
The Data Engineer will be responsible for partnering with various teams to define and execute data acquisition, transformation and processing, and to make data actionable for operational and analytics initiatives that create sustainable revenue and share growth Design, develop, and implement streaming and near-real-time data pipelines that feed systems that are the operational backbone of our business Execute unit tests and validate expected results to ensure accuracy & integrity of data and applications through analysis, coding, writing clear documentation and problem resolution This role will also drive the adoption of data processing and analysis within the Hadoop environment and help cross-train other members of the team Leverage strategic and analytical skills to understand and solve customer- and business-centric questions Coordinate and guide cross-functional projects that involve team members across all areas of the enterprise, vendors, external agencies and partners Leverage data from a variety of sources to develop data marts and insights that provide a comprehensive understanding of the business Develop and implement innovative solutions leading to automation Use Agile methodologies to manage projects Mentor and train junior engineers This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. 
Qualifications What’s needed to succeed (Minimum Qualifications): BS/BA in computer science or a related STEM field 2+ years of IT experience in software development 2+ years of development experience using Java, Python, or Scala 2+ years of experience with Big Data technologies like PySpark, Hadoop, Hive, HBase, Kafka, NiFi 2+ years of experience with relational database systems like MS SQL Server, Oracle, Teradata Creative, driven, detail-oriented individuals who enjoy tackling tough problems with data and insights Individuals who have a natural curiosity and desire to solve problems are encouraged to apply Must be legally authorized to work in India for any employer without sponsorship Must be fluent in English and Hindi (written and spoken) Successful completion of an interview is required to meet job qualifications Reliable, punctual attendance is an essential function of the position What will help you propel from the pack (Preferred Qualifications): Master’s in computer science or a related STEM field Experience with cloud-based systems like AWS, Azure or Google Cloud Certified Developer / Architect on AWS Strong experience with continuous integration & delivery using Agile methodologies Data engineering experience in the transportation/airline industry Strong problem-solving skills Strong knowledge of Big Data

Posted 3 days ago

Apply

8.0 years

0 Lacs

Kochi, Kerala, India

On-site

🚀 We’re Hiring: Big Data Engineer (4–8 Years Experience) 📍 Location: Kochi | 🏢 Mode: On-site | 💼 Employment Type: Full-time Are you passionate about building scalable big data solutions? Do you thrive in a high-performance environment where innovation meets impact? We’re looking for an experienced Big Data Engineer to join our team and help drive our data-driven transformation. You'll design and implement robust data pipelines, optimize distributed systems, and contribute to cutting-edge analytics and ML use cases. 🔧 Key Responsibilities Design and develop scalable big data processing pipelines. Implement data ingestion, transformation, and validation. Collaborate across teams to deliver data solutions for analytics & ML. Optimize systems for performance, reliability, and scalability. Monitor and troubleshoot performance bottlenecks. Document workflows, specs, and technical decisions. 🎓 Required Qualifications Bachelor’s in Computer Science, Engineering, or a related field (Master’s preferred). 3+ years of experience in Big Data Engineering. Strong in Python, Java, or Scala. Hands-on with Apache Spark, Hadoop, Kafka, or Flink. Solid knowledge of SQL and relational databases (MySQL, PostgreSQL). Experience with ETL, data modeling, and data warehousing. Exposure to distributed computing and cloud platforms (AWS/GCP/Azure). Familiar with Docker, Kubernetes, and DevOps practices. ⚙️ Tools & Technologies IDEs: IntelliJ, Eclipse Build: Maven, Gradle Testing: JUnit, TestNG, Mockito Monitoring: Prometheus, Grafana, ELK APIs: Swagger, OpenAPI Messaging: Kafka Databases: MySQL, PostgreSQL, MongoDB, Redis ORM: Hibernate, Spring Data 📩 Ready to build the future with us? Apply now or tag someone who fits the role! If interested, share your updated resume with vishnu@narrowlabs.in #BigData #DataEngineer #ApacheSpark #Kafka #Hadoop #ETL #HiringNow #KochiJobs #OnsiteOpportunity #DataEngineeringJobs #TechJobsIndia #WeAreHiring #infopark #infoparkKochi #BigDataEngineer #Kochi

Posted 3 days ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Description Alimentation Couche-Tard Inc. (ACT) is a global Fortune 200 company. A leader in the convenience store and fuel space with over 16,700 stores in 31 countries, serving more than 9 million customers each day. The India Data & Analytics Global Capability Centre is an integral part of ACT’s Global Data & Analytics Team, and the Senior Data Scientist will be a key player on this team that will help grow analytics globally at ACT. The hired candidate will partner with multiple departments, including Global Marketing, Merchandising, Global Technology, and Business Units. About The Role The incumbent will be responsible for delivering advanced analytics projects that drive business results, including interpreting business needs, selecting the appropriate methodology, data cleaning, exploratory data analysis, model building, and creation of polished deliverables. Responsibilities Analytics & Strategy Analyse large-scale structured and unstructured data; develop deep-dive analyses and machine learning models in retail, marketing, merchandising, and other areas of the business Utilize data mining, statistical and machine learning techniques to derive business value from store, product, operations, financial, and customer transactional data Apply multiple algorithms or architectures and recommend the best model with in-depth description to evangelize data-driven business decisions Utilize cloud setup to extract processed data for statistical modelling and big data analysis, and visualization tools to represent large sets of time series/cross-sectional data Operational Excellence Follow industry standards in coding solutions and follow the programming life cycle to ensure standard practices across the project Structure hypotheses, build thoughtful analyses, develop underlying data models and bring clarity to previously undefined problems Partner with Data Engineering to build, design and maintain core data infrastructure, pipelines and data workflows to automate dashboards and analyses Stakeholder Engagement Working collaboratively across multiple sets of stakeholders – Business functions, Data Engineers, Data Visualization experts – to deliver on project deliverables Articulate complex data science models to business teams and present the insights in easily understandable and innovative formats Job Requirements Education Bachelor’s degree required, preferably with a quantitative focus (Statistics, Business Analytics, Data Science, Math, Economics, etc.) Master’s degree preferred (MBA/MS Computer Science/M.Tech Computer Science, etc.) Relevant Experience 5–7 years of relevant working experience in a data science/advanced analytics role Behavioural Skills Delivery Excellence Business disposition Social intelligence Innovation and agility Knowledge Functional Analytics (Supply chain analytics, Marketing Analytics, Customer Analytics) Statistical modelling using analytical tools (R, Python, KNIME, etc.) and big data technologies Knowledge of statistics and experimental design (A/B testing, hypothesis testing, causal inference) Practical experience building scalable ML models, feature engineering, model evaluation metrics, and statistical inference Practical experience deploying models using MLOps tools and practices (e.g., MLflow, DVC, Docker, etc.) Strong coding proficiency in Python (Pandas, Scikit-learn, PyTorch/TensorFlow, etc.) Big data technologies & frameworks (AWS, Azure, GCP, Hadoop, Spark, etc.) Enterprise reporting systems, relational (MySQL, Microsoft SQL Server, etc.) and non-relational (MongoDB, DynamoDB) database management systems, and Data Engineering tools Business intelligence & reporting (Power BI, Tableau, Alteryx, etc.) Microsoft Office applications (MS Excel, etc.)

Posted 3 days ago

Apply