4.0 - 8.0 years
0 Lacs
karnataka
On-site
As an Assistant Manager - Analytics, you will play a crucial role in driving data-driven projects, designing complex data solutions, and providing valuable insights to stakeholders to contribute to the growth of our Ads product and business metrics. Your responsibilities will involve gaining deep insights into the Ads core product, leading large-scale experimentation on Adtech innovation, and forecasting demand and supply to drive growth in our Ads product and the complex Ads Entertainment business.

You will be part of the Central Analytics team, which is embedded within various business and product teams in a matrix structure to provide comprehensive data insights that drive strategic decisions. This team acts as a strategic enabler for JioHotstar's Ads business and product functions by analyzing consumer experience, consumer supply, advertiser demand, and ad-serving capabilities to achieve goals and KPIs across the Ads product, advertiser objectives, and Entertainment business planning. The team focuses on leveraging experiments, applying GenAI for innovative problem-solving, and building analytical frameworks to guide key decisions and keep teams informed and focused.

Reporting to the Manager - Product Analytics, your key responsibilities will include applying analytics knowledge and skills to problem-solving; generating quality data insights through reports, dashboards, and structured documentation; developing a deep understanding of the data platform and technology stack; using statistical techniques to validate findings; communicating complex data concepts effectively; partnering with stakeholders to identify opportunities; managing projects end-to-end; and contributing data-driven insights to experiments to foster a culture of innovation and collaboration.
To excel in this role, you should demonstrate expertise in predictive analysis with proficiency in R, SQL, Python, and PySpark; familiarity with big data platforms and tools such as Hadoop, Spark, and Hive; experience in dashboard building and data visualization using tools like Tableau and Power BI; advanced technical skills in collecting and disseminating information accurately; knowledge of digital analytics and clickstream data; strong communication skills for presenting insights clearly; a passion for the entertainment industry; and experience with Adtech and OTT platforms. The ideal candidate will have a Bachelor's or Master's degree in Engineering, Mathematics, Operational Research, Statistics, Physics, or a related technical discipline, along with 4-6 years of experience in Business/Product Analytics, preferably at consumer technology companies.

Join us at JioStar, a global media and entertainment company that is revolutionizing the entertainment consumption experience for millions of viewers worldwide. We are committed to diversity and to creating an inclusive workplace where everyone can thrive and contribute their unique perspectives.
Posted 2 weeks ago
15.0 - 23.0 years
0 Lacs
chennai, tamil nadu
On-site
This is an opportunity for a Test Data Management Practice Manager in Chennai with 15-23 years of experience. Your primary responsibility will be managing Business Intelligence, test data management, and data-centric testing services. You will focus on solutioning and packaging the service offering for account managers and Sales in the geography, and you should have experience with BI testing and TDM initiatives.

As Practice Manager for Test Data Management, you will manage test data management and data-centric testing projects. You should have substantial experience managing BI testing, ETL testing, big data testing, and Hadoop testing projects. Experience in pre-sales, providing solutioning, responding to RFPs and proactive proposals, and managing bid-related knowledge, case studies, and branding will be crucial for this role.
Posted 2 weeks ago
18.0 - 22.0 years
0 Lacs
noida, uttar pradesh
On-site
This is a senior leadership position within the Business Information Management (BIM) Practice, where you will be responsible for the overall vision, strategy, delivery, and operations of key accounts in BIM. You will work closely with the global executive team, subject matter experts, solution architects, project managers, and client teams to conceptualize, build, and operate Big Data solutions. Your role will involve communicating with internal management, client sponsors, and senior leaders on project status, risks, solutions, and more.

In the client delivery leadership role, you will be accountable for delivering at least $10M+ in revenue using information management solutions such as Big Data, data warehouse, data lake, GenAI, master data management systems, business intelligence and reporting solutions, IT architecture consulting, cloud platforms (AWS/Azure), and SaaS/PaaS-based solutions. You will also play a crucial practice and team leadership role, exhibiting qualities such as self-driven initiative, customer focus, problem-solving skills, learning agility, the ability to handle multiple projects, excellent communication, and the leadership skills to coach and mentor staff.

As a qualified candidate, you should hold an MBA in Business Management and a Bachelor of Computer Science. You should have 18+ years of prior experience, preferably including at least 5 years in the Pharma Commercial domain, delivering customer-focused information management solutions. Your skills should encompass successful end-to-end DW implementations using Big Data, data management, and BI technologies. Leadership qualities, team management experience, communication skills, and hands-on knowledge of databases, SQL, and reporting solutions are essential. Preferred skills include teamwork, leadership, motivation to learn and grow, ownership, cultural fit, talent management, and capability building/thought leadership.

As part of Axtria, a global provider of cloud software and data analytics to the Life Sciences industry, you will contribute to transforming the product commercialization journey to drive sales growth and improve healthcare outcomes for patients. Axtria values technology innovation and offers a transparent and collaborative culture with opportunities for training, career progression, and meaningful work in a fun environment. If you are a driven and experienced professional with a passion for leadership in information management technology and the Pharma domain, this role offers a unique opportunity to make a significant impact and grow within a dynamic and innovative organization.
Posted 2 weeks ago
0 years
0 Lacs
India
On-site
Job Summary: We are seeking a talented and driven Machine Learning Engineer to design, build, and deploy ML models that solve complex business problems and enhance decision-making capabilities. You will work closely with data scientists, engineers, and product teams to develop scalable machine learning pipelines, deploy models into production, and continuously improve their performance.

Key Responsibilities: Design, develop, and deploy machine learning models for classification, regression, clustering, recommendation, NLP, or computer vision tasks. Collaborate with data scientists to prepare and preprocess large-scale datasets for training and evaluation. Implement and optimize machine learning pipelines and workflows using tools like MLflow, Airflow, or Kubeflow. Integrate models into production environments and ensure model performance, monitoring, and retraining. Conduct A/B testing and performance evaluations to validate model accuracy and business impact. Stay up to date with the latest advancements in ML/AI research and tools. Write clean, efficient, and well-documented code for reproducibility and scalability.

Requirements: Bachelor's or Master's degree in Computer Science, Engineering, Mathematics, or a related field. Strong knowledge of machine learning algorithms, data structures, and statistical methods. Proficiency in Python and ML libraries/frameworks (e.g., scikit-learn, TensorFlow, PyTorch, XGBoost). Experience with data manipulation libraries (e.g., pandas, NumPy) and visualization tools (e.g., Matplotlib, Seaborn). Familiarity with cloud platforms (AWS, GCP, or Azure) and model deployment tools. Experience with version control systems (Git) and software engineering best practices.

Preferred Qualifications: Experience in deep learning, natural language processing (NLP), or computer vision. Knowledge of big data technologies like Spark, Hadoop, or Hive. Exposure to containerization (Docker), orchestration (Kubernetes), and CI/CD pipelines. Familiarity with MLOps practices and tools.
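The fit/predict/evaluate loop this role centers on can be sketched in miniature. The toy nearest-centroid classifier and tiny dataset below are hypothetical illustrations of the workflow, not part of the posting's actual stack:

```python
def fit_centroids(X, y):
    """Training step: compute the per-class mean (centroid) of the features."""
    sums, counts = {}, {}
    for features, label in zip(X, y):
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc] for label, acc in sums.items()}

def predict(centroids, x):
    """Inference step: assign x to the class with the nearest centroid."""
    def sq_dist(c):
        return sum((a - b) ** 2 for a, b in zip(x, c))
    return min(centroids, key=lambda label: sq_dist(centroids[label]))

def accuracy(centroids, X, y):
    """Evaluation step: fraction of points whose prediction matches the label."""
    correct = sum(predict(centroids, x) == label for x, label in zip(X, y))
    return correct / len(y)

# Toy training data: two well-separated clusters.
X_train = [[0.0, 0.1], [0.2, 0.0], [5.0, 5.1], [4.8, 5.0]]
y_train = ["neg", "neg", "pos", "pos"]
model = fit_centroids(X_train, y_train)
print(accuracy(model, X_train, y_train))  # 1.0 on this separable toy set
```

In practice the same three steps would be expressed with the libraries the posting lists (scikit-learn, TensorFlow, PyTorch) and a held-out test split rather than training accuracy.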
Posted 2 weeks ago
3.0 - 5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
TransUnion's Job Applicant Privacy Notice

What We'll Bring: TransUnion is a global information and insights company that makes trust possible in the modern economy. We do this by providing a comprehensive picture of each person so they can be reliably and safely represented in the marketplace. As a result, businesses and consumers can transact with confidence and achieve great things. We call this Information for Good.® A leading presence in more than 30 countries across five continents, TransUnion provides solutions that help create economic opportunity, great experiences, and personal empowerment for hundreds of millions of people.

What You'll Bring: As a consultant on our team, you will join a global group of statisticians, data scientists, and industry experts on a mission to extract insights from data and put them to good use. You will have an opportunity to be a part of a variety of analytical projects in a collaborative environment and be recognized for the work you deliver. TransUnion offers a culture of lifelong learning, and as an associate here, your growth potential is limitless. The consultant role within the Research and Consulting team is responsible for delivering market-level business intelligence both to TransUnion's senior management and to Financial Services customers. You will work on projects across international markets, including Canada, Hong Kong, the UK, South Africa, the Philippines, and Colombia. To be successful in this position, you must have good organizational skills, a strategic mindset, and a flexible predisposition. You will also be expected to operate independently and be able to lead and present projects with minimal supervision.
How You'll Contribute: You will develop a strong understanding of consumer credit data and how it applies to industry trends and research across different international markets. You will dig in by extracting data and performing segmentation and statistical analyses on large population datasets (using languages such as R, SQL, and Python on Linux and PC computing platforms). You will conduct analyses and quantitative research studies designed to understand complex industry trends and dynamics, leveraging a variety of statistical techniques. You will deliver analytic insights and recommendations in succinct and compelling presentations for internal and external customers at various levels, including an executive audience; you may lead key presentations to clients. You will perform multiple tasks simultaneously and deal with changing requirements and deadlines. You will develop strong consulting skills to help external customers by understanding their business needs and aligning them with TransUnion's product offerings and capabilities. You will help to cultivate an environment that promotes excellence, innovation, and a collegial spirit. Through all these efforts, you will be a key contributor to driving the perception of TransUnion as an authority on lending dynamics and a worthwhile, trusted partner to our clients and prospects.

Impact You'll Make

What you'll bring: A Bachelor's or Master's degree in Statistics, Applied Mathematics, Operations Research, Economics, or an equivalent discipline. A minimum of 3-5 years of experience in a relevant field, such as data analytics, lending, or risk strategy. Advanced proficiency with one or more statistical programming languages such as R. Advanced proficiency writing SQL queries for data extraction. Experience with big data platforms (e.g., Apache Hadoop, Apache Spark) preferred. Advanced experience with the MS Office suite, particularly Word, Excel, and PowerPoint. Strong time management skills with the ability to prioritize and contribute to multiple assignments simultaneously. Excellent verbal and written communication skills; you must be able to clearly articulate ideas to both technical and non-technical audiences. A highly analytical mindset with the curiosity to dig deeper into data, trends, and consumer behavior. A strong interest in banking, consumer lending, and finance is paramount, with curiosity as to why consumers act the way they do with their credit. A strong work ethic with a passion for team success.

This is a hybrid position and involves regular performance of job responsibilities virtually as well as in person at an assigned TU office location for a minimum of two days a week.

TransUnion Job Title: Consultant, Research & Consulting
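The SQL-driven segmentation work described above can be sketched against a throwaway in-memory SQLite table. The table, columns, sample records, and score bands below are invented for illustration and are not TransUnion's actual schema:

```python
import sqlite3

# Build a tiny in-memory table of made-up consumer records.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE consumers (id INTEGER, score INTEGER, balance REAL)")
conn.executemany(
    "INSERT INTO consumers VALUES (?, ?, ?)",
    [(1, 610, 1200.0), (2, 660, 800.0), (3, 710, 300.0), (4, 780, 150.0)],
)

# Segment consumers into score bands and summarise each band.
rows = conn.execute("""
    SELECT CASE
             WHEN score < 650 THEN 'subprime'
             WHEN score < 720 THEN 'near prime'
             ELSE 'prime'
           END AS band,
           COUNT(*) AS n,
           AVG(balance) AS avg_balance
    FROM consumers
    GROUP BY band
    ORDER BY band
""").fetchall()
for band, n, avg_balance in rows:
    print(band, n, avg_balance)
```

On real credit bureau data the same GROUP BY pattern would run over millions of rows on a big data platform rather than SQLite.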
Posted 2 weeks ago
12.0 - 20.0 years
35 - 40 Lacs
Mumbai
Work from Office
Job Title: Big Data Developer - Project Support & Mentorship
Location: Mumbai

Position Overview: We are seeking a skilled Big Data Developer to join our growing delivery team, with a dual focus on hands-on project support and mentoring junior engineers. This role is ideal for a developer who not only thrives in a technical, fast-paced environment but is also passionate about coaching and developing the next generation of talent. You will work on live client projects, provide technical support, contribute to solution delivery, and serve as a go-to technical mentor for less experienced team members.

Key Responsibilities: Perform hands-on Big Data development work, including coding, testing, troubleshooting, and deploying solutions. Support ongoing client projects, addressing technical challenges and ensuring smooth delivery. Collaborate with junior engineers to guide them on coding standards, best practices, debugging, and project execution. Review code and provide feedback to junior engineers to maintain high-quality and scalable solutions. Assist in designing and implementing solutions using Hadoop, Spark, Hive, HDFS, and Kafka. Lead by example in object-oriented development, particularly using Scala and Java. Translate complex requirements into clear, actionable technical tasks for the team. Contribute to the development of ETL processes for integrating data from various sources. Document technical approaches, best practices, and workflows for knowledge sharing within the team.

Required Skills and Qualifications: 8+ years of professional experience in Big Data development and engineering. Strong hands-on expertise with Hadoop, Hive, HDFS, Apache Spark, and Kafka. Solid object-oriented development experience with Scala and Java. Strong SQL skills with experience working with large data sets. Practical experience designing, installing, configuring, and supporting Big Data clusters. Deep understanding of ETL processes and data integration strategies. Proven experience mentoring or supporting junior engineers in a team setting. Strong problem-solving, troubleshooting, and analytical skills. Excellent communication and interpersonal skills.

Preferred Qualifications: Professional certifications in Big Data technologies (Cloudera, Databricks, AWS Big Data Specialty, etc.). Experience with cloud Big Data platforms (AWS EMR, Azure HDInsight, or GCP Dataproc). Exposure to Agile or DevOps practices in Big Data project environments.

What We Offer: Opportunity to work on challenging, high-impact Big Data projects. A leadership role in shaping and mentoring the next generation of engineers. A supportive and collaborative team culture. A flexible working environment. Competitive compensation and professional growth opportunities.
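The ETL work this listing mentions follows the standard extract-transform-load shape, which can be sketched in plain Python. The sample rows, field names, and in-memory "store" below are hypothetical stand-ins for real Hadoop/Hive sources and targets:

```python
import csv
import io

RAW = """user_id,country,amount
1,in,120.50
2,IN,80.00
3,us,200.00
"""

def extract(text):
    """Extract: parse CSV rows from a raw source feed."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: normalise country codes and cast amounts to floats."""
    return [
        {"user_id": int(r["user_id"]),
         "country": r["country"].upper(),
         "amount": float(r["amount"])}
        for r in rows
    ]

def load(rows):
    """Load: aggregate amounts per country into a simple in-memory store."""
    totals = {}
    for r in rows:
        totals[r["country"]] = totals.get(r["country"], 0.0) + r["amount"]
    return totals

totals = load(transform(extract(RAW)))
print(totals)  # {'IN': 200.5, 'US': 200.0}
```

In a production pipeline each stage would be distributed (e.g. Spark jobs reading from HDFS or Kafka), but the extract/transform/load separation stays the same.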
Posted 2 weeks ago
3.0 - 5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
TransUnion's Job Applicant Privacy Notice

What We'll Bring: TransUnion is a global information and insights company that makes trust possible in the modern economy. We do this by providing a comprehensive picture of each person so they can be reliably and safely represented in the marketplace. As a result, businesses and consumers can transact with confidence and achieve great things. We call this Information for Good.® A leading presence in more than 30 countries across five continents, TransUnion provides solutions that help create economic opportunity, great experiences, and personal empowerment for hundreds of millions of people.

What You'll Bring: As a consultant on our team, you will join a global group of statisticians, data scientists, and industry experts on a mission to extract insights from data and put them to good use. You will have an opportunity to be a part of a variety of analytical projects in a collaborative environment and be recognized for the work you deliver. TransUnion offers a culture of lifelong learning, and as an associate here, your growth potential is limitless. The consultant role within the Research and Consulting team is responsible for delivering market-level business intelligence both to TransUnion's senior management and to Financial Services customers. You will work on projects across international markets, including Canada, Hong Kong, the UK, South Africa, the Philippines, and Colombia. To be successful in this position, you must have good organizational skills, a strategic mindset, and a flexible predisposition. You will also be expected to operate independently and be able to lead and present projects with minimal supervision.
How You'll Contribute: You will develop a strong understanding of consumer credit data and how it applies to industry trends and research across different international markets. You will dig in by extracting data and performing segmentation and statistical analyses on large population datasets (using languages such as R, SQL, and Python on Linux and PC computing platforms). You will conduct analyses and quantitative research studies designed to understand complex industry trends and dynamics, leveraging a variety of statistical techniques. You will deliver analytic insights and recommendations in succinct and compelling presentations for internal and external customers at various levels, including an executive audience; you may lead key presentations to clients. You will perform multiple tasks simultaneously and deal with changing requirements and deadlines. You will develop strong consulting skills to help external customers by understanding their business needs and aligning them with TransUnion's product offerings and capabilities. You will help to cultivate an environment that promotes excellence, innovation, and a collegial spirit. Through all these efforts, you will be a key contributor to driving the perception of TransUnion as an authority on lending dynamics and a worthwhile, trusted partner to our clients and prospects.

Impact You'll Make

What you'll bring: A Bachelor's or Master's degree in Statistics, Applied Mathematics, Operations Research, Economics, or an equivalent discipline. A minimum of 3-5 years of experience in a relevant field, such as data analytics, lending, or risk strategy. Advanced proficiency with one or more statistical programming languages such as R. Advanced proficiency writing SQL queries for data extraction. Experience with big data platforms (e.g., Apache Hadoop, Apache Spark) preferred. Advanced experience with the MS Office suite, particularly Word, Excel, and PowerPoint. Strong time management skills with the ability to prioritize and contribute to multiple assignments simultaneously. Excellent verbal and written communication skills; you must be able to clearly articulate ideas to both technical and non-technical audiences. A highly analytical mindset with the curiosity to dig deeper into data, trends, and consumer behavior. A strong interest in banking, consumer lending, and finance is paramount, with curiosity as to why consumers act the way they do with their credit. A strong work ethic with a passion for team success.

This is a hybrid position and involves regular performance of job responsibilities virtually as well as in person at an assigned TU office location for a minimum of two days a week.

TransUnion Job Title: Consultant, Research & Consulting
Posted 2 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Responsible for designing, developing, and optimizing data processing solutions using a combination of Big Data technologies, with a focus on building scalable and efficient data pipelines for handling large datasets and enabling batch and real-time data streaming and processing.

Responsibilities:
> Develop Spark applications using Scala or Python (PySpark) for data transformation, aggregation, and analysis.
> Develop and maintain Kafka-based data pipelines: this includes designing Kafka Streams applications, setting up Kafka clusters, and ensuring efficient data flow.
> Create and optimize Spark applications using Scala and PySpark: leverage these languages to process large datasets and implement data transformations and aggregations.
> Integrate Kafka with Spark for real-time processing: build systems that ingest real-time data from Kafka and process it using Spark Streaming or Structured Streaming.
> Collaborate with data teams: work with data engineers, data scientists, and DevOps to design and implement data solutions.
> Tune and optimize Spark and Kafka clusters: ensure high performance, scalability, and efficiency of data processing workflows.
> Write clean, functional, and optimized code: adhere to coding standards and best practices.
> Troubleshoot and resolve issues: identify and address any problems related to Kafka and Spark applications.
> Maintain documentation: create and maintain documentation for Kafka configurations, Spark jobs, and other processes.
> Stay updated on technology trends: continuously learn and apply new advancements in functional programming, big data, and related technologies.

Proficiency in: the Hadoop ecosystem big data tech stack (HDFS, YARN, MapReduce, Hive, Impala); Spark (Scala, Python) for data processing and analysis; Kafka for real-time data ingestion and processing; ETL processes and data ingestion tools. Deep hands-on expertise in PySpark, Scala, and Kafka. Programming languages: Scala, Python, or Java for developing Spark applications; SQL for data querying and analysis. Other skills: data warehousing concepts; Linux/Unix operating systems; problem-solving and analytical skills; version control systems.

------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Applications Development
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills: Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
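At its core, the Kafka-to-Spark streaming work described in this listing groups a time-ordered event stream into windows and aggregates each one. This pure-Python sketch mimics a 10-second tumbling-window count; the event times and keys are invented, and a real pipeline would use Kafka plus Spark Streaming or Structured Streaming rather than a list:

```python
from collections import defaultdict

# Simulated event stream: (epoch_seconds, event_key) pairs.
events = [
    (100, "click"), (103, "view"), (109, "click"),
    (112, "click"), (115, "view"),
]

def tumbling_window_counts(events, width):
    """Count events per (window_start, key) for tumbling windows of `width` seconds."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - ts % width  # floor timestamp to its window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

counts = tumbling_window_counts(events, 10)
print(counts)
```

Structured Streaming expresses the same idea declaratively (a `groupBy` over a time window), with the engine handling late data and incremental state that this sketch ignores.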
Posted 2 weeks ago
8.0 - 12.0 years
15 - 30 Lacs
Bengaluru
Remote
Kindly find the detailed job description below.

Shift-based job (night shift). Experience: 8+ years, with 5 years of relevant Data Engineer experience. Location: Remote. Client: Turing (FTE). If interested, kindly share your details to gokul.g@buzzworks.in.

Skills - Core Requirements: 8+ years with Linux, Bash, Python, SQL. 4+ years with Spark, the Hadoop ecosystem, and team leadership. Strong AWS experience: EMR, Glue, Athena, Redshift. Proven expertise in designing data flows and integration APIs. Passion for solving complex problems using modern tech.

Preferred Skills: Degree in CS or a related field. Python, C++, or a similar programming language. Experience with petabyte-scale data, data catalogs (Hive, Glue), and pipeline tools (Airflow, dbt). Familiarity with AWS and/or GCP.

Total experience -
Relevant experience -
Mobile number -
Alternative mobile number -
Email ID -
Current CTC -
Expected CTC -
Notice period -
Current location -
Preferred location -
Date of birth -
Current company -
Education (year of completion) -
Counter offer (Yes/No) -

Kindly share the updated CV.
Posted 2 weeks ago
2.0 - 5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Applications Development Intermediate Programmer Analyst is an intermediate-level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.

Responsibilities: Develop and maintain Java-based applications using the Spring framework. Design and implement batch processing solutions using Spark batch jobs for large-scale data processing. Build real-time data pipelines using Spark Streaming for processing streaming data. Collaborate with cross-functional teams to define, design, and deliver new features. Optimize data processing workflows for performance, scalability, and reliability. Troubleshoot and resolve issues related to data processing, application performance, and system integration. Write clean, maintainable, and well-documented code following best practices. Participate in code reviews, unit testing, and system testing to ensure quality deliverables. Stay updated with emerging technologies and propose improvements to existing systems.

Required Skills and Qualifications: Education: Bachelor's degree in Computer Science, Engineering, or a related field. Experience: 2 to 5 years of professional experience in Java development. Technical skills: Strong proficiency in Java (version 8 or higher) and object-oriented programming. Hands-on experience with Spring (Spring Boot, Spring MVC, or Spring Data) for building enterprise applications. Expertise in Spark batch processing for large-scale data processing and analytics. Experience with Spark Streaming for real-time data processing and streaming pipelines. Familiarity with distributed computing concepts and big data frameworks. Proficiency with version control systems like Git. Knowledge of build tools such as Maven or Gradle. Understanding of Agile/Scrum methodologies.

Soft skills: Strong problem-solving and analytical skills. Excellent communication and teamwork abilities. Ability to manage multiple priorities and work independently.

Preferred Skills: Experience with big data technologies like Hadoop, Kafka, or Hive. Knowledge of containerization tools like Docker or Kubernetes. Experience with CI/CD pipelines and tools like Jenkins. Understanding of data storage solutions like HDFS.

Education: Bachelor's degree/University degree or equivalent experience.

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Applications Development
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills: Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
Posted 2 weeks ago
0.0 - 2.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Applications Development Programmer Analyst is an intermediate-level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.

Responsibilities: Utilize knowledge of applications development procedures and concepts, and basic knowledge of other technical areas, to identify and define necessary system enhancements. Identify and analyze issues, make recommendations, and implement solutions. Utilize knowledge of business processes, system processes, and industry standards to solve complex issues. Analyze information and make evaluative judgements to recommend solutions and improvements. Conduct testing and debugging, utilize script tools, and write basic code to design specifications. Assess the applicability of similar experiences and evaluate options under circumstances not covered by procedures. Develop working knowledge of Citi's information systems, procedures, standards, client-server application development, network operations, database administration, systems administration, data center operations, and PC-based applications. Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients, and assets, by driving compliance with applicable laws, rules, and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct, and business practices, and escalating, managing, and reporting control issues with transparency.

Qualifications: 0-2 years of relevant experience. Experience in programming/debugging used in business applications. Working knowledge of industry practice and standards. Comprehensive knowledge of a specific business area for application development. Working knowledge of programming languages. Consistently demonstrates clear and concise written and verbal communication.

Education: Bachelor's degree/University degree or equivalent experience.

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

Additional Qualifications: 2-4 years of finance technology experience, including Java, J2EE, JMS, Docker, Kubernetes, Oracle, SQL, PL/SQL, XML, SOAP UI, Postman, AppDynamics, JIRA, Hadoop, Spring Boot, and REST web services. Citi Ledger and Components subject matter expertise preferred. Experience in implementing complex solutions and projects. Experience in systems analysis and programming of all underlying software in CWM and FAEM (Java, Oracle DB, Tibco); demonstrated Subject Matter Expert (SME) in CWM, FAEM, and finance adjustments. Demonstrated knowledge of the client's core business functions. Demonstrated leadership, project management, and development skills. Relationship and consensus-building skills.

------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Applications Development
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills: Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
Posted 2 weeks ago
12.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Applications Development Technology Lead Analyst is a senior level position responsible for establishing and implementing new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to lead applications systems analysis and programming activities. Responsibilities Design, develop, and maintain backend applications and services, implementing enterprise-grade, large-scale data processing pipelines using Java and Spark Build and optimize RESTful APIs to support front-end and third-party integrations. Implement business logic, data processing, and system integrations. Work with both SQL and NoSQL databases to ensure efficient data storage and retrieval. Collaborate with front-end developers, product managers, and other stakeholders to deliver high-quality solutions. Ensure the performance, scalability, and security of backend systems. Identify and resolve technical issues, bottlenecks, and bugs. Write clean, maintainable, and well-documented code following best practices. Participate in code reviews, unit testing, and system testing to ensure quality deliverables. Stay updated with emerging backend technologies and apply them to improve systems. Required Skills And Qualifications Education: Bachelor’s degree in Computer Science, Engineering, or a related field. Experience: 12+ years of professional experience in Java backend development. Technical Skills: Strong proficiency in Java (version 8 or higher) and object-oriented programming. Experience with Spring Framework for building enterprise-grade backend applications. Expertise in designing and developing RESTful APIs with best practices. Proficiency with SQL and big data technologies like Hadoop, Hive, Kafka, or Spark. Experience with version control systems like Git. Knowledge of build tools such as Maven or Gradle. Understanding of Agile/Scrum methodologies. Soft Skills: Strong problem-solving and analytical skills. 
Excellent communication and teamwork abilities. Ability to manage multiple priorities and work independently. Preferred Skills Familiarity with microservices architecture and design patterns. Knowledge of containerization and orchestration technologies such as Docker, Kubernetes or Openshift. Familiarity with CI/CD pipelines and tools like Jenkins. Experience with database technologies like MySQL, PostgreSQL, MongoDB, or Couchbase. Familiarity with security best practices for backend systems and APIs. Education: Bachelor’s degree/University degree or equivalent experience Master’s degree preferred This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Applications Development Senior Programmer Analyst is an intermediate level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities. Responsibilities Develop and maintain Java-based applications using the Spring framework. Design and implement batch processing solutions using Spark Batch for large-scale data processing. Build real-time data pipelines using Spark Streaming for processing streaming data. Collaborate with cross-functional teams to define, design, and deliver new features. Optimize data processing workflows for performance, scalability, and reliability. Troubleshoot and resolve issues related to data processing, application performance, and system integration. Write clean, maintainable, and well-documented code following best practices. Participate in code reviews, unit testing, and system testing to ensure quality deliverables. Stay updated with emerging technologies and propose improvements to existing systems. Required Skills and Qualifications Education: Bachelor’s degree in Computer Science, Engineering, or a related field. Experience: 8+ years of professional experience in Java development. Technical Skills: Strong proficiency in Java (version 8 or higher) and object-oriented programming. Hands-on experience with Spring (Spring Boot, Spring MVC, or Spring Data) for building enterprise applications. Expertise in Spark Batch for large-scale data processing and analytics. Experience with Spark Streaming for real-time data processing and streaming pipelines. Familiarity with distributed computing concepts and big data frameworks. Proficiency with version control systems like Git. Knowledge of build tools such as Maven or Gradle. Understanding of Agile/Scrum methodologies. 
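The batch and streaming responsibilities above centre on windowed aggregation. As a rough illustration (plain Python standing in for Spark Streaming; the `(timestamp, key)` event shape and 10-second window are invented for this sketch), the core fixed-window count looks like:

```python
from collections import defaultdict

def window_counts(events, window_secs=10):
    """Count events per (window, key), mimicking the fixed-window
    aggregation a Spark Streaming micro-batch job performs."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs  # align to window
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(1, "click"), (3, "view"), (12, "click"), (15, "click"), (19, "view")]
print(window_counts(events))
# {(0, 'click'): 1, (0, 'view'): 1, (10, 'click'): 2, (10, 'view'): 1}
```

In Spark itself this would be a `groupBy(window(...), key).count()` over an unbounded stream; the sketch only shows the grouping logic.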
Soft Skills: Strong problem-solving and analytical skills. Excellent communication and teamwork abilities. Ability to manage multiple priorities and work independently. Preferred Skills Experience with big data technologies like Hadoop, Kafka, or Hive. Knowledge of containerization tools like Docker or Kubernetes. Experience with CI/CD pipelines and tools like Jenkins. Understanding of data storage solutions like HDFS. Education: Bachelor’s degree/University degree or equivalent experience This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
4.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Position Summary The MetLife Corporate Technology (CT) organization is evolving to enable MetLife’s New Frontier strategy. With a strong vision in place, we are a global function focused on driving digital technology strategies for key corporate functions within MetLife, including Finance, Actuarial, Reinsurance, Legal, Human Resources, Employee Experience, Risk, Treasury, Audit and Compliance. In partnership with our business leaders, we develop and deliver seamless technology experiences to our employees across the entire employee lifecycle. Our vision and mission are to create innovative, transformative and contemporary technology solutions to empower our leaders and employees so they can focus on what matters most, our customers. We are technologists with strong business acumen focused on developing our talent to continually transform and innovate. We are seeking a highly motivated and skilled Azure Data Engineer to join our growing team in Hyderabad. This position is perfect for talented professionals with 4-8 years of experience in designing, building, and maintaining scalable cloud-based data solutions. As an Azure Data Engineer at MetLife, you will collaborate with cross-functional teams to enable data transformation, analytics, and decision-making by leveraging Microsoft Azure’s advanced technologies. You should be a strategic thinker, an effective communicator, and an expert in technological development. Key Relationships: Internal Stakeholder –
Key Responsibilities:
- Design, develop, and maintain efficient and scalable data pipelines using Azure Data Factory (ADF) for ETL/ELT processes.
- Build and optimize data models and data flows in Azure Synapse Analytics, SQL Databases, and Azure Data Lake.
- Work with large datasets to define, test, and implement data storage, transformation, and processing strategies using Azure-based services.
- Create and manage data pipelines for ingesting, processing, and transforming data from various sources into a structured format.
- Develop solutions for real-time and batch processing using tools like Azure Stream Analytics and Event Hubs.
- Implement data security, governance, and compliance measures to ensure the integrity and accessibility of the organization’s data assets.
- Contribute to the migration of on-premises databases and ETL processes to the Azure cloud.
- Build processes to identify, monitor, and resolve data inconsistencies and quality issues.
- Collaborate with data architects, business analysts, and developers to deliver reliable and performant data solutions aligned with business requirements.
- Monitor and optimize the performance and cost of Azure-based data solutions.
- Document architectures, data flows, pipelines, and implementations for future reference and knowledge sharing.
Knowledge, Skills, and Abilities
Education: A Bachelor's or Master's degree in Computer Science, Information Systems, or a related field, or an equivalent engineering degree.
Experience (Required):
- 4-8 years of experience in data engineering, with a strong focus on Azure-based services.
- Proficiency in Azure Data Factory (ADF), Azure Synapse Analytics, Azure Data Lake, and Azure SQL Databases.
- Strong knowledge of data modeling, ETL/ELT processes, and data pipeline design.
- Hands-on experience with Python, SQL, and Spark for data manipulation and transformation.
- Exposure to big data platforms like Hadoop, Databricks, or similar technologies.
- Experience with real-time data streaming using tools like Azure Stream Analytics, Event Hubs, or Service Bus.
- Familiarity with data governance, best practices, and security protocols within cloud environments.
- Solid understanding of Azure DevOps for CI/CD pipelines around data workflows.
- Strong problem-solving skills with attention to detail and a results-driven mindset.
Excellent collaboration, communication, and interpersonal skills for working with cross-functional teams. Preferred: Demonstrated experience in end-to-end cloud data warehouse migrations. Familiarity with Power BI or other visualization tools for creating dashboards and reports. Certification in Azure Data Engineer Associate or Azure Solutions Architect is a plus. Understanding of machine learning concepts and integrating AI/ML pipelines is an advantage. Skills and Competencies: Language: Proficiency at business level in English. Competencies: Communication: Ability to influence and help communicate the organization’s direction and ensure results are achieved Collaboration: Proven track record of building collaborative partnerships and ability to operate effectively in a global environment Diverse environment: Can-do attitude and ability to work in a high paced environment Tech Stack Development & Delivery Methods: Agile (Scaled Agile Framework) DevOps and CI/CD: Azure DevOps Development Frameworks and Languages: SQL Spark Python Azure: Functional Knowledge of cloud based solutions About MetLife Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World’s 25 Best Workplaces™ for 2024, MetLife , through its subsidiaries and affiliates, is one of the world’s leading financial services companies; providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible . Join us!
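The responsibilities above include building processes to identify and monitor data inconsistencies and quality issues. A minimal, framework-free sketch of such a rule-based check (the rule names and record fields are invented for illustration, not from any MetLife schema):

```python
def run_quality_checks(records, rules):
    """Apply named data-quality rules to each record; return (index, rule)
    pairs for every violation found."""
    violations = []
    for i, rec in enumerate(records):
        for name, rule in rules.items():
            if not rule(rec):
                violations.append((i, name))
    return violations

# Hypothetical rules and records, purely for illustration.
rules = {
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
    "id_present": lambda r: bool(r.get("id")),
}
records = [{"id": "a1", "amount": 10}, {"id": "", "amount": -5}]
print(run_quality_checks(records, rules))
# [(1, 'amount_non_negative'), (1, 'id_present')]
```

In a real pipeline the violations would feed monitoring and alerting rather than a print statement.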
5.0 - 7.0 years
0 Lacs
Andhra Pradesh, India
On-site
Job Description / Responsibilities:
- 5-7 years of experience in Big Data stacks: Spark/Scala/Hive/Impala/Hadoop
- Strong expertise in Scala, with good hands-on experience in the Scala programming language
- Able to model the given problem statement using object-oriented programming concepts
- Basic understanding of the Spark in-memory processing framework and the concept of map tasks and reduce tasks
- Hands-on experience on data processing projects
- Able to frame SQL queries and analyze data based on the given requirements; advanced SQL knowledge
- GitHub or Bitbucket
Primary Skill: Spark and Scala, with good hands-on experience in the Scala programming language
Secondary Skills: SQL, Python, Hive, Impala, AWS
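The posting asks for an understanding of map tasks and reduce tasks. A small sketch of that split, written in Python rather than Scala for brevity, using the canonical word-count example (not anything specific to this role):

```python
from collections import Counter

def map_phase(lines):
    # "map task": emit a (word, 1) pair for every token in every line
    return [(w.lower(), 1) for line in lines for w in line.split()]

def reduce_phase(pairs):
    # "reduce task": sum the emitted counts per key, as the shuffle+reduce
    # step of a Spark/Hadoop job would
    counts = Counter()
    for word, n in pairs:
        counts[word] += n
    return counts

print(dict(reduce_phase(map_phase(["spark and scala", "spark jobs"]))))
# {'spark': 2, 'and': 1, 'scala': 1, 'jobs': 1}
```

In Spark the same shape appears as `flatMap` followed by `reduceByKey`, with the shuffle moving pairs between executors.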
0 years
0 Lacs
Andhra Pradesh, India
On-site
- Experience in building PySpark processes.
- Proficient in distributed computing principles.
- Experience in managing a Hadoop cluster with all services.
- Experience with NoSQL databases and messaging systems like Kafka.
- Designing, building, installing, configuring, and supporting Hadoop.
- Perform analysis of vast data stores.
- Good understanding of cloud technology.
- Must have strong technical experience in design, mapping specifications, HLD, and LLD.
- Must have the ability to relate to both business and technical members of the team and possess excellent communication skills.
- Leverage internal tools and SDKs, utilize AWS services such as S3, Athena, and Glue, and integrate with our internal Archival Service Platform for efficient data purging.
- Lead the integration efforts with the internal Archival Service Platform for seamless data purging and lifecycle management.
- Collaborate with the data engineering team to continuously improve data integration pipelines, ensuring adaptability to evolving business needs.
- Develop and maintain data platforms using PySpark.
- Work with AWS and Big Data; design and implement data pipelines, and ensure data quality and integrity.
- Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs.
- Implement and manage agents for monitoring, logging, and automation within AWS environments.
- Handle migration from PySpark to AWS.
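For the data-purging responsibility above, the core decision is partitioning records by age against a retention window. A hedged sketch (the 90-day retention, the record shape, and the function name are assumptions for illustration, not the Archival Service Platform's actual API):

```python
from datetime import datetime, timedelta

def split_for_purge(records, now, retention_days=90):
    """Partition records into (keep, purge) against a retention cutoff,
    the eligibility decision a data-purging job makes."""
    cutoff = now - timedelta(days=retention_days)
    keep = [r for r in records if r["created"] >= cutoff]
    purge = [r for r in records if r["created"] < cutoff]
    return keep, purge

now = datetime(2024, 6, 1)
records = [
    {"id": 1, "created": datetime(2024, 5, 20)},   # inside retention: keep
    {"id": 2, "created": datetime(2024, 1, 1)},    # older than cutoff: purge
]
keep, purge = split_for_purge(records, now)
print([r["id"] for r in keep], [r["id"] for r in purge])
# [1] [2]
```

At scale the same predicate would be pushed into a PySpark filter or an S3 lifecycle/Glue job rather than a Python list comprehension.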
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Sr. Associate Director, Software Engineering In this role, you will: Several years of hands-on experience in software development, proficient in multiple programming languages and frameworks. In-depth knowledge of software design patterns, data structures, algorithms, and system architecture. Conceptualise, design, develop and reuse effective engineering design, patterns & frameworks around the stated architecture strategy, aligning to departmental standards as appropriate. Demonstrated ability to make critical technical decisions and trade-offs while considering project constraints and business objectives. Actively drive continuous improvement, mature the organisation and improve productivity. Focus on automation, process improvement, reuse and transformation towards Agile and DevOps to meet or exceed OKRs for engineering excellence and associated metrics. Proven leadership skills with the ability to guide and inspire a development team, fostering a positive and productive work environment. Strong teamwork and collaboration skills to work effectively with cross-functional teams. A commitment to delivering high-quality software through effective solution design, code reviews, testing, and adherence to best practices. 
A passion for innovation, driving the adoption of new technologies and methodologies to improve software development processes and product offerings. Ensure compliance to end to end controls for the product and data, including effective risk and control management inclusive of non-financial risks, compliance and conduct responsibilities. Adhere to HSBC standard processes Requirements To be successful in this role, you should meet the following requirements: Proven track record of technology leadership and delivery across Databases, Bigdata (e.g. Hadoop), Cloud (e.g. GCP, Azure), Business Intelligence (e.g. Qlik, TM1) and Workflow (e.g. Appian) platforms. Experience on cloud platforms such as Google Cloud Platform (GCP), Amazon Web Services (AWS), Microsoft Azure, or On-Prem Cloud Platforms, and knowledge of deploying and scaling applications in the cloud. Proficiency in one or more programming languages such as Java, Python, Golang or JavaScript Strong knowledge and experience in designing scalable, maintainable, and modular software architectures. Expertise in web development technologies like HTML, CSS, JavaScript, and relevant frameworks (e.g., React, Angular, Vue.js). Experience in building robust and efficient backend systems using frameworks like Spring, Django, Express.js Proficiency in database design, optimization, and query optimization with SQL databases and/or NoSQL databases Understanding and experience in designing and implementing microservices-based architectures. Knowledge of containerization technologies like Docker and container orchestration platforms like Kubernetes. Experience with various software testing methodologies and tools for unit testing, integration testing, and end-to-end testing. Awareness of secure coding practices and experience in implementing security measures to protect against vulnerabilities and threats. You’ll achieve more when you join HSBC. 
www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSBC Software Development India
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Senior Software Engineer In this role, you will: Strong experience of working in an Agile & DevOps environment including expert knowledge of scrum management tools (e.g. Jira) Demonstrable technical leadership and teamwork skills. A successful track record of delivering complex projects and/or programmes, utilising appropriate techniques and tools to ensure and measure success Experience of delivering solutions that align to approved design patterns and security standards Risk management experience monitors, identifies, and develops action plans to remediate risks Experience of operating in a large scale and highly regulated industry (e.g. 
financial services) Requirements To be successful in this role, you should meet the following requirements:
- CI/CD tools such as Jenkins, Git, GitHub, Nexus
- Databases: Mongo, Dynamo and DocumentDB, Hadoop; familiar with SQL queries
- Python, Java, Spring Boot, Maven
- REST APIs / JSON
- Node.js (Express & NPM) & React (ES6+), Redux
- Pivotal Cloud Foundry, Mule API Gateway, Docker
- Cloud Platforms: AWS
- Cloud Application Monitoring: Splunk / AppDynamics
- Automation: test automation tooling (Selenium, JUnit, Wiremock, Mockito, Jest, Enzyme) and automated scripting
- Large-scale networking, load balancing, F5/Bluecoat, proxies, managing SSL certs
- Agile Methodologies: Scrum, Kanban, Pair Programming, SAFe
- Agile Tooling: Jira, Confluence, Slack
In addition to the details listed above, the ideal candidate will:
- Be an approachable and supportive team member with a collaborative attitude within a demanding, maturing Agile environment
- Be able to communicate effectively – spoken and written – to convey complex technical subject matter clearly, adapting to the audience.
Knowledge of HSBC and first direct systems will be a distinct advantage.
6.0 - 12.0 years
6 - 9 Lacs
Bengaluru
Work from Office
Skill: Hadoop Admin
Grade: C2/C1
Location: Pune/Chennai/Bangalore
NP: Immediate to 15-day joiners only
- Execute weekly server rebuilds (21-30 nodes) with zero data loss and minimal performance impact
- Perform Hadoop-level pre/post validations: cluster health, HDFS usage, replication, skew, and logs
- Coordinate with Data Center Ops and Unix Admins for hardware and OS-level tasks
- Reconfigure and reintegrate rebuilt nodes into the cluster
- Provide weekday and rotational weekend support across the BDH1 and BDH4 clusters
Required Skills:
- Strong hands-on experience with the Hadoop ecosystem (HDFS, YARN, MapReduce, HBase)
- Proficient in log analysis, volume/block checks, and skew troubleshooting
- Familiarity with open-source Hadoop distributions and production change controls
- Excellent communication and cross-team coordination skills
- Ability to work independently in a fast-paced, complex environment
Hadoop
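The pre/post validations above include checking HDFS usage and skew across nodes. A simplified sketch of a skew check (the usage fractions and the 10% tolerance are invented; a real check would parse `hdfs dfsadmin -report` output rather than take a dict):

```python
def flag_skewed_nodes(node_usage, tolerance=0.10):
    """Return DataNodes whose HDFS usage fraction deviates from the
    cluster mean by more than `tolerance`."""
    mean = sum(node_usage.values()) / len(node_usage)
    return sorted(n for n, u in node_usage.items() if abs(u - mean) > tolerance)

usage = {"dn01": 0.62, "dn02": 0.60, "dn03": 0.85, "dn04": 0.58}
print(flag_skewed_nodes(usage))
# ['dn03']
```

Running this before and after a rebuild gives a quick signal that the rebalancer needs to run on a reintegrated node.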
5.0 - 8.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Skills desired:
- Strong at SQL (multi-pyramid SQL joins)
- Python skills (FastAPI or Flask framework)
- PySpark
- Commitment to work in overlapping hours
- GCP knowledge (BQ, DataProc, and Dataflow)
- Amex experience preferred (not mandatory)
- Power BI preferred (not mandatory)
Flask, PySpark, Python, SQL
4.0 - 9.0 years
10 - 20 Lacs
Coimbatore
Work from Office
Position Name: Data Engineer
Location: Coimbatore (Hybrid, 3 days per week)
Work Shift Timing: 1.30 pm to 10.30 pm (IST)
Mandatory Skills: Scala, Spark, Python, Databricks
Good to have: Java & Hadoop
The Role:
- Designing and building optimized data pipelines using cutting-edge technologies in a cloud environment to drive analytical insights.
- Constructing infrastructure for efficient ETL processes from various sources and storage systems.
- Leading the implementation of algorithms and prototypes to transform raw data into useful information.
- Architecting, designing, and maintaining database pipeline architectures, ensuring readiness for AI/ML transformations.
- Creating innovative data validation methods and data analysis tools.
- Ensuring compliance with data governance and security policies.
- Interpreting data trends and patterns to establish operational alerts.
- Developing analytical tools, programs, and reporting mechanisms.
- Conducting complex data analysis and presenting results effectively.
- Preparing data for prescriptive and predictive modeling.
- Continuously exploring opportunities to enhance data quality and reliability.
- Applying strong programming and problem-solving skills to develop scalable solutions.
Requirements:
- Experience in Big Data technologies (Hadoop, Spark, Nifi, Impala).
- Hands-on experience designing, building, deploying, testing, maintaining, monitoring, and owning scalable, resilient, and distributed data pipelines.
- High proficiency in Scala/Java and Spark for applied large-scale data processing.
- Expertise with big data technologies, including Spark, Data Lake, and Hive.
- Solid understanding of batch and streaming data processing techniques.
- Proficient knowledge of the Data Lifecycle Management process, including data collection, access, use, storage, transfer, and deletion.
- Expert-level ability to write complex, optimized SQL queries across extensive data volumes.
- Experience on HDFS, Nifi, Kafka.
- Experience on Apache Ozone, Delta Tables, Databricks, Axon (Kafka), Spring Batch, Oracle DB.
- Familiarity with Agile methodologies.
- Obsession for service observability, instrumentation, monitoring, and alerting.
- Knowledge or experience in architectural best practices for building data lakes.
Interested candidates, share your resume at Neesha1@damcogroup.com along with the below-mentioned details:
Total Exp:
Relevant Exp in Scala & Spark:
Current CTC:
Expected CTC:
Notice period:
Current Location:
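One responsibility in the role above is interpreting data trends and patterns to establish operational alerts. A minimal sketch of such an alert rule (the window size, threshold, and sample series are arbitrary choices for illustration):

```python
def alert_points(series, window=3, threshold=2.0):
    """Flag indices where a value exceeds `threshold` times the mean of
    the preceding `window` values: a simple operational-alert rule."""
    alerts = []
    for i in range(window, len(series)):
        baseline = sum(series[i - window:i]) / window
        if baseline > 0 and series[i] > threshold * baseline:
            alerts.append(i)
    return alerts

# A spike at index 4 against an otherwise flat hourly metric.
print(alert_points([10, 11, 9, 10, 45, 10]))
# [4]
```

Production alerting would typically use a monitoring stack's anomaly rules, but the trailing-baseline comparison is the same idea.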
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
The Applications Development Senior Programmer Analyst is an intermediate level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities. Responsibilities: Conduct tasks related to feasibility studies, time and cost estimates, IT planning, risk technology, applications development, model development, and establish and implement new or revised applications systems and programs to meet specific business needs or user areas Monitor and control all phases of development process and analysis, design, construction, testing, and implementation as well as provide user and operational support on applications to business users Utilize in-depth specialty knowledge of applications development to analyze complex problems/issues, provide evaluation of business process, system process, and industry standards, and make evaluative judgement Recommend and develop security measures in post implementation analysis of business usage to ensure successful system design and functionality Consult with users/clients and other technology groups on issues, recommend advanced programming solutions, and install and assist customer exposure systems Ensure essential procedures are followed and help define operating standards and processes Serve as advisor or coach to new or lower level analysts Has the ability to operate with a limited level of direct supervision. Can exercise independence of judgement and autonomy. Acts as SME to senior stakeholders and /or other team members. 
Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency. Qualifications: 8 to 20 years of relevant experience Primary skills: Java/Scala + Spark Must have experience in Hadoop/Java/Spark/Scala/Python Experience in systems analysis and programming of software applications Experience in managing and implementing successful projects Working knowledge of consulting/project management techniques/methods Ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements Education: Bachelor’s degree/University degree or equivalent experience This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
11.0 - 20.0 years
8 - 18 Lacs
Chennai, Bengaluru, PAN INDIA
Hybrid
Job Description: Hiring for Big Data Lead Developer
Experience Range: 5 to 18 years
Mandatory Skills: Big Data Ecosystem (Hadoop, Spark, Kafka, Hive, HBase); Functional Programming: Scala
Primary Skills:
- Technology: Functional Programming (Scala), Big Data
- Data Processing: Apache Spark, Flink
- Data Storage: HDFS, Hive, Cassandra
- Data Streaming: Kafka, Storm
- Cloud Platforms: AWS, Azure, GCP
- DevOps Tools: Docker, Kubernetes, Jenkins
- Version Control: Git
Education: BE/B.Tech/MCA/M.Tech/MSc./MSTS
175.0 years
0 Lacs
Bengaluru South, Karnataka, India
On-site
At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express. This role is for Data Testing Analyst in the Regulatory Reporting automation program. This individual will be responsible for assisting the Business Specialist Manager drive the definition, gathering, exploration, and analysis of Finance data to deliver the end-to-end automation of our regulatory reporting platform. This individual will assist the organization coordinate with several groups within American Express during designing, implementing, and migrating the implemented solution into production. The individual selected will partner closely with Business Specialist Manager and Product Owners to support defining functionality to be built, collaborate with Technology to design how functionality will work and validate at regular intervals that the software features developed align with original requirements provided to the team. How will you make an impact in this role? 
Support data analysis on existing processes and datasets to understand and support Point of Arrival (POA) process design
Support and guide the determination of portfolios, data elements, and grain of data required for designing processes
Support the team in reviewing data scenarios and clarifying how to report on these scenarios in alignment with regulatory guidance
Identify and support business requirements, functional design, prototyping, testing, training, and implementation support
Support the development of functional requirement documents (FRDs) and process-specific design documentation to support process and report owner requirements
Document and understand core components of the solution architecture, including data patterns, data-related capabilities, and standardization and conformance of disparate datasets
Support the implementation of master and reference data to be used across operational and reporting processes
Participate in daily meetings with the pods (implementation groups for the Company's various portfolios covering data sourcing, regulatory classification, and reporting)
Coordinate with various Product Owners, Process Owners, Subject Matter Experts, Solution Architecture colleagues, and the Data Management team to ensure builds are appropriate for American Express products
Participate in user acceptance testing, parallel-run testing, and any other testing required to ensure the build meets the authored requirements, including development and execution of test cases and documentation of results
Assist in the development of executable testing algorithms that enable validation of expected system functionality, including replication of deterministic logic and filtering criteria
Minimum Qualifications:
SQL and data analysis experience
Product/platform understanding and process design experience
Knowledgeable about Financial Data Warehouse and Reporting Solutions (such as ODS, AxiomSL, OFSAA, and Hadoop concepts)
Knowledgeable in data analytics and profiling
Knowledgeable in creating S2T and functional designs
Knowledgeable in creating data mappings, analyzing System of Record (SOR) data, and implementing Data Quality Rules to identify data issues in SORs
Experience with MS Excel and Power Query
Test management and execution experience
Foundational data warehousing principles and data modeling experience is a plus
Agile training is a plus
Financial reporting or accounting experience is a plus
A good understanding of banking products is a plus
Exhibits organizational skills with the ability to meet/exceed critical deadlines and manage multiple deliverables simultaneously
A self-starter and proactive team player with a passion to consistently deliver high-quality service and exceed customers' expectations
Excellent written and verbal communication, with the ability to explain highly complex concepts and processes in simple, pragmatic terms across Finance, Business, and Technology stakeholders
Excellent relationship-building, presentation, and collaboration skills
Preferred Qualifications:
Knowledge of US Regulatory Reports (Y9C, Y14, Y15, 2052a, among others)
Working exposure to data analysis and testing of financial data domains to support regulatory and analytical requirements for large-scale banking/financial organizations
Experience in developing testing automation capabilities
Experience with Cloud capabilities is good to have
We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
Competitive base salaries
Bonus incentives
Support for financial well-being and retirement
Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
Flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need
Generous paid parental leave policies (depending on your location)
Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
Free and confidential counseling support through our Healthy Minds program
Career development and training opportunities
American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
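The testing-algorithm responsibility in this listing amounts to expressing deterministic logic and filtering criteria as executable checks against source data. A minimal sketch using Data Quality Rules written as SQL filters over an in-memory SQLite table; the table, columns, and rule names here are hypothetical, not American Express systems:

```python
import sqlite3

# In-memory stand-in for a System of Record (SOR) table; schema and data are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sor_accounts (account_id TEXT, balance REAL, country TEXT)")
conn.executemany(
    "INSERT INTO sor_accounts VALUES (?, ?, ?)",
    [("A1", 120.0, "US"), ("A2", -5.0, "US"), ("A3", 40.0, None)],
)

# Each Data Quality Rule is a SQL filter that returns the rows violating it.
dq_rules = {
    "negative_balance": "SELECT account_id FROM sor_accounts WHERE balance < 0",
    "missing_country": "SELECT account_id FROM sor_accounts WHERE country IS NULL",
}

# Run every rule and collect the offending account IDs per rule.
failures = {name: [row[0] for row in conn.execute(sql)] for name, sql in dq_rules.items()}
print(failures)  # {'negative_balance': ['A2'], 'missing_country': ['A3']}
```

Keeping each rule as data (a named SQL string) rather than hard-coded logic makes the rule set easy to review against the regulatory specification and to extend per portfolio.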
Posted 2 weeks ago
7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
Building off our Cloud momentum, Oracle has formed a new organization - Health Data Intelligence. This team will focus on product development and product strategy for Oracle Health, while building out a complete platform supporting modernized, automated healthcare. This is a net new line of business, constructed with an entrepreneurial spirit that promotes an energetic and creative environment. We are unencumbered and will need your contribution to make it a world class engineering center with the focus on excellence. Oracle Health Data Analytics has a rare opportunity to play a critical role in how Oracle Health products impact and disrupt the healthcare industry by transforming how healthcare and technology intersect. Career Level - IC4
Responsibilities
As a member of the software engineering division, you will take an active role in the definition and evolution of standard practices and procedures. Define specifications for significant new projects and specify, design and develop software according to those specifications. You will perform professional software development tasks associated with the developing, designing and debugging of software applications or operating systems. Design and build distributed, scalable, and fault-tolerant software systems. Build cloud services on top of the modern OCI infrastructure. Participate in the entire software lifecycle, from design to development, to quality assurance, and to production. Invest in the best engineering and operational practices upfront to ensure our software quality bar is high. Optimize data processing pipelines for orders of magnitude higher throughput and faster latencies. Leverage a plethora of internal tooling at OCI to develop, build, deploy, and troubleshoot software.
Qualifications
7+ years of experience in the software industry working on design, development and delivery of highly scalable products and services.
Understanding of the entire product development lifecycle, including understanding and refining technical specifications, HLD and LLD of world-class products and services, refining the architecture by providing feedback and suggestions, developing and reviewing code, driving DevOps, and managing releases and operations
Strong knowledge of Java or JVM-based languages
Experience with multi-threading and parallel processing
Strong knowledge of big data technologies like Spark, Hadoop MapReduce, Crunch, etc.
Past experience building scalable, performant, and secure services/modules
Understanding of microservices architecture and API design
Experience with container platforms
Good understanding of testing methodologies
Experience with CI/CD technologies
Experience with observability tools like Splunk, New Relic, etc.
Good understanding of versioning tools like Git/SVN
About Us
As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.
Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
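The multi-threading and parallel-processing experience this listing asks for can be illustrated with a small sketch: fanning one stage of a data pipeline out across worker threads. Plain Python with a hypothetical enrichment step, not Oracle's actual stack; a real OCI service would do the same with I/O-bound calls where threads pay off most:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-record transformation; a real pipeline stage would do I/O or heavier work.
def enrich(record: int) -> int:
    return record * record

records = list(range(8))

# Fan the records out across worker threads; map() preserves the input order in the results.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(enrich, records))

print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

For CPU-bound stages on the JVM (the listing's Java context), the analogous tool would be a fixed-size executor service or parallel streams; the ordering and pooling concerns are the same.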
Posted 2 weeks ago