0 years
0 Lacs
Pune, Maharashtra, India
On-site
Join us as a Senior Software Engineer at Barclays, where you'll take part in the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. As part of the team, you will deliver across the technology stack, using strong analytical and problem-solving skills to understand business requirements and deliver quality solutions. You'll work on complex technical problems that require detailed analysis, in conjunction with fellow engineers, business analysts and business stakeholders.

To be successful as a Senior Software Engineer you should have experience with:
- Strong proficiency in Scala, especially functional programming paradigms.
- Hands-on experience with Apache Spark (RDDs, DataFrames, Datasets).
- Expertise in Spark batch processing.
- Knowledge of Big Data ecosystems: Hadoop, Hive, Impala, Kafka.
- Experience with data serialization formats such as Parquet and Avro.
- Understanding of performance tuning in Spark (e.g., partitioning, caching, shuffling).
- Proficiency in SQL and data modeling.
- Familiarity with CI/CD tools, version control (Git), and containerization (Docker/Kubernetes).
- Familiarity with the AWS toolset is an added advantage.

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Pune.

Purpose of the role: To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, ensuring that all data is accurate, accessible, and secure.

Accountabilities:
- Build and maintain data architecture pipelines that enable the transfer and processing of durable, complete and consistent data.
- Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
- Develop processing and analysis algorithms fit for the intended data complexity and volumes.
- Collaborate with data scientists to build and deploy machine learning models.

Analyst Expectations:
- Perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement.
- In-depth technical knowledge and experience in the assigned area of expertise, with a thorough understanding of the underlying principles and concepts within that area.
- Lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. For an individual contributor, develop technical expertise in the work area, acting as an advisor where appropriate.
- Will have an impact on the work of related teams within the area; partner with other functions and business areas.
- Take responsibility for the end results of a team's operational processing and activities.
- Escalate breaches of policies/procedures appropriately.
- Take responsibility for embedding new policies/procedures adopted due to risk mitigation.
- Advise and influence decision making within your own area of expertise.
- Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to.
- Deliver your work and areas of responsibility in line with relevant rules, regulations and codes of conduct.
- Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services and processes within the function.
- Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function.
- Make evaluative judgements based on the analysis of factual information, paying attention to detail.
- Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents.
- Guide and persuade team members and communicate complex/sensitive information.
- Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
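The tuning topics this posting calls out (partitioning, caching, shuffling) are easiest to see in a small example. The sketch below is illustrative only: the role asks for Scala, but PySpark is used here for consistency with the other sketches in this digest, and the paths, column names, and partition counts are invented.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical sketch of Spark tuning: shuffle width, key-based
# repartitioning, and caching a DataFrame reused downstream.
spark = (SparkSession.builder
         .appName("tuning-sketch")
         .config("spark.sql.shuffle.partitions", "200")  # tune shuffle parallelism
         .getOrCreate())

trades = spark.read.parquet("/data/trades")  # path is a placeholder

# Repartition on the aggregation key so the shuffle colocates related rows.
trades = trades.repartition(64, "account_id")

# Cache because downstream aggregations reuse the same DataFrame.
trades.cache()

daily = (trades.groupBy("account_id", "trade_date")
         .agg(F.sum("notional").alias("total_notional")))
daily.write.mode("overwrite").parquet("/data/daily_notional")
```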
Posted 19 hours ago
4.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
The Applications Development Intermediate Programmer Analyst position at our organization involves participating in the establishment and implementation of new or revised application systems and programs in collaboration with the Technology team. Your primary goal in this role will be to contribute to application systems analysis and programming activities.

You will be responsible for hands-on ETL and Big Data Testing, delivering high-quality solutions; you should be proficient in Database and UI Testing using automation tools, and knowledgeable in Performance, Volume and Stress testing. A strong understanding of SDLC/STLC processes, the different types of manual testing, and Agile methodology will be essential. You will be skilled in designing and executing test cases, authoring user stories, defect tracking, and aligning with business requirements. Being open to learning and implementing new innovations in automation processes according to project needs will be crucial.

Your role will also involve managing complex tasks and teams, fostering a collaborative, growth-oriented environment through strong technical and analytical skills. You will utilize your knowledge of applications development procedures, concepts, and other technical areas to identify and define necessary system enhancements, including using script tools and analyzing/interpreting code. Familiarity with the test management tool JIRA and automation tools such as Python, PySpark, Java, Spark, MySQL, Selenium, and Tosca is required; experience with Hadoop/Ab Initio is a plus. Consulting with users, clients, and other technology groups on issues, recommending programming solutions, and installing and supporting customer exposure systems will also be part of your responsibilities.

Qualifications:
- 4-8 years of relevant experience in the Financial Service industry
- Intermediate level experience in an Applications Development role
- Clear and concise written and verbal communication skills
- Demonstrated problem-solving and decision-making abilities
- Ability to work under pressure, manage deadlines, and adapt to unexpected changes in expectations or requirements

Education:
- Bachelor's degree/University degree or equivalent experience

Please note that this job description provides a high-level overview of the work performed, and additional job-related duties may be assigned as required.
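As a rough sketch of the ETL/Big Data testing this role centres on, the snippet below shows an automated reconciliation check in PySpark. The paths, table names, and key columns are hypothetical, and in practice such assertions would live in a test framework such as pytest.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("etl-test-sketch").getOrCreate()

source = spark.read.parquet("/staging/customers")  # assumed source extract
target = spark.table("dw.customers")               # assumed warehouse table

# Completeness: row counts must match after the load.
assert source.count() == target.count(), "row counts diverge"

# Consistency: no rows should differ on the key columns.
diff = (source.select("customer_id", "email")
        .exceptAll(target.select("customer_id", "email")))
assert diff.count() == 0, "source and target differ on key columns"
```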
Posted 19 hours ago
4.0 - 10.0 years
0 Lacs
Thiruvananthapuram, Kerala
On-site
At EY, you will have the opportunity to shape a career as unique as you are, supported by a global network, an inclusive culture, and cutting-edge technology to help you reach your full potential. Your individual perspective and voice are valued as contributions to the continuous improvement of EY. By joining us, you can create an outstanding experience for yourself while contributing to a more efficient and inclusive working world for all.

As a Data Engineering Lead, you will work closely with the Data Architect to design and implement scalable data lake architecture and data pipelines. Your responsibilities will include designing and implementing scalable data lake architectures using Azure Data Lake services, developing and maintaining data pipelines for data ingestion from various sources, optimizing data storage and retrieval processes for efficiency and performance, ensuring data security and compliance with industry standards, collaborating with data scientists and analysts to enhance data accessibility, monitoring and troubleshooting data pipeline issues to ensure reliability, and documenting data lake designs, processes, and best practices. You should have experience with SQL and NoSQL databases, as well as familiarity with big data file formats such as Parquet and Avro.

**Roles and Responsibilities:**

**Must Have Skills:**
- Azure Data Lake
- Azure Synapse Analytics
- Azure Data Factory
- Azure Databricks
- Python (PySpark, NumPy, etc.)
- SQL
- ETL
- Data warehousing
- Azure DevOps
- Experience in developing streaming pipelines using Azure Event Hub, Azure Stream Analytics, Spark streaming
- Experience in integrating with business intelligence tools such as Power BI

**Good To Have Skills:**
- Big Data technologies (e.g., Hadoop, Spark)
- Data security

**General Skills:**
- Experience with Agile and DevOps methodologies and the software development lifecycle
- Proactive and accountable for deliverables
- Ability to identify and escalate dependencies and risks
- Proficient in working with DevOps tools with limited supervision
- Timely completion of assigned tasks and regular status reporting
- Capability to train new team members
- Desired knowledge of cloud solutions like Azure or AWS with DevOps/Cloud certifications
- Ability to work effectively with multicultural global teams and virtually
- Strong relationship-building skills with project stakeholders

Join EY in its mission to build a better working world by creating long-term value for clients, people, and society, and fostering trust in the capital markets. Leveraging data and technology, diverse EY teams across 150+ countries provide assurance and support clients in growth, transformation, and operations across various sectors. Through its services in assurance, consulting, law, strategy, tax, and transactions, EY teams strive to address complex global challenges by asking insightful questions to discover innovative solutions.
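Because the posting highlights streaming pipelines (Azure Event Hub, Spark streaming), here is a minimal Structured Streaming sketch. It assumes the Event Hubs Kafka-compatible endpoint; the namespace, topic, storage paths, and the omitted authentication settings are all simplifications for illustration.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("stream-ingest-sketch").getOrCreate()

# Read from the Event Hubs namespace's Kafka-compatible endpoint.
# Namespace, topic, and paths are placeholders; the SASL authentication
# options Event Hubs requires are omitted here for brevity.
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers",
                  "myeventhubns.servicebus.windows.net:9093")
          .option("subscribe", "telemetry")
          .load())

# Land the raw payloads in the lake as Parquet, micro-batch by micro-batch.
(events.selectExpr("CAST(value AS STRING) AS payload", "timestamp")
 .writeStream
 .format("parquet")
 .option("path", "abfss://raw@mylake.dfs.core.windows.net/telemetry/")
 .option("checkpointLocation",
         "abfss://raw@mylake.dfs.core.windows.net/_checkpoints/telemetry/")
 .start())
```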
Posted 20 hours ago
8.0 - 12.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Full Stack Data Engineer Lead Analyst at Evernorth, you will be a key player in the Data & Analytics Engineering organization of Cigna, a leading Health Services company. Your role will involve delivering business needs by understanding requirements and deploying software into production. To excel in this position, you should be well-versed in critical technologies, eager to learn, and committed to adding value to the business. Ownership, a thirst for knowledge, and an open mindset are essential attributes for a successful Full Stack Engineer.

In addition to delivery responsibilities, you will be expected to embrace an automation-first and continuous improvement mindset. You will drive the adoption of CI/CD tools and support the enhancement of toolsets and processes. Your ability to articulate clear business objectives aligned with technical specifications, and to work in an iterative, agile manner, will be crucial. Taking ownership and being accountable, writing referenceable and modular code, and ensuring data quality are key behaviors expected from you.

Key Characteristics:
- Independently design and architect solutions
- Demonstrate ownership and accountability
- Write referenceable and modular code
- Possess fluency in specific areas and proficiency in multiple areas
- Exhibit a passion for continuous learning
- Maintain a quality mindset to ensure data quality and business impact assessment

Required Skills:
- Experience in developing data integration and ingestion strategies, including the Snowflake cloud data warehouse, AWS S3 buckets, and loading nested JSON-formatted data
- Strong understanding of Snowflake cloud database architecture
- Proficiency in big data technologies like Databricks, Hadoop, HiveQL, Spark (Scala/Python) and cloud technologies such as AWS (S3, Glue, Terraform, Lambda, Aurora, Redshift, EMR)
- Experience in working on analytical models and enabling their deployment and production via data and analytical pipelines
- Expertise in query tuning and performance improvements
- Previous exposure to an onsite/offshore setup or model

Required Experience & Education:
- 8+ years of professional industry experience
- Bachelor's degree (or equivalent)
- 5+ years of Python scripting experience
- 5+ years of Data Management and SQL expertise in Teradata & Snowflake
- 3+ years of Agile team experience, preferably with Scrum

Desired Experience:
- Familiarity with version management tools, with Git being preferred
- Exposure to BDD and TDD development methodologies
- Experience in an agile CI/CD environment; Jenkins experience is preferred
- Knowledge of health care information domains is advantageous

Location & Hours of Work:
- (Specify whether the position is remote, hybrid, in-office, and where the role is located as well as the required hours of work)

Evernorth is committed to being an Equal Opportunity Employer, actively promoting and supporting diversity, equity, and inclusion efforts throughout the organization. Staff are encouraged to participate in these initiatives to enhance internal practices and external collaborations with diverse client populations.
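One concrete requirement here is loading nested JSON from S3; the sketch below shows a typical PySpark flattening step ahead of a warehouse load. The bucket, field names, and output layout are invented for illustration.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("json-flatten-sketch").getOrCreate()

# Hypothetical nested claim records landed in S3.
claims = spark.read.json("s3a://example-bucket/claims/")

flat = (claims
        .withColumn("line", F.explode("claim_lines"))          # unnest array
        .select("claim_id",
                F.col("member.id").alias("member_id"),          # struct field
                F.col("line.procedure_code").alias("procedure_code"),
                F.col("line.amount").alias("amount")))

# Stage the flattened output; from here a Snowflake COPY INTO (or the
# Spark-Snowflake connector) can load it into the warehouse.
flat.write.mode("overwrite").parquet("s3a://example-bucket/claims_flat/")
```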
Posted 20 hours ago
5.0 - 9.0 years
0 Lacs
Maharashtra
On-site
Are you looking for an exciting opportunity to solve large-scale business problems using Generative AI? Join our dynamic team to tackle challenges using Generative AI in the Wholesale Credit Risk Quantitative Research Applied AI/ML team. You will play a crucial role in developing innovative AI solutions that leverage the firm's extensive data resources. Your primary focus will be on creating tools based on Large Language Models (LLMs) to enhance the end-to-end credit risk process across Wholesale. This role presents an exciting opportunity to innovate and have a significant impact on credit risk management. If you are passionate about AI and enthusiastic about working on cutting-edge solutions, we encourage you to apply.

Your responsibilities will include:
- Developing and implementing AI solutions to address business challenges.
- Collaborating with cross-functional teams to translate requirements into technical solutions.
- Formulating risk strategies to enhance risk monitoring using diverse data sources.
- Managing the full lifecycle from Proof of Concept to production-ready solutions, including stakeholder presentations and post-implementation monitoring.
- Ensuring the performance and reliability of deployed solutions.
- Staying informed on the latest AI/ML advancements.
- Leading the development and rapid deployment of AI solutions influenced by macro-economic factors and current events.

Qualifications, Skills, and Experience required:
- Advanced degree in Data Science, Computer Science, Engineering, Mathematics, or Statistics.
- Minimum of 5 years of experience in applied AI/ML.
- Strong understanding and practical experience with Machine Learning; expertise in LLM/NLP is highly preferred.
- Proficiency in modern analytic and data tools, especially Python/Anaconda, TensorFlow, Keras/PyTorch, Spark, and SQL. Cloud experience is a plus.
- Experience in model implementation and production deployment is preferred.

If you possess excellent problem-solving, communication, and leadership skills, and are eager to contribute to the advancement of AI solutions in credit risk management, we look forward to reviewing your application.
Posted 20 hours ago
2.0 - 6.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a member of the Infosys consulting team, you will play a crucial role in addressing customer issues, identifying problem areas, devising creative solutions, and overseeing their implementation to ensure customer satisfaction. Your responsibilities will include developing proposals, contributing to solution design, configuring products, conducting pilot sessions, and resolving queries related to requirements and solution design. You will be involved in conducting solution demonstrations and workshops, and preparing effort estimates aligned with customer budgetary constraints and organizational financial guidelines. Leading small projects and participating in unit-level and organizational initiatives will be part of your role, delivering high-quality and valuable solutions to clients as they embark on their digital transformation journey.

Your skill set should include the ability to devise innovative strategies that drive client innovation, growth, and profitability, along with a good understanding of software configuration management systems and awareness of the latest technologies and industry trends. Logical thinking, problem-solving abilities, collaboration skills, and a grasp of financial processes and pricing models for projects are essential for success in this role. Additionally, you should have expertise in assessing current processes, identifying areas for improvement, and proposing technology solutions. Possessing knowledge in one or more industry domains, client interfacing skills, and experience in project and team management will help you excel in this position at Infosys.
Posted 20 hours ago
3.0 - 7.0 years
0 Lacs
Navi Mumbai, Maharashtra
On-site
As a ClickHouse Database Specialist, you will be responsible for helping build production-grade systems based on ClickHouse. This includes advising on how to design schemas, plan clusters, and work on infrastructure projects related to ClickHouse. You will work in diverse environments ranging from single-node setups to clusters with hundreds of nodes. Additionally, you will be involved in improving ClickHouse itself by fixing bugs, enhancing documentation, creating test cases, and studying new usage patterns, functions, and integrations with other products.

Your role will also entail the installation, configuration, backup, recovery, and maintenance of multi-node ClickHouse clusters. Monitoring and optimizing database performance to ensure high availability and responsiveness will be a key aspect of your responsibilities. Troubleshooting database issues, identifying and resolving performance bottlenecks, designing and implementing backup and recovery strategies, and developing database security policies and procedures will be part of your daily tasks.

Collaborating with development teams to optimize database schema design and queries, providing technical guidance and support to development and operations teams, and handling support calls from customers using ClickHouse will be crucial components of this role. Furthermore, experience with big data stack components such as Hadoop, Spark, Kafka, and NiFi, as well as with data science and data analysis, will be beneficial. Knowledge of SRE/DevOps stacks, monitoring and system management tools such as Prometheus, Ansible, and ELK, and version control using Git are also desired skills for this position.

In summary, as a ClickHouse Database Specialist, you will play a vital role in ensuring the efficient operation and optimization of ClickHouse databases, contributing to the overall success of production-grade systems and infrastructure projects.
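To make the schema-design side of the role concrete, here is a small sketch using the Python clickhouse-driver package. The table, columns, partitioning scheme, and ordering key are invented; a real MergeTree design would be driven by the dominant query patterns.

```python
from datetime import date
from clickhouse_driver import Client  # pip install clickhouse-driver

client = Client(host="localhost")  # connection details are placeholders

# A typical MergeTree table: partitioned by month, ordered by the keys
# the heaviest queries filter on.
client.execute("""
    CREATE TABLE IF NOT EXISTS events (
        event_date Date,
        user_id    UInt64,
        action     String
    )
    ENGINE = MergeTree()
    PARTITION BY toYYYYMM(event_date)
    ORDER BY (user_id, event_date)
""")

client.execute(
    "INSERT INTO events (event_date, user_id, action) VALUES",
    [(date(2024, 1, 15), 42, "login")],
)
```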
Posted 21 hours ago
2.0 - 6.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
You will be responsible for designing, developing, and maintaining scalable data pipelines using Azure Databricks. Your role will involve building and optimizing ETL/ELT processes for structured and unstructured data, collaborating with data scientists, analysts, and business stakeholders, integrating Databricks with Azure Data Lake, Synapse, Data Factory, and Blob Storage, developing real-time data streaming pipelines, and managing data models and data warehouses. Additionally, you will optimize performance, manage resources, ensure cost efficiency, implement best practices for data governance, security, and quality, troubleshoot and improve existing data workflows, contribute to architecture and technology strategy, mentor junior team members, and maintain documentation.

To excel in this role, you should have a Bachelor's/Master's degree in Computer Science, IT, or a related field, along with 5+ years of Data Engineering experience (minimum 2+ years with Databricks). Strong expertise in Azure cloud services (Data Lake, Synapse, Data Factory), proficiency in Spark (PySpark/Scala) and big data processing, experience with Delta Lake, Structured Streaming, and real-time pipelines, strong SQL skills, an understanding of data modeling and warehousing, familiarity with DevOps tools such as CI/CD, Git, Terraform, and Azure DevOps, and excellent problem-solving and communication skills are essential.

Preferred qualifications include Databricks certification (Associate/Professional), experience with machine learning workflows on Databricks, knowledge of data governance tools like Purview, experience with REST APIs, Kafka, and Event Hubs, and cloud performance tuning and cost optimization experience.

Join us to be a part of a supportive and collaborative team, work with a growing company in the exciting BI and Data industry, enjoy a competitive salary and performance-based bonuses, and have opportunities for professional growth and development. If you are interested in this opportunity, please send your resume to hr@exillar.com and fill out the form at https://forms.office.com/r/HdzMNTaagw.
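Since the posting highlights Delta Lake and Structured Streaming on Azure Databricks, here is a minimal sketch of a streaming bronze-layer ingest. The storage paths and table name are placeholders, and the `spark` session is assumed to be the one the Databricks runtime provides.

```python
from pyspark.sql import functions as F

# Incrementally ingest landed JSON files with Databricks Auto Loader
# and append them to a Delta table. All paths below are hypothetical.
raw = (spark.readStream
       .format("cloudFiles")
       .option("cloudFiles.format", "json")
       .option("cloudFiles.schemaLocation",
               "abfss://landing@mylake.dfs.core.windows.net/_schemas/orders/")
       .load("abfss://landing@mylake.dfs.core.windows.net/orders/"))

(raw.withColumn("_ingested_at", F.current_timestamp())
 .writeStream
 .format("delta")
 .option("checkpointLocation",
         "abfss://landing@mylake.dfs.core.windows.net/_checkpoints/orders_bronze/")
 .outputMode("append")
 .toTable("bronze.orders"))
```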
Posted 21 hours ago
10.0 - 15.0 years
0 Lacs
Delhi
On-site
As a seasoned data engineering professional with 10+ years of experience, you will lead and mentor a team of data engineers to ensure high performance and career growth. Your primary responsibility will be to architect and optimize scalable data infrastructure, guaranteeing high availability and reliability. Additionally, you will drive the development and implementation of data governance frameworks and best practices, collaborating closely with cross-functional teams to define and execute a data roadmap.

Your expertise in backend development using languages like Java, PHP, Python, Node.js, GoLang, JavaScript, HTML, and CSS will be crucial. Proficiency in SQL, Python, and Scala for data processing and analytics is a must. In-depth knowledge of cloud platforms such as AWS, GCP, or Azure is required, along with hands-on experience in big data technologies like Spark, Hadoop, Kafka, and distributed computing frameworks. You will be responsible for ensuring data security, compliance, and quality across all data platforms while optimizing data processing workflows for performance and cost efficiency. A strong foundation in High-Level Design (HLD) and Low-Level Design (LLD), as well as design patterns, preferably using Spring Boot or Google Guice, is necessary. Experience with data warehousing solutions like Snowflake, Redshift, or BigQuery will be beneficial.

Your role will also involve working with NoSQL databases such as Redis, Cassandra, MongoDB, and TiDB, as well as automation and DevOps tools like Jenkins, Ansible, Docker, Kubernetes, Chef, Grafana, and ELK. A proven ability to drive technical strategy aligned with business objectives, along with strong leadership, communication, and stakeholder management skills, is essential for this position.

Candidates from Tier 1 colleges/universities with a background in product startups and experience implementing data engineering systems from an early stage in a company are preferred. Additionally, experience in machine learning infrastructure or MLOps, exposure to real-time data processing and analytics, and interest in data structures, algorithm analysis and design, multicore programming, and scalable architecture will be advantageous. Prior experience in a SaaS or high-growth tech company will be a plus. If you are a highly skilled data engineer with a passion for innovation and technical excellence, we invite you to apply for this challenging and rewarding opportunity.
Posted 21 hours ago
5.0 years
0 Lacs
Haryana, India
On-site
Job Description

About TaskUs: TaskUs is a provider of outsourced digital services and next-generation customer experience to fast-growing technology companies, helping its clients represent, protect and grow their brands. Leveraging a cloud-based infrastructure, TaskUs serves clients in the fastest-growing sectors, including social media, e-commerce, gaming, streaming media, food delivery, ride-sharing, HiTech, FinTech, and HealthTech. The People First culture at TaskUs has enabled the company to expand its workforce to approximately 45,000 employees globally. Presently, we have a presence in twenty-three locations across twelve countries, which include the Philippines, India, and the United States. It started with one ridiculously good idea to create a different breed of Business Processing Outsourcing (BPO)! We at TaskUs understand that achieving growth for our partners requires a culture of constant motion, exploring new technologies, being ready to handle any challenge at a moment's notice, and mastering consistency in an ever-changing world.

What We Offer: At TaskUs, we prioritize our employees' well-being by offering competitive industry salaries and comprehensive benefits packages. Our commitment to a People First culture is reflected in the various departments we have established, including Total Rewards, Wellness, HR, and Diversity. We take pride in our inclusive environment and positive impact on the community. Moreover, we actively encourage internal mobility and professional growth at all stages of an employee's career within TaskUs. Join our team today and experience firsthand our dedication to supporting People First.

Job Description Summary: We are seeking a Data Scientist with deep expertise in modern AI/ML technologies to join our innovative team. This role combines cutting-edge research in machine learning, deep learning, and generative AI with practical full-stack cloud development skills. You will be responsible for architecting and implementing end-to-end AI solutions, from data engineering pipelines to production-ready applications leveraging the latest in agentic AI and large language models.

Key Responsibilities

AI/ML Development & Research
- Design, develop, and deploy advanced machine learning and deep learning models for complex business problems
- Implement and optimize Large Language Models (LLMs) and Generative AI solutions
- Build agentic AI systems with autonomous decision-making capabilities
- Conduct research on emerging AI technologies and their practical applications
- Perform model evaluation, validation, and continuous improvement

Cloud Infrastructure & Full-Stack Development
- Architect and implement scalable cloud-native ML/AI solutions on AWS, Azure, or GCP
- Develop full-stack applications integrating AI models with modern web technologies
- Build and maintain ML pipelines using cloud services (SageMaker, ML Engine, etc.)
- Implement CI/CD pipelines for ML model deployment and monitoring
- Design and optimize cloud infrastructure for high-performance computing workloads

Data Engineering & Database Management
- Design and implement data pipelines for large-scale data processing
- Work with both SQL and NoSQL databases (PostgreSQL, MongoDB, Cassandra, etc.)
- Optimize database performance for ML workloads and real-time applications
- Implement data governance and quality assurance frameworks
- Handle streaming data processing and real-time analytics

Leadership & Collaboration
- Mentor junior data scientists and guide technical decision-making
- Collaborate with cross-functional teams including product, engineering, and business stakeholders
- Present findings and recommendations to technical and non-technical audiences
- Lead proof-of-concept projects and innovation initiatives

Required Qualifications

Education & Experience
- Master's or PhD in Computer Science, Data Science, Statistics, Mathematics, or related field
- 5+ years of hands-on experience in data science and machine learning
- 3+ years of experience with deep learning frameworks and neural networks
- 2+ years of experience with cloud platforms and full-stack development

Technical Skills - Core AI/ML
- Machine Learning: Scikit-learn, XGBoost, LightGBM, advanced ML algorithms
- Deep Learning: TensorFlow, PyTorch, Keras, CNN, RNN, LSTM, Transformers
- Large Language Models: GPT, BERT, T5, fine-tuning, prompt engineering
- Generative AI: Stable Diffusion, DALL-E, text-to-image, text generation
- Agentic AI: Multi-agent systems, reinforcement learning, autonomous agents

Technical Skills - Development & Infrastructure
- Programming: Python (expert), R, Java/Scala, JavaScript/TypeScript
- Cloud Platforms: AWS (SageMaker, EC2, S3, Lambda), Azure ML, or Google Cloud AI
- Databases: SQL (PostgreSQL, MySQL), NoSQL (MongoDB, Cassandra, DynamoDB)
- Full-Stack Development: React/Vue.js, Node.js, FastAPI, Flask, Docker, Kubernetes
- MLOps: MLflow, Kubeflow, model versioning, A/B testing frameworks
- Big Data: Spark, Hadoop, Kafka, streaming data processing

Preferred Qualifications
- Experience with vector databases and embeddings (Pinecone, Weaviate, Chroma)
- Knowledge of LangChain, LlamaIndex, or similar LLM frameworks
- Experience with model compression and edge deployment
- Familiarity with distributed computing and parallel processing
- Experience with computer vision and NLP applications
- Knowledge of federated learning and privacy-preserving ML
- Experience with quantum machine learning
- Expertise in MLOps and production ML system design

Key Competencies

Technical Excellence
- Strong mathematical foundation in statistics, linear algebra, and optimization
- Ability to implement algorithms from research papers
- Experience with model interpretability and explainable AI
- Knowledge of ethical AI and bias detection/mitigation

Problem-Solving & Innovation
- Strong analytical and critical thinking skills
- Ability to translate business requirements into technical solutions
- Creative approach to solving complex, ambiguous problems
- Experience with rapid prototyping and experimentation

Communication & Leadership
- Excellent written and verbal communication skills
- Ability to explain complex technical concepts to diverse audiences
- Strong project management and organizational skills
- Experience mentoring and leading technical teams

How We Partner To Protect You: TaskUs will neither solicit money from you during your application process nor require any form of payment in order to proceed with your application. Kindly ensure that you are always in communication with only authorized recruiters of TaskUs.

DEI: At TaskUs we believe that innovation and higher performance are brought by people from all walks of life. We welcome applicants of different backgrounds, demographics, and circumstances. Inclusive and equitable practices are our responsibility as a business.
TaskUs is committed to providing equal access to opportunities. If you need reasonable accommodations in any part of the hiring process, please let us know. We invite you to explore all TaskUs career opportunities and apply through the provided URL https://www.taskus.com/careers/ . TaskUs is proud to be an equal opportunity workplace and is an affirmative action employer. We celebrate and support diversity; we are committed to creating an inclusive environment for all employees. TaskUs' People First culture thrives on it for the benefit of our employees, our clients, our services, and our community.

Req Id: R_2507_10290_0
Posted At: Thu Jul 31 2025 00:00:00 GMT+0000 (Coordinated Universal Time)
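The TaskUs posting lists MLflow and model versioning under MLOps; the sketch below shows the basic tracking pattern. The experiment name, synthetic data, and model choice are illustrative only.

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

mlflow.set_experiment("churn-poc")  # hypothetical experiment name

X, y = make_classification(n_samples=1_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

with mlflow.start_run():
    model = GradientBoostingClassifier().fit(X_train, y_train)
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    mlflow.log_param("model_type", "GradientBoostingClassifier")
    mlflow.log_metric("auc", auc)
    # The logged model becomes a versioned artifact deployment can pull.
    mlflow.sklearn.log_model(model, "model")
```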
Posted 21 hours ago
2.0 - 6.0 years
0 Lacs
Hyderabad, Telangana
On-site
About Marriott: Marriott Tech Accelerator is part of Marriott International, a global leader in hospitality. Marriott International, Inc. is a leading American multinational company that operates a vast array of lodging brands, including hotels and residential properties. It consists of over 30 well-known brands and nearly 8,900 properties situated in 141 countries and territories.

Role Title: Security Data Scientist

Position Summary: Marriott International's Global Information Security is seeking an experienced Security Data Scientist who can combine expertise in cybersecurity with data science skills to analyze and protect Marriott's digital assets.

Job Responsibilities:
- Perform data cleaning, analysis, and modeling tasks.
- Work under the guidance of senior team members to: analyze large datasets related to cybersecurity threats and incidents; implement existing machine learning models and algorithms to detect anomalies and potential security breaches; support SDL tools (e.g., big data, ML/AI technologies); create data visualizations and reports to communicate insights to stakeholders; collaborate with cybersecurity teams to implement data-driven security solutions; stay up to date with the latest cyber threats and data science techniques; and help maintain and document SDL MLOps processes and procedures.

Skills and Experience:
- 2-4 years of data science, data analytics, data management, and/or information security experience, including 2+ years of experience in data science/data analytics in an enterprise environment and 1+ years of experience in information protection/information security.
- Strong background in statistics, mathematics, and software engineering (e.g., proficiency in Python, R).
- Experience with machine learning algorithms and frameworks as well as AI techniques.
- Knowledge of cybersecurity principles, tools, and best practices.
- Familiarity with cloud platforms (AWS, Azure, GCP) and big data technologies.
- Understanding of data visualization tools like Power BI.

Preferred:
- Programming languages: Python, R, SQL.
- Machine learning frameworks: TensorFlow, PyTorch, scikit-learn.
- Big data technologies: Hadoop, Spark, and Kafka.
- Cloud platforms: AWS, Azure, GCP.
- Data visualization tools: Tableau, Power BI.
- Relevant certifications such as data science certifications, CISSP, CEH.
- Verbal and written communication skills.

Education and Certifications: Bachelor's degree in computer/data science, information management, cybersecurity, or a related field, or equivalent experience/certification.

Work location: Hyderabad, India. Work mode: Hybrid.
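The anomaly-detection responsibility in this posting can be illustrated in a few lines of scikit-learn; the CSV file and feature columns below are invented for the example.

```python
import pandas as pd
from sklearn.ensemble import IsolationForest

logins = pd.read_csv("auth_events.csv")  # hypothetical: one row per login
features = logins[["failed_attempts", "bytes_transferred", "session_seconds"]]

# IsolationForest scores outliers without labeled breach data;
# contamination is the assumed share of anomalous events.
detector = IsolationForest(contamination=0.01, random_state=0)
logins["anomaly"] = detector.fit_predict(features)  # -1 marks outliers

suspicious = logins[logins["anomaly"] == -1]
print(f"{len(suspicious)} events flagged for analyst review")
```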
Posted 21 hours ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
The role of warehousing and logistics systems is becoming increasingly crucial in enhancing the competitiveness of various companies and contributing to the overall efficiency of the global economy. Modern intra-logistics solutions integrate cutting-edge mechatronics, sophisticated software, advanced robotics, computational perception, and AI algorithms to ensure high throughput and streamlined processing for critical commercial logistics functions.

Our Warehouse Execution Software is designed to optimize intralogistics and warehouse automation by utilizing advanced optimization techniques. By synchronizing discrete logistics processes, we have created a real-time decision engine that maximizes labor and equipment efficiency. Our software empowers customers with the operational agility essential for meeting the demands of an omni-channel environment.

We are seeking a dynamic individual who can develop state-of-the-art MLOps and DevOps frameworks for AI model deployment. The ideal candidate should possess expertise in cloud technologies, deployment architectures, and software production standards. Moreover, effective collaboration within interdisciplinary teams is key to successfully guiding products through the development cycle.

**Core Job Responsibilities:**
- Develop comprehensive pipelines covering the ML lifecycle from data ingestion to model evaluation.
- Collaborate with AI scientists to expedite the operationalization of ML algorithms.
- Establish CI/CD/CT pipelines for ML algorithms.
- Implement model deployment both in cloud and on-premises edge environments.
- Lead a team of DevOps/MLOps engineers.
- Stay updated on new tools, technologies, and industry best practices.

**Key Qualifications:**
- Master's degree in Computer Science, Software Engineering, or a related field.
- Proficiency in cloud platforms, particularly GCP, and relevant skills such as Docker, Kubernetes, and edge computing.
- Familiarity with task orchestration tools such as MLflow, Kubeflow, Airflow, Vertex AI, and Azure ML.
- Strong programming skills, preferably in Python.
- Robust DevOps expertise including Linux/Unix, testing, automation, Git, and build tools.
- Knowledge of data engineering tools like Beam, Spark, Pandas, SQL, and GCP Dataflow is advantageous.
- Minimum 5 years of experience in relevant fields, including academic exposure.
- At least 3 years of experience managing a DevOps/MLOps team.
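As a sketch of the ML-lifecycle pipelines this role builds, here is a minimal Airflow DAG, one of the orchestrators the posting names. The DAG id, schedule, and stubbed task bodies are illustrative, not a prescribed design.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    ...  # stub: pull training data from the lake/feature store

def train():
    ...  # stub: fit and serialize the model

def evaluate():
    ...  # stub: gate promotion on validation metrics

with DAG(
    dag_id="ml_pipeline_sketch",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_ingest = PythonOperator(task_id="ingest", python_callable=ingest)
    t_train = PythonOperator(task_id="train", python_callable=train)
    t_eval = PythonOperator(task_id="evaluate", python_callable=evaluate)
    t_ingest >> t_train >> t_eval
```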
Posted 22 hours ago
3.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Data Architect, you will be responsible for leading, analyzing, designing, and delivering analytics solutions and applications, including statistical data models, reports, and dashboards, in cloud environments such as AWS, Azure, and GCP, as well as the corresponding cloud-based EDW database platforms like Snowflake, Redshift, and BigQuery. You must have a minimum of 8 years of experience, with at least 3 years in the role of a data architect for Data Warehouse and Analytics solutions.

Your role will involve leveraging your 3+ years of experience with cloud platforms (AWS, Azure, GCP) and a strong understanding of the ingestion and consumption processes in Data Lakes. You should also have 3+ years of experience in cloud-based EDW platforms such as Snowflake, Redshift, BigQuery, or Synapse, and be adept at building and launching new data models that provide intuitive analytics for analysts and customers.

In this position, you will be expected to work with and analyze large datasets within the relevant domains of enterprise data, as well as demonstrate strong experience in Data Warehouse ETL design and development, methodologies, tools, processes, and best practices. Proficiency in writing complex SQL, PL/SQL, and UNIX scripts, along with an understanding of performance tuning and troubleshooting, is also crucial for this role.

Furthermore, you should possess good communication and presentation skills, with a proven track record of using insights to influence executives and colleagues. Additionally, awareness of or expertise in data security, data access controls, DevOps tools, and development frameworks like SCRUM/Agile will be beneficial.

Your responsibilities will also include recommending solutions to improve cloud and existing data warehouse solutions, as well as showcasing the new capabilities of advanced analytics to business and technology teams to demonstrate the potential of the data platform. Overall, your leadership abilities will be essential in driving cross-functional development on new solutions from design through delivery.
Posted 22 hours ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
This is a data engineer position: a programmer responsible for the design, development, implementation, and maintenance of data flow channels and data processing systems that support the collection, storage, batch and real-time processing, and analysis of information in a scalable, repeatable, and secure manner, in coordination with the Data & Analytics team. The overall objective is defining optimal solutions to data collection, processing, and warehousing. The role requires Spark Java development expertise in big data processing, along with Python and Apache Spark, particularly within the banking and finance domain. He/she designs, codes, and tests data systems and works on implementing them into the internal infrastructure.

Responsibilities:
- Ensure high-quality software development, with complete documentation and traceability
- Develop and optimize scalable Spark Java-based data pipelines for processing and analyzing large-scale financial data
- Design and implement distributed computing solutions for risk modeling, pricing, and regulatory compliance
- Ensure efficient data storage and retrieval using Big Data
- Implement best practices for Spark performance tuning, including partitioning, caching, and memory management
- Maintain high code quality through testing, CI/CD pipelines, and version control (Git, Jenkins)
- Work on batch processing frameworks for market risk analytics
- Promote unit/functional testing and code inspection processes
- Work with business stakeholders and Business Analysts to understand the requirements
- Work with other data scientists to understand and interpret complex datasets

Qualifications:
- 5-8 years of experience working in data ecosystems
- 4-5 years of hands-on experience in Hadoop, Scala, Java, Spark, Hive, Kafka, Impala, Unix scripting, and other Big Data frameworks
- 3+ years of experience with relational SQL and NoSQL databases: Oracle, MongoDB, HBase
- Strong proficiency in Python and Spark Java with knowledge of core Spark concepts (RDDs, DataFrames, Spark Streaming, etc.), plus Scala and SQL
- Data integration, migration, and large-scale ETL experience (common ETL platforms such as PySpark/DataStage/Ab Initio, etc.): ETL design and build, handling, reconciliation, and normalization
- Data modeling experience (OLAP, OLTP, logical/physical modeling, normalization, knowledge of performance tuning)
- Experience working with large and multiple datasets and data warehouses
- Experience building and optimizing ‘big data’ data pipelines, architectures, and datasets
- Strong analytic skills and experience working with unstructured datasets
- Ability to effectively use complex analytical, interpretive, and problem-solving techniques
- Experience with Confluent Kafka, Red Hat jBPM, and CI/CD build pipelines and toolchain (Git, Bitbucket, Jira)
- Experience with external cloud platforms such as OpenShift, AWS, and GCP
- Experience with container technologies (Docker, Pivotal Cloud Foundry) and supporting frameworks (Kubernetes, OpenShift, Mesos)
- Experience integrating search solutions with middleware and distributed messaging (Kafka)
- Highly effective interpersonal and communication skills with technical and non-technical stakeholders
- Experience with the software development life cycle and good problem-solving skills
- Excellent problem-solving skills and a strong mathematical and analytical mindset
- Ability to work in a fast-paced financial environment

Education: Bachelor’s/University degree or equivalent experience in computer science, engineering, or a similar domain

------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Data Architecture
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills: Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law.

If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi.

View Citi’s EEO Policy Statement and the Know Your Rights poster.
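To ground the batch market-risk processing this posting describes, here is a compact Spark sketch (the role itself centres on Spark with Java; Python is used here for consistency with the other examples in this digest). The input path, columns, and partitioned output layout are assumptions.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("risk-batch-sketch").getOrCreate()

positions = spark.read.parquet("/data/positions")  # hypothetical input

# Aggregate gross exposure per desk per business date.
exposure = (positions
            .groupBy("desk", "as_of_date")
            .agg(F.sum("market_value").alias("gross_exposure")))

# Partition output by date so downstream readers prune files efficiently.
(exposure.write
 .mode("overwrite")
 .partitionBy("as_of_date")
 .parquet("/data/exposure"))
```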
Posted 22 hours ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: PySpark
Good-to-have skills: Python (Programming Language), Apache Spark, Google BigQuery
Minimum 3 years of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to ensure the applications function as intended, while continuously seeking ways to enhance application efficiency and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in PySpark.
- Good-to-Have Skills: Experience with Apache Spark, Python (Programming Language), Google BigQuery.
- Strong understanding of data processing frameworks and their applications.
- Experience in developing scalable applications using distributed computing.
- Familiarity with cloud platforms and their integration with application development.

Additional Information:
- The candidate should have a minimum of 3 years of experience in PySpark.
- This position is based at our Chennai office.
- 15 years of full-time education is required.
Posted 22 hours ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: PySpark
Good-to-have skills: Apache Spark
Minimum 3 years of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to ensure the applications function as intended, while continuously seeking ways to enhance application efficiency and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Engage in code reviews to ensure adherence to best practices and standards.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in PySpark.
- Good-to-Have Skills: Experience with Apache Spark.
- Strong understanding of data processing and transformation techniques.
- Familiarity with application development frameworks and methodologies.
- Experience in debugging and troubleshooting application issues.

Additional Information:
- The candidate should have a minimum of 3 years of experience in PySpark.
- This position is based at our Pune office.
- 15 years of full-time education is required.
Posted 22 hours ago
3.0 years
0 Lacs
Greater Kolkata Area
On-site
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Apache Spark
Good-to-have skills: Java, Scala, PySpark
Minimum 3 years of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the data architecture. You will be involved in analyzing requirements, proposing solutions, and ensuring that the data platform aligns with organizational goals and standards. Your role will require you to stay updated with industry trends and best practices to contribute effectively to the team.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Engage in continuous learning to stay abreast of emerging technologies and methodologies.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Apache Spark.
- Good-to-Have Skills: Experience with Java, Scala, PySpark.
- Strong understanding of data processing frameworks and distributed computing.
- Experience with data integration tools and techniques.
- Familiarity with cloud platforms and services related to data engineering.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Apache Spark.
- This position is based at our Kolkata office.
- 15 years of full-time education is required.
Posted 23 hours ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We are looking for a sales professional for our client, which operates in the EV battery segment.

Responsibilities:
- Lead Generation & Prospecting: Identify and qualify leads through cold outreach, networking, and market intelligence. Be the spark that starts our customer journey.
- Relationship Building: Develop deep connections with decision-makers in fleet, logistics, and mobility companies. Understand their pain points and become their trusted vehicle and energy advisor.
- Consultative Selling: Pitch our cutting-edge energy stack with clarity and impact. Tailor solutions that fit customer needs and demonstrate ROI.
- Deal Closure: Negotiate commercial terms, handle objections, and close high-value deals, all while delivering value at every step.
- Customer Success & Growth: Nurture existing accounts, unlock upsell opportunities, and build long-term relationships.
- Market Insights: Stay ahead of trends in EVs, energy, and logistics; be the go-to person for competitive intelligence.
- Reporting & Analysis: Use data to improve, from pipeline health to deal velocity, and share insights that drive better outcomes.

Qualifications:
- Bachelor's degree in a relevant field, such as engineering or business; an MBA is a plus.
- At least 2 years of proven experience in B2B sales, ideally within the commercial vehicles sector, last-mile logistics, or the energy sector.
- Excellent communication, presentation, and interpersonal skills.
- Strong analytical and problem-solving abilities.
- Self-motivated and results-oriented with a strong work ethic.
- Understanding of logistics and electric vehicles (advantageous).
- Proficiency in Excel and PowerPoint.

What matters:
- Quality of work
- Approach towards problem-solving
- Dissatisfaction with mediocre work
- A resilient attitude to bounce back after failing
Posted 23 hours ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description

At Boeing, we innovate and collaborate to make the world a better place. We’re committed to fostering an environment for every teammate that’s welcoming, respectful and inclusive, with great opportunity for professional growth. Find your future with us.

Overview: As a leading global aerospace company, Boeing develops, manufactures and services commercial airplanes, defense products and space systems for customers in more than 150 countries. As a top U.S. exporter, the company leverages the talents of a global supplier base to advance economic opportunity, sustainability and community impact. Boeing’s team is committed to innovating for the future, leading with sustainability, and cultivating a culture based on the company’s core values of safety, quality and integrity.

Technology for today and tomorrow: The Boeing India Engineering & Technology Center (BIETC) is a 5500+ engineering workforce that contributes to global aerospace growth. Our engineers deliver cutting-edge R&D, innovation, and high-quality engineering work in global markets, and leverage new-age technologies such as AI/ML, IIoT, Cloud, Model-Based Engineering, and Additive Manufacturing, shaping the future of aerospace.

People-driven culture: At Boeing, we believe creativity and innovation thrive when every employee is trusted, empowered, and has the flexibility to choose, grow, learn, and explore. We offer variable arrangements depending upon business and customer needs, and professional pursuits that offer greater flexibility in the way our people work. We also believe that collaboration, frequent team engagements, and face-to-face meetings bring together different perspectives and thoughts – enabling every voice to be heard and every perspective to be respected. No matter where or how our teammates work, we are committed to positively shaping people’s careers and being thoughtful about employee wellbeing. With us, you can create and contribute to what matters most in your career, community, country, and world. Join us in powering the progress of global aerospace.

The Boeing India IT Product Systems team is currently looking for an Associate Software Developer - Java Full Stack to join the team in Bangalore, India. This role will be based out of Bangalore, India.

Position Responsibilities:
- Understands and develops software solutions to meet end user requirements.
- Ensures that the application integrates with the overall system architecture, utilizing standard IT lifecycle methodologies and tools.
- Develops algorithms, data and process models, plans interfaces and writes interface control documents for use in construction of solutions of moderate complexity.

Employer will not sponsor applicants for employment visa status.

Basic Qualifications (Required Skills/Experience):
- 2+ years of relevant experience in the IT industry
- Experience in designing and implementing idiomatic RESTful APIs using the Spring framework (v6.0+) with Spring Boot (v3.0+) and Spring Security (v6.0+) in Java (v17+). Experience with additional languages (Scala/Kotlin/others) preferred.
- Working experience with RDBMSs, basic SQL scripting and querying, specifically with SQL Server (2018+) and Teradata (v17+). Additional knowledge of schema/modelling/querying optimization preferred.
- Experience with TypeScript (v5+), JavaScript (ES6+), Angular (v15+), Material UI, amCharts (v5+)
- Experience working with ALM tools (Git, Gradle, SonarQube, Coverity, Docker, Kubernetes) driven by tests (JUnit, Mockito, Hamcrest, etc.)
- Experience in shell scripting (Bash/sh), CI/CD processes and tools (GitLab CI/similar), and OCI containers (Docker/Podman/Buildah, etc.)
- Data analysis and engineering experience with Apache Spark (v3+) in Scala, Apache Iceberg/Parquet, etc. Experience with Trino/Presto is a bonus.
- Familiarity with GCP/Azure (VMs, container runtimes, BLOB storage solutions) preferred but not mandatory.

Preferred Qualifications (Desired Skills/Experience):
- A Bachelor’s degree or higher is preferred
- Strong backend experience (Java/Scala/Kotlin, etc.) with basic data analysis/engineering experience (Spark/Parquet, etc.); OR basic backend experience (Java/Scala, etc.) with strong data analysis/engineering experience (Spark/Parquet, etc.); OR moderate backend experience (Java/Kotlin, etc.) with strong frontend experience (Angular 15+ with SASS/Angular Material) and exposure to DevOps pipelines (GitLab CI)

Typical Education & Experience: A Bachelor's degree with typically 2 to 5 years of experience, OR a Master's degree with typically 1 to 2 years of experience, is preferred but not required.

Relocation: This position does offer relocation within India, based on candidate eligibility.

Applications for this position will be accepted until Aug. 09, 2025.

Export Control Requirements: This is not an Export Control position.

Visa Sponsorship: Employer will not sponsor applicants for employment visa status.

Shift: Not a Shift Worker (India)

Equal Opportunity Employer: We are an equal opportunity employer. We do not accept unlawful discrimination in our recruitment or employment practices on any grounds including but not limited to race, color, ethnicity, religion, national origin, gender, sexual orientation, gender identity, age, physical or mental disability, genetic factors, military and veteran status, or other characteristics covered by applicable law. We have teams in more than 65 countries, and each person plays a role in helping us become one of the world’s most innovative, diverse and inclusive companies. We are proud members of the Valuable 500 and welcome applications from candidates with disabilities. Applicants are encouraged to share with our recruitment team any accommodations required during the recruitment process. Accommodations may include but are not limited to: conducting interviews in accessible locations that accommodate mobility needs, encouraging candidates to bring and use any existing assistive technology such as screen readers, and offering flexible interview formats such as virtual or phone interviews.
Posted 23 hours ago
3.0 - 20.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Change Management and Transformation Consultant – Capital Markets

Find endless opportunities to solve our clients' toughest challenges, as you work with exceptional people, the latest tech and leading companies across industries.

Practice: Capital Markets, Industry Consulting, Capability Network | Areas of Work: Change Management and Transformation | Level: 11/9/7/6/5 | Location: Bengaluru/Gurugram/Mumbai | Years of Exp: 3-20 years

Explore an Exciting Career at Accenture

Are you an outcome-oriented problem solver? Do you enjoy working on transformation strategies for global clients? Does working in an inclusive and collaborative environment spark your interest? Then Accenture Strategy and Consulting is the right place for you to explore limitless possibilities.

The Practice – A Brief Sketch

As a part of the Capital Markets practice within Accenture’s Capability Network, you will work with our global teams to help investment banks, asset and wealth managers, and exchanges prepare for the digital future. Together, let’s leverage global strategies and data-driven insights to pave the way for digital-enabled capital markets. Help us unlock new value in a disruptive world, with the following initiatives: Collaborate with clients to solve complex problems such as regulatory reforms and their implementation. Define and manage organizational change with reference to process, technology and organization structure. Manage transformation projects to migrate from legacy to target state. Assess as-is processes, apply industry best practices to define to-be processes, and implement them to remove inefficiencies. Support data governance and management, and help optimize operations and drive business decision-making. Support the development of collateral, methodology refinements, best-practice updates and trend tracking; create and support proposals incorporating Accenture’s value proposition. Incorporate Accenture best practices and methodologies into every stage of the project management lifecycle.

Bring your best skills forward to excel in the role: Good analytical and problem-solving skills. Excellent communication, interpersonal and presentation skills. Cross-cultural competence with an ability to thrive in a dynamic consulting environment.
Posted 23 hours ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Applications Development Technology Lead Analyst is a senior-level position responsible for establishing and implementing new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to lead applications systems analysis and programming activities.

Responsibilities:

Partner with multiple management teams to ensure appropriate integration of functions to meet goals, and identify and define necessary system enhancements to deploy new products and process improvements. Resolve a variety of high-impact problems/projects through in-depth evaluation of complex business processes, system processes, and industry standards. Provide expertise in the area and advanced knowledge of applications programming, and ensure application design adheres to the overall architecture blueprint. Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation. Develop comprehensive knowledge of how areas of business, such as architecture and infrastructure, integrate to accomplish business goals. Provide in-depth analysis with interpretive thinking to define issues and develop innovative solutions. Serve as advisor or coach to mid-level developers and analysts, allocating work as necessary. Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.

Qualifications:

6-10 years of relevant experience in an apps development or systems analysis role. Extensive experience in systems analysis and programming of software applications. Experience in managing and implementing successful projects. Subject Matter Expert (SME) in at least one area of Applications Development. Ability to adjust priorities quickly as circumstances dictate. Demonstrated leadership and project management skills. Consistently demonstrates clear and concise written and verbal communication.

Education: Bachelor’s degree/University degree or equivalent experience; Master’s degree preferred. This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

Role Summary: RHOO (RegHub On Olympus) is a regulatory reporting framework built on the Olympus tech stack to centrally report all in-scope transactions, events and client reports on a single, scalable, cost-effective regulatory architecture that mitigates regulatory and reputational risk through delivery of complete, accurate and compliant reporting for various businesses. We are looking for a Big Data and AI Lead responsible for driving our organization's efforts in leveraging big data and artificial intelligence (AI) to achieve regulatory objectives. This role involves developing and implementing strategies for data collection, storage, processing, analysis, and AI-driven applications. By embracing AI/ML, RegHub On Olympus (RHOO) will remain at the forefront of regulatory reporting technology and provide a competitive advantage in the financial industry.

Required Skillset: The role requires 10+ years of hands-on experience in Java development.
Working experience in designing systems with low-latency streaming architectures: Spark, Spring Boot, Kafka (ELK), Flink, or any other real-time streaming framework. Working knowledge of NoSQL, Kafka, Big Data, MongoDB. Hands-on experience with Hadoop, Big Data, Spark SQL, Hive/Impala. Experience in delivering regulatory asks in extremely compressed timelines. Experience in JMS and real-time message processing on TIBCO. Experience using Jira, Bitbucket and managing development/testing/release efforts. Experience using XML, FIX, POJO and JSON message formats. Experience with Impala, Elastic & Oracle. Working knowledge of AWS ECS; experience with CodeBuild/CodePipeline for CI/CD and CloudWatch for logging/monitoring. Hands-on experience with Amazon Simple Storage Service (S3). Hands-on experience with AI/ML technologies, including Python, predictive modeling, natural language processing, machine learning algorithms, and general data structure modules, is an added advantage. Good to have knowledge of order and execution and trade life cycle events, and knowledge of message formats such as FpML and ISO 20022. Ability to lead a team from the front and guide them through time-sensitive milestones. Focus on leveraging regulatory delivery as a driver for the firm’s data strategy. Very good communication and interpersonal skills. Excellent relationships with senior tech, business and compliance partners. Experience with delivery expertise for large-scale programs. Ability to align delivery with the firm’s long-term strategy. Quick decision maker with the ability to grasp the situation in case of an issue and mitigate the impact. ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
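To make this posting's low-latency streaming requirement concrete, here is a minimal Spark Structured Streaming sketch in Scala that consumes JSON trade events from Kafka. The broker address, topic name and JSON fields are hypothetical, and a real regulatory-reporting job would write to a durable, audited sink rather than the console.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object TradeEventStream {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("trade-event-stream")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical Kafka topic carrying JSON trade events.
    // (Requires the spark-sql-kafka connector on the classpath.)
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "kafka:9092")
      .option("subscribe", "trade-events")
      .load()

    // Pull a couple of fields out of the JSON payload.
    val trades = raw
      .selectExpr("CAST(value AS STRING) AS json")
      .select(
        get_json_object($"json", "$.tradeId").as("trade_id"),
        get_json_object($"json", "$.notional").cast("double").as("notional"))

    // Console sink for the sketch; a reporting pipeline would land in a durable store.
    trades.writeStream
      .format("console")
      .option("checkpointLocation", "/tmp/checkpoints/trade-events")
      .start()
      .awaitTermination()
  }
}
```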
Posted 23 hours ago
13.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Applications Development Technology Lead Analyst is a senior-level position responsible for establishing and implementing new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to lead applications systems analysis and programming activities.

Responsibilities:

Partner with multiple management teams to ensure appropriate integration of functions to meet goals, and identify and define necessary system enhancements to deploy new products and process improvements. Resolve a variety of high-impact problems/projects through in-depth evaluation of complex business processes, system processes, and industry standards. Provide expertise in the area and advanced knowledge of applications programming, and ensure application design adheres to the overall architecture blueprint. Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation. Develop comprehensive knowledge of how areas of business, such as architecture and infrastructure, integrate to accomplish business goals. Provide in-depth analysis with interpretive thinking to define issues and develop innovative solutions. Serve as advisor or coach to mid-level developers and analysts, allocating work as necessary. Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.

Qualifications:

13+ years of relevant experience in an apps development or systems analysis role. Extensive experience in systems analysis and programming of software applications. Experience in managing and implementing successful projects. Subject Matter Expert (SME) in at least one area of Applications Development. Ability to adjust priorities quickly as circumstances dictate. Demonstrated leadership and project management skills. Consistently demonstrates clear and concise written and verbal communication. Working experience in designing systems with low-latency streaming architectures. Working knowledge of NoSQL, Kafka, Big Data, MongoDB. Hands-on experience with Hadoop, Big Data, Spark SQL, Hive/Impala. Experience in delivering regulatory asks in extremely compressed timelines. Experience in JMS and real-time message processing on TIBCO. Experience using Jira, Bitbucket and managing development/testing/release efforts. Experience using XML, FIX, POJO and JSON message formats. Experience with Impala, Elastic & Oracle. Working knowledge of AWS ECS; experience with CodeBuild/CodePipeline for CI/CD and CloudWatch for logging/monitoring. Hands-on experience with Amazon Simple Storage Service (S3). Hands-on experience with AI/ML technologies, including Python, predictive modeling, natural language processing, machine learning algorithms, and general data structure modules, is an added advantage. Good to have knowledge of order and execution and trade life cycle events, and knowledge of message formats such as FpML and ISO 20022. Ability to lead a team from the front and guide them through time-sensitive milestones. Focus on leveraging regulatory delivery as a driver for the firm’s data strategy.
Education: Bachelor’s degree/University degree or equivalent experience This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required. ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
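The Hive/Impala and Spark SQL items in this posting can be illustrated with a short batch-query sketch. The reg.order_events table and its columns are invented for the example; only the shape of the query, Spark SQL resolving a Hive-metastore table, reflects the skills named.

```scala
import org.apache.spark.sql.SparkSession

object RegReportBatch {
  def main(args: Array[String]): Unit = {
    // enableHiveSupport lets Spark SQL resolve tables from the Hive metastore.
    val spark = SparkSession.builder()
      .appName("reg-report-batch")
      .enableHiveSupport()
      .getOrCreate()

    // Hypothetical Hive table of order events; the query shape is typical of
    // the Hive/Impala reporting work described above.
    val lateReports = spark.sql(
      """SELECT business_date, desk, COUNT(*) AS late_count
        |FROM reg.order_events
        |WHERE reported_ts > deadline_ts
        |GROUP BY business_date, desk""".stripMargin)

    lateReports.show(20, truncate = false)
    spark.stop()
  }
}
```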
Posted 23 hours ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Data Analytics Lead Analyst is a strategic professional who stays abreast of developments within their own field and contributes to directional strategy by considering their application in their own job and the business. Recognized technical authority for an area within the business. Requires basic commercial awareness. There are typically multiple people within the business that provide the same level of subject matter expertise. Developed communication and diplomacy skills are required in order to guide, influence and convince others, in particular colleagues in other areas and occasional external customers. Significant impact on the area through complex deliverables. Provides advice and counsel related to the technology or operations of the business. Work impacts an entire area, which eventually affects the overall performance and effectiveness of the sub-function/job family.

Responsibilities:

Deep hands-on experience with PySpark for data processing, ETL (Extract, Transform, Load) operations, data manipulation, and building distributed computing solutions on large datasets. Proficiency in designing and building robust data pipelines, data ingestion, transformation, and processing workflows. Solid understanding of data modeling principles, database design, and strong SQL skills for data querying and analysis. Ability to analyze data, identify patterns, uncover insights, and translate business needs into actionable data solutions. Leading and mentoring a team of data engineers or analysts, fostering best practices, and ensuring the delivery of high-quality data products. Working closely with product partners and business analysts to understand requirements and deliver impactful analytical solutions.

Qualifications:

To be successful in this role, you should meet the following requirements: 8+ years of experience in handling distributed / big data projects. Proficiency in PySpark, Linux scripting, SQL and Big Data tools. Technology stack – PySpark, ETL, Unix shell scripting, Python, Spark, SQL, Impala, Hive. Strong exposure to interpretation of business requirements from a technical perspective. Design, develop and implement IT solutions that fulfil business users' requirements and conform to a high quality standard. Sound problem-solving skills and attention to detail. Strong communication, presentation and team collaboration skills. Knowledge of automation and DevOps practices. Familiarity with agile development methodologies using Jira.

Education: Bachelor’s/University degree or equivalent experience, potentially Master’s degree.

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required. ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Data Analytics ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
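This posting centres on PySpark. As a rough sketch of the ETL pattern it describes, here is a standardise-deduplicate-derive job written in Spark's Scala API, which mirrors the PySpark DataFrame API almost call for call; paths and column names are hypothetical.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object CustomerEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("customer-etl").getOrCreate()

    // Extract: hypothetical raw landing zone.
    val rawDf = spark.read.parquet("/data/raw/customers")

    // Transform: standardise, deduplicate, derive.
    val cleanDf = rawDf
      .withColumn("email", lower(trim(col("email"))))
      .dropDuplicates("customer_id")
      .withColumn("tenure_days", datediff(current_date(), col("onboarded_on")))

    // Load: curated zone, partitioned for downstream queries.
    cleanDf.write.mode("overwrite").partitionBy("country").parquet("/data/curated/customers")
    spark.stop()
  }
}
```

In PySpark the same job is nearly line-for-line identical, with `F.lower`, `F.trim` and `F.datediff` from `pyspark.sql.functions`.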
Posted 23 hours ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
A.P. Moller - Maersk

A.P. Moller – Maersk is the global leader in container shipping services. The business operates in 130 countries and employs 80,000 staff. An integrated container logistics company, Maersk aims to connect and simplify its customers’ supply chains. Today, we have more than 180 nationalities represented in our workforce across 131 countries, and this means we have an elevated level of responsibility to continue to build an inclusive workforce that is truly representative of our customers, their customers, and our vendor partners too. We are responsible for moving 20% of global trade and are on a mission to become the Global Integrator of Container Logistics. To achieve this, we are transforming into an industrial digital giant by combining our assets across air, land, ocean, and ports with our growing portfolio of digital assets to connect and simplify our customers’ supply chains through global end-to-end solutions, all the while rethinking the way we engage with customers and partners.

The Brief

As a Senior AI/ML Engineer in our Data & AI Governance team, you’ll build the systems that improve how Maersk detects, manages, and fixes data quality issues at scale, while contributing to responsible AI observability and compliance tooling. This is a hands-on engineering role focused on platform-level tooling for data reliability, model traceability, and metadata intelligence. You’ll work across structured and unstructured data, help enforce quality SLAs and contribute to components that support the governance of AI/ML models. The role sits at the intersection of platform engineering, data operations, and applied AI - ideal for someone who enjoys building reusable tools, mentoring others, and making complex systems more reliable and auditable. This is a key part of our long-term vision to treat data quality with the same urgency and rigor as platform reliability. The systems you build will help set a new standard for how we manage quality, fairness, and trust in enterprise data and AI.

Senior AI/ML Engineer

Maersk is committed to a diverse and inclusive workplace, and we embrace different styles of thinking. Maersk is an equal opportunities employer and welcomes applicants without regard to race, colour, gender, sex, age, religion, creed, national origin, ancestry, citizenship, marital status, sexual orientation, physical or mental disability, medical condition, pregnancy or parental leave, veteran status, gender identity, genetic information, or any other characteristic protected by applicable law. We will consider qualified applicants with criminal histories in a manner consistent with all legal requirements. We are happy to support your need for any adjustments during the application and hiring process. If you need special assistance or an accommodation to use our website, apply for a position, or to perform a job, please contact us by emailing accommodationrequests@maersk.com.

What I'll be doing – your accountabilities
Build and scale AI/ML-driven components to detect data anomalies, schema drift, and degradation in real time across pipelines. Develop validation logic, auto-profiling tools, and scoring engines to assess and monitor enterprise data quality. Design architecture for AI/ML-based DQ solutions that are modular, reusable, and scalable. Apply AI/ML techniques including NLP, rule induction, and pattern classification to enrich metadata and detect systemic quality issues. Build tooling to support responsible AI: drift tracking, fairness detection, explainability indicators, and lifecycle logging. Partner with platform engineers to integrate these tools into orchestration systems (e.g., Airflow, MLflow, or Dagster). Work with data owners and stewards to operationalize quality ownership using MIDAS – Maersk’s enterprise AI platform for metadata inventory, data accountability, and governance. Contribute to the creation of a DataOps playbook with SLAs, page-zero metrics, escalation routines, and ownership models. Mentor junior engineers and shape architectural and engineering best practices for AI/ML observability and data quality tooling.

Foundational Skills

Expert-level Python engineering experience with a proven ability to ship AI/ML-backed tooling at production scale. Advanced knowledge of data pipelines and orchestration frameworks (e.g., Airflow, Spark, Dagster). Expert understanding of system observability - logging, telemetry, and health scoring applied to data and model workflows. Proven track record of applying advanced AI/ML techniques (e.g., classification, clustering, anomaly detection) in production settings. Strong grounding in solution architecture for data-intensive, distributed systems.

Specialized Skills

Deep experience applying AI/ML to data quality use cases such as profiling, anomaly detection, drift analysis, and schema inference. Expertise in metadata management, lineage tracing, and automated documentation (e.g., via DataHub, Unity Catalog, or Collibra). Hands-on experience with responsible AI tooling (e.g., SHAP, LIME, Fairlearn, What-If Tool) for explainability and bias detection. Built or contributed to platform-level components that are used across domains, not just in isolated project delivery. Ability to design and implement architectural patterns that support federated ownership, reuse, and lifecycle transparency. Eagerness to learn and contribute to AI governance frameworks (e.g., EU AI Act, ISO 42001, NIST AI RMF) and translate those into engineering patterns.

Qualifications & Requirements

8+ years of engineering experience, including at least 3 years building and deploying AI/ML solutions in production. Demonstrated experience building DQ and model observability tools - not just core predictive systems. Strong experience working in cross-functional platform teams that deliver shared services used across business units. Fluent in MLOps tooling (e.g., MLflow, SageMaker, Vertex AI) and capable of versioning, tracking, and documenting model behavior. Strong communication and documentation skills; able to make complex system behavior understandable and operable. Passion for enabling trustworthy AI through high-quality engineering practices.

Preferred Experiences

In addition to the basic qualifications, it would be great if you have: Experience implementing data quality scoring, monitoring, or root cause tooling in a production environment. Experience working with shared metadata systems and operationalizing lineage or traceability at scale. Strong involvement in platform teams or developer enablement functions - not just analytics or research delivery. Applied experience with model explainability, fairness evaluation, or lifecycle documentation tooling. Understanding of enterprise AI risk and how to translate policy into engineering design constraints.
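As a toy sketch of the anomaly-detection work this posting describes, and nothing more than that, the snippet below applies a first-pass z-score rule to a hypothetical table of daily data-quality profiles. The role itself is Python-centric; Spark's Scala API is used here purely for illustration, and all paths and column names are invented.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object NullRateAnomalies {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("dq-null-rate").getOrCreate()

    // Hypothetical profile table: one row per (dataset, day) with a null rate.
    val profiles = spark.read.parquet("/dq/profiles/daily")

    // Per-dataset mean and standard deviation of the null rate.
    val stats = profiles.groupBy("dataset")
      .agg(avg("null_rate").as("mu"), stddev("null_rate").as("sigma"))

    // Flag days more than 3 standard deviations from the dataset's own mean,
    // a crude but common first-pass anomaly rule before anything model-based.
    val anomalies = profiles.join(stats, "dataset")
      .withColumn("zscore", (col("null_rate") - col("mu")) / col("sigma"))
      .filter(abs(col("zscore")) > 3)

    anomalies.select("dataset", "day", "null_rate", "zscore").show()
    spark.stop()
  }
}
```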
Posted 23 hours ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description

Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.

In this role, you will: Software design, Scala & Spark development, and automated testing of new and existing components in an Agile, DevOps and dynamic environment. Promoting development standards, code reviews, mentoring, knowledge sharing. Production support and troubleshooting. Implement the tools and processes, handling performance, scale, availability, accuracy and monitoring. Liaison with BAs to ensure that requirements are correctly interpreted and implemented. Participation in regular planning and status meetings. Input to the development process through involvement in sprint reviews and retrospectives. Input into system architecture and design. Peer code reviews.

Requirements

To be successful in this role, you should meet the following requirements: Scala development and design using Scala 2.10+, or Java development and design using Java 1.8+. Experience with most of the following technologies: Apache Hadoop, Scala, Apache Spark, Spark Streaming, YARN, Kafka, Hive, Python, ETL frameworks, MapReduce, SQL, RESTful services. Sound working knowledge of the Unix/Linux platform. Hands-on experience building data pipelines using Hadoop components - Hive, Spark, Spark SQL. Experience with industry-standard version control tools (Git, GitHub), automated deployment tools (Ansible & Jenkins) and requirement management in Jira. Understanding of big data modelling techniques using relational and non-relational approaches. Experience debugging code issues and publishing the highlighted differences to the development team/architects. Experience with time-series/analytics databases such as Elasticsearch. Experience with scheduling tools such as Airflow, Control-M. Understanding or experience of cloud design patterns. Exposure to DevOps and Agile project methodologies such as Scrum and Kanban. Experience developing HiveQL and UDFs for analysing semi-structured/structured datasets.

Location: Pune and Bangalore

You’ll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSDI
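The HiveQL/UDF requirement in this posting can be sketched briefly. The snippet below registers a Spark SQL UDF, which is simpler than a packaged Hive UDF but illustrates the same idea, and calls it from a SQL query; the customers.contacts table and the email column are hypothetical.

```scala
import org.apache.spark.sql.SparkSession

object SemiStructuredParsing {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("semi-structured-parsing")
      .enableHiveSupport()
      .getOrCreate()

    // Register a UDF that pulls the domain out of an email address,
    // a stand-in for field extraction over semi-structured data.
    spark.udf.register("email_domain", (email: String) => {
      val parts = Option(email).getOrElse("").split("@")
      if (parts.length > 1) parts(1) else "unknown"
    })

    // The UDF is now usable from Spark SQL / HiveQL-style queries
    // against a (hypothetical) Hive-metastore table.
    spark.sql(
      """SELECT email_domain(email) AS domain, COUNT(*) AS users
        |FROM customers.contacts
        |GROUP BY email_domain(email)""".stripMargin).show()

    spark.stop()
  }
}
```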
Posted 1 day ago