
5335 Hive Jobs - Page 5

Set up a job alert
JobPe aggregates listings so they are easy to find in one place, but you apply directly on the original job portal.

3.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – SSIS – Senior

We’re looking for Informatica or SSIS engineers with a cloud background (AWS, Azure).

Primary skills:
- Key roles in multiple large global transformation programs on business process management
- Experience querying databases using SQL
- Experience building and integrating data into a data warehouse
- Experience in data profiling and reconciliation (see the sketch after this listing)
- Informatica PowerCenter/IBM DataStage/SSIS development
- Strong proficiency in SQL/PLSQL
- Good experience performance-tuning ETL workflows and suggesting improvements
- Expertise in complex data management or application integration solutions and their deployment, in areas of data migration, data integration, application integration or data quality
- Experience in data processing, orchestration, parallelization, transformations and ETL fundamentals
- Leverages a variety of programming languages and data crawling/processing tools to ensure data reliability, quality and efficiency (optional)
- Experience with cloud data tools (Microsoft Azure, Amazon S3 or data lakes)
- Knowledge of cloud infrastructure; knowledge of Talend Cloud is an added advantage
- Knowledge of data modelling principles
- Knowledge of Autosys scheduling
- Good experience with database technologies
- Good knowledge of Unix systems

Responsibilities:
- Work as a team member contributing to various technical streams of data integration projects
- Provide product- and design-level technical best practices
- Interface and communicate with the onsite coordinators
- Complete assigned tasks on time and report status regularly to the lead
- Help build a quality culture
- Use an issue-based approach to deliver growth, market and portfolio strategy engagements for corporates
- Strong communication, presentation and team-building skills, with experience producing high-quality reports, papers and presentations
- Experience executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint

Qualification:
- BE/BTech/MCA (must) with 3-7 years of industry experience
- Experience in Talend jobs, joblets and custom components
- Knowledge of error handling and performance tuning in Talend
- Experience in big data technologies such as Sqoop, Impala, Hive, YARN, Spark
- At least 3-4 clients on short-duration projects of 6-8+ months, or at least 2 clients on projects lasting 1-2 years or more
- Commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
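For illustration only (not part of the posting): a minimal sketch of the source-to-warehouse reconciliation work the role describes, done here with pandas. The key and measure column names and the sample frames are assumptions.

```python
# Hypothetical reconciliation sketch: compare row counts and measure totals
# per key between a source extract and a warehouse extract, and flag gaps.
import pandas as pd

def reconcile(source: pd.DataFrame, target: pd.DataFrame,
              key: str, measure: str) -> pd.DataFrame:
    """Return the keys whose row counts or totals disagree."""
    src = source.groupby(key).agg(src_rows=(key, "size"), src_total=(measure, "sum"))
    tgt = target.groupby(key).agg(tgt_rows=(key, "size"), tgt_total=(measure, "sum"))
    result = src.join(tgt, how="outer").fillna(0)
    result["row_diff"] = result["src_rows"] - result["tgt_rows"]
    result["total_diff"] = result["src_total"] - result["tgt_total"]
    return result[(result["row_diff"] != 0) | (result["total_diff"].abs() > 1e-6)]

source = pd.DataFrame({"region": ["EU", "EU", "US"], "amount": [10.0, 20.0, 5.0]})
target = pd.DataFrame({"region": ["EU", "US"], "amount": [30.0, 4.0]})
print(reconcile(source, target, key="region", measure="amount"))
# Flags EU (row-count mismatch) and US (amount mismatch).
```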

Posted 2 days ago

Apply

4.0 years

0 Lacs

Kochi, Kerala, India

On-site

Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role and Responsibilities
As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform.
- Experienced in building data pipelines to ingest, process and transform data from files, streams and databases (see the sketch after this listing)
- Process data with Spark, Python, PySpark and Hive, HBase or other NoSQL databases on the Azure Cloud Data Platform or HDFS
- Experienced in developing efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and big data technologies built on the platform
- Experience developing streaming pipelines
- Experience with Hadoop/Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka

Preferred Education
Master's Degree

Required Technical and Professional Expertise
- Minimum 4+ years of experience in big data technologies, with extensive data engineering experience in Spark with Python or Scala
- Minimum 3 years of experience on cloud data platforms on Azure
- Experience in Databricks, Azure HDInsight, Azure Data Factory, Synapse or SQL Server DB
- Good to excellent SQL skills

Preferred Technical and Professional Experience
- Certification in Azure and Databricks, or Cloudera Spark Certified developers
- Knowledge or experience of Snowflake will be an added advantage
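As a hedged illustration of the batch-ingestion work described above (the ADLS path, schema and table name are placeholders, not details from the posting), a PySpark pipeline might look like this:

```python
# Illustrative batch ingestion: read raw CSV files, clean them,
# and write a partitioned Hive-managed table on the data platform.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest-events").getOrCreate()

raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("abfss://raw@account.dfs.core.windows.net/events/"))  # hypothetical path

cleaned = (raw
           .dropDuplicates(["event_id"])                 # assumed unique key
           .withColumn("event_date", F.to_date("event_ts"))
           .filter(F.col("event_date").isNotNull()))

(cleaned.write
 .mode("overwrite")
 .partitionBy("event_date")
 .saveAsTable("analytics.events"))  # assumes a configured metastore
```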

Posted 2 days ago

Apply

10.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

EY GDS – Data and Analytics (D&A) – Cloud Architect – Manager

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions such as Banking, Insurance, Manufacturing, Healthcare, Retail and Auto, Supply Chain, and Finance.

The opportunity
We’re looking for Senior Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
- Drive Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM and Insurance; activities include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data (10-15 years)
- Work with clients to convert business problems and challenges into technical solutions, considering security, performance, scalability, etc.
- Understand current and future-state enterprise architecture
- Contribute to various technical streams during project implementation
- Provide product- and design-level technical best practices
- Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop and deliver technology solutions
- Define and develop client-specific best practices around data management within a Hadoop or cloud environment
- Recommend design alternatives for data ingestion, processing and provisioning layers
- Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop and Spark
- Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies (see the sketch after this listing)

Skills and Attributes for Success
- Experience architecting highly scalable solutions on Azure, AWS and GCP
- Strong understanding of and familiarity with Azure/AWS/GCP and big data ecosystem components
- Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms
- Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming
- Hands-on experience with major components like cloud ETL tools, Spark and Databricks
- Experience working with NoSQL in at least one of HBase, Cassandra or MongoDB
- Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions
- Solid understanding of ETL methodologies in a multi-tiered stack, integrating with big data systems like Cloudera and Databricks
- Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms
- Good knowledge of Apache Kafka and Apache Flume
- Experience with enterprise-grade solution implementations
- Experience performance-benchmarking enterprise applications
- Experience in data security (in motion, at rest)
- Strong UNIX operating system concepts and shell scripting knowledge

To qualify for the role, you must have
- A flexible, proactive, self-motivated working style with strong personal ownership of problem resolution
- Excellent written and verbal communication, formal and informal
- The ability to multi-task under pressure and work independently with minimal supervision
- A team player's mindset, enjoying a cooperative and collaborative team environment, and adaptability to new technologies and standards
- Participation in all aspects of the big data solution delivery life cycle, including analysis, design, development, testing, production deployment and support
- Responsibility for evaluating technical risks and mapping out mitigation strategies
- Working knowledge of at least one cloud platform: AWS, Azure or GCP
- Excellent business communication, consulting and quality process skills
- Excellence in leading solution architecture, design, build and execution for leading clients in the Banking, Wealth & Asset Management or Insurance domains
- Minimum 7 years of hands-on experience in one or more of the above areas
- Minimum 10 years of industry experience

Ideally, you’ll also have
- Strong project management skills
- Client management skills
- Solutioning skills

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working at EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
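The real-time ingestion responsibility above could, for illustration, be sketched with Spark Structured Streaming (the current streaming API) consuming a Kafka topic. Broker address, topic name and sink paths are placeholders, not details from the posting.

```python
# Minimal streaming-ingestion sketch: Kafka topic -> Parquet landing zone.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

stream = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")   # placeholder
          .option("subscribe", "transactions")                # placeholder topic
          .option("startingOffsets", "latest")
          .load())

# Kafka delivers key/value as binary; cast the payload before processing.
events = stream.select(
    F.col("key").cast("string"),
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp"),
)

query = (events.writeStream
         .format("parquet")
         .option("path", "/data/bronze/transactions")
         .option("checkpointLocation", "/data/checkpoints/transactions")
         .trigger(processingTime="1 minute")
         .start())
query.awaitTermination()
```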

Posted 2 days ago

Apply

10.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

EY GDS – Data and Analytics (D&A) – Cloud Architect – Manager

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions such as Banking, Insurance, Manufacturing, Healthcare, Retail and Auto, Supply Chain, and Finance.

The opportunity
We’re looking for Senior Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
- Drive Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM and Insurance; activities include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data (10-15 years)
- Work with clients to convert business problems and challenges into technical solutions, considering security, performance, scalability, etc.
- Understand current and future-state enterprise architecture
- Contribute to various technical streams during project implementation
- Provide product- and design-level technical best practices
- Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop and deliver technology solutions
- Define and develop client-specific best practices around data management within a Hadoop or cloud environment
- Recommend design alternatives for data ingestion, processing and provisioning layers
- Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop and Spark
- Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies

Skills and Attributes for Success
- Experience architecting highly scalable solutions on Azure, AWS and GCP
- Strong understanding of and familiarity with Azure/AWS/GCP and big data ecosystem components
- Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms
- Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming
- Hands-on experience with major components like cloud ETL tools, Spark and Databricks
- Experience working with NoSQL in at least one of HBase, Cassandra or MongoDB
- Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions
- Solid understanding of ETL methodologies in a multi-tiered stack, integrating with big data systems like Cloudera and Databricks
- Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms
- Good knowledge of Apache Kafka and Apache Flume
- Experience with enterprise-grade solution implementations
- Experience performance-benchmarking enterprise applications
- Experience in data security (in motion, at rest)
- Strong UNIX operating system concepts and shell scripting knowledge

To qualify for the role, you must have
- A flexible, proactive, self-motivated working style with strong personal ownership of problem resolution
- Excellent written and verbal communication, formal and informal
- The ability to multi-task under pressure and work independently with minimal supervision
- A team player's mindset, enjoying a cooperative and collaborative team environment, and adaptability to new technologies and standards
- Participation in all aspects of the big data solution delivery life cycle, including analysis, design, development, testing, production deployment and support
- Responsibility for evaluating technical risks and mapping out mitigation strategies
- Working knowledge of at least one cloud platform: AWS, Azure or GCP
- Excellent business communication, consulting and quality process skills
- Excellence in leading solution architecture, design, build and execution for leading clients in the Banking, Wealth & Asset Management or Insurance domains
- Minimum 7 years of hands-on experience in one or more of the above areas
- Minimum 10 years of industry experience

Ideally, you’ll also have
- Strong project management skills
- Client management skills
- Solutioning skills

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working at EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.

Posted 2 days ago

Apply

10.0 - 15.0 years

83 - 104 Lacs

Delhi, Delhi

On-site

Job Title: Data Architect (Leadership Role)
Company: Wingify
Location: Delhi (outstation candidates allowed)
Experience Required: 10-15 years
Working Days: 5 days/week
Budget: 83 Lakh to 1.04 Cr

About Us
We are a fast-growing product-based tech company known for our flagship product VWO, a widely adopted A/B testing platform used by over 4,000 businesses globally, including Target, Disney, Sears, and Tinkoff Bank. The team is self-organizing, highly creative, and passionate about data, tech, and continuous innovation.
Company Size: Mid-Sized
Industry: Consumer Internet, Technology, Consulting

Role & Responsibilities
- Lead and mentor a team of Data Engineers, ensuring performance and career development
- Architect scalable and reliable data infrastructure with high availability
- Define and implement data governance frameworks, compliance, and best practices
- Collaborate cross-functionally to execute the organization's data roadmap
- Optimize data processing workflows for scalability and cost efficiency (an orchestration sketch follows this listing)
- Ensure data quality, privacy, and security across platforms
- Drive innovation and technical excellence across the data engineering function

Ideal Candidate Must-Haves
- Experience: 10+ years in software/data engineering roles, with at least 2-3 years in a leadership role managing teams of 5+ Data Engineers
- Proven hands-on experience setting up data engineering systems from scratch (0 → 1 stage) in high-growth B2B product companies
- Technical expertise: strong in Java (preferred), or Python, Node.js, GoLang
- Expertise in big data tools: Apache Spark, Kafka, Hadoop, Hive, Airflow, Presto, HDFS
- Strong design experience in High-Level Design (HLD) and Low-Level Design (LLD)
- Backend frameworks like Spring Boot, Google Guice
- Cloud data platforms: AWS, GCP, Azure
- Familiarity with data warehousing: Snowflake, Redshift, BigQuery
- Databases: Redis, Cassandra, MongoDB, TiDB
- DevOps tools: Jenkins, Docker, Kubernetes, Ansible, Chef, Grafana, ELK
- Other skills: strong understanding of data governance, security, and compliance (GDPR, SOC2, etc.); proven strategic thinking with the ability to align technical architecture to business objectives; excellent communication, leadership, and stakeholder management

Preferred Qualifications
- Exposure to Machine Learning infrastructure/MLOps
- Experience with real-time data analytics
- Strong foundation in algorithms, data structures, and scalable systems
- Previous work in SaaS or high-growth startups

Screening Questions
- Do you have team leadership experience? How many engineers have you led?
- Have you built a data engineering platform from scratch? Describe the setup.
- What's the largest data scale you've worked with, and where?
- Are you open to continuing hands-on coding in this role?

Interested candidates can apply at deepak.visko@gmail.com or 9238142824.

Job Types: Full-time, Permanent
Pay: ₹8,300,000.00 - ₹10,400,000.00 per year
Work Location: In person
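Since Airflow is named in the required toolset, here is a hedged sketch of the kind of workflow orchestration such a platform typically runs. The DAG id, schedule and task bodies are invented for illustration.

```python
# Hypothetical Airflow DAG: a daily extract -> transform -> load pipeline.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():    # pull raw events from the source system (stub)
    ...

def transform():  # validate, deduplicate, and enrich (stub)
    ...

def load():       # publish to the warehouse (stub)
    ...

with DAG(
    dag_id="events_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_transform >> t_load  # linear dependency chain
```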

Posted 2 days ago

Apply

12.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

To get the best candidate experience, please consider applying for a maximum of 3 roles within 12 months to ensure you are not duplicating efforts.

Job Category
Software Engineering

Job Details

About Salesforce
Salesforce is the #1 AI CRM, where humans with agents drive customer success together. Here, ambition meets action. Tech meets trust. And innovation isn't a buzzword; it's a way of life. The world of work as we know it is changing, and we're looking for Trailblazers who are passionate about bettering business and the world through AI, driving innovation, and keeping Salesforce's core values at the heart of it all. Ready to level up your career at the company leading workforce transformation in the agentic era? You're in the right place! Agentforce is the future of AI, and you are the future of Salesforce.

As an engineering leader, you will focus on developing the team around you. Bring your technical chops to drive your teams to success around feature delivery and live-site management for a complex cloud infrastructure service. You are as enthusiastic about recruiting and building a team as you are about the challenging technical problems your team will solve. You will also help shape, direct and execute our product vision. You'll be challenged to blend customer-centric principles, industry-changing innovation, and the reliable delivery of new technologies. You will work directly with engineering, product, and design to create experiences that reinforce the Salesforce brand by delighting and wowing our customers with highly reliable and available services.

Responsibilities
- Drive the vision of enabling a full suite of Salesforce applications on Google Cloud in collaboration with teams across geographies
- Build and lead a team of engineers to deliver cloud frameworks, infrastructure automation tools, workflows, and validation platforms on our public cloud platforms
- Apply solid experience in building and evolving large-scale distributed systems that reliably process billions of data points
- Proactively identify reliability and data quality problems and drive the triage and remediation process
- Invest in continuous employee development of a highly technical team by mentoring and coaching engineers and technical leads
- Recruit and attract top talent
- Drive execution and delivery by collaborating with cross-functional teams, architects, product owners and engineers

Required Skills/Experiences
- B.S./M.S. in Computer Science or an equivalent field
- 12+ years of relevant experience in software development teams, with 5+ years of experience managing teams
- Experience managing 2+ engineering teams
- Experience building services on public cloud platforms like GCP, AWS, Azure
- Passionate, curious, creative self-starter who approaches problems with the right methodology and intelligent decisions
- Laser focus on impact, balancing effort to value, and getting things done
- Experience providing mentorship, technical leadership, and guidance to team members
- Strong customer service orientation and a desire to help others succeed
- Top-notch written and oral communication skills

Desired Skills/Experiences
- Working knowledge of modern technologies/services on public cloud
- Experience with container orchestration systems: Kubernetes, Docker, Helios, Fleet
- Expertise in open-source technologies like Elasticsearch, Logstash, Kafka, MongoDB, Hadoop, Spark, Trino/Presto, Hive, Airflow, Splunk

Benefits & Perks
Comprehensive benefits package including well-being reimbursement, generous parental leave, adoption assistance, fertility benefits, and more! World-class enablement and on-demand training with Trailhead.com. Exposure to executive thought leaders and regular 1:1 coaching with leadership. Volunteer opportunities and participation in our 1:1:1 model for giving back to the community. For more details, visit https://www.salesforcebenefits.com/

Unleash Your Potential
When you join Salesforce, you'll be limitless in all areas of your life. Our benefits and resources support you to find balance and be your best, and our AI agents accelerate your impact so you can do your best. Together, we'll bring the power of Agentforce to organizations of all sizes and deliver amazing experiences that customers love. Apply today to not only shape the future, but to redefine what's possible, for yourself, for AI, and the world.

Accommodations
If you require assistance due to a disability applying for open positions, please submit a request via this Accommodations Request Form.

Posting Statement
Salesforce is an equal opportunity employer and maintains a policy of non-discrimination with all employees and applicants for employment. What does that mean exactly? It means that at Salesforce, we believe in equality for all. And we believe we can lead the path to equality in part by creating a workplace that's inclusive and free from discrimination. Know your rights: workplace discrimination is illegal. Any employee or potential employee will be assessed on the basis of merit, competence and qualifications, without regard to race, religion, color, national origin, sex, sexual orientation, gender expression or identity, transgender status, age, disability, veteran or marital status, political viewpoint, or other classifications protected by law. This policy applies to current and prospective employees, no matter where they are in their Salesforce employment journey. It also applies to recruiting, hiring, job assignment, compensation, promotion, benefits, training, assessment of job performance, discipline, termination, and everything in between. Recruiting, hiring, and promotion decisions at Salesforce are fair and based on merit. The same goes for compensation, benefits, promotions, transfers, reduction in workforce, recall, training, and education.

Posted 3 days ago

Apply

6.0 years

0 Lacs

India

Remote

Job Title: Data Scientist – Demand Forecasting
Location: Remote
Experience Required: 6+ Years

About the Role
MindBrain Innovations Pvt Ltd is seeking a skilled and driven Data Scientist – Demand Forecasting to join our advanced analytics team. This role is pivotal in developing accurate and scalable demand forecasting solutions to guide key business decisions across inventory planning, staffing, and financial forecasting. The ideal candidate will combine technical proficiency in machine learning and time-series forecasting with strong communication and analytical problem-solving skills. You'll collaborate with cross-functional teams to implement data science solutions that directly impact strategic planning and operations.

Key Responsibilities
- Develop and enhance time-series forecasting models using Python and SQL (a baseline sketch follows this listing)
- Work with business stakeholders and software engineers to improve demand planning accuracy
- Design, run, and analyze experiments to test improvements to current algorithms and forecasting strategies
- Discover and integrate new data sources to improve model robustness and relevance
- Translate complex model behavior into actionable business insights and clearly communicate underlying assumptions and limitations
- Build and deploy scalable data pipelines that connect model development to production systems
- Participate in cross-functional collaboration to ensure business users understand and trust forecast outputs
- Stay current with industry trends and best practices to continually refine forecasting methodologies

Required Qualifications
- Master's degree in Data Science, Statistics, Applied Mathematics, Computer Science, Engineering, Physics, or a related quantitative discipline
- 5+ years of experience in data science, analytics, or a related field focused on statistical modeling and data extraction
- Advanced programming skills in Python and strong proficiency in SQL
- Experience with large-scale data processing tools (e.g., Hadoop, Hive, Scala)
- Deep knowledge of time-series forecasting techniques, multivariate algorithms, and model validation methods
- Proficiency in feature engineering, model tuning, and hyperparameter optimization
- Ability to write clean, production-ready, and well-documented code
- Strong communication skills to convey technical insights clearly to business and engineering teams
- Experience building automated and production-grade data pipelines
- Ability to work both independently and in a collaborative team environment

Preferred Qualifications
- Experience working in supply chain or demand planning environments
- Familiarity with data visualization tools and dashboarding
- Knowledge of the Azure cloud platform and its data services
- Hands-on experience in building and deploying APIs for model integration
- Prior involvement in stakeholder engagement and influencing decision-making without direct authority
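For orientation (not the company's method): the simplest credible demand-forecasting baseline is a seasonal-naive model evaluated with a rolling-origin backtest. The synthetic daily series and 7-day season below are assumptions.

```python
# Seasonal-naive forecast (repeat last week's values) with a rolling backtest.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
days = pd.date_range("2024-01-01", periods=120, freq="D")
demand = 100 + 20 * np.sin(2 * np.pi * days.dayofweek / 7) + rng.normal(0, 5, len(days))
series = pd.Series(demand, index=days)  # synthetic daily demand

SEASON = 7  # assumed weekly seasonality

def seasonal_naive(history: pd.Series, horizon: int) -> np.ndarray:
    """Forecast each future day as the value observed one season earlier."""
    last_season = history.iloc[-SEASON:].to_numpy()
    return np.tile(last_season, horizon // SEASON + 1)[:horizon]

# Rolling-origin backtest over the final 28 days, one week at a time.
errors = []
for origin in range(len(series) - 28, len(series), 7):
    forecast = seasonal_naive(series.iloc[:origin], horizon=7)
    actual = series.iloc[origin:origin + 7].to_numpy()
    errors.append(np.mean(np.abs(actual - forecast)))

print(f"Backtest MAE: {np.mean(errors):.2f}")
```

Any candidate model (ARIMA, gradient boosting, etc.) would then have to beat this baseline in the same backtest to justify its complexity.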

Posted 3 days ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Summary
The role holder will be part of the Wealth Management Core Platform Hive and be accountable for delivery of Core Platform hive changes, as well as leading and supporting the achievement of Hive OKRs. As an empowered decision maker, the Product Owner will be accountable for maximising the business value of their product. We are looking for a talented individual with functional and delivery expertise in the front-to-back trade flow of one or more Capital Markets products and/or Managed Investments, with extensive experience delivering booking and settlement initiatives on core booking platforms. The candidate must possess at least 10 years of experience in Banking/Wealth Management and have strong communication and stakeholder management skills. Experience in Temenos applications is required; solution designing will be an advantage.

Key Responsibilities

PO-specific responsibilities
- Accountable for delivering their product's contribution to the Business plan and QPR scorecard
- Continuous backlog management; expressing backlog items clearly and in a consumable format
- Optimise value delivery through continuous improvement, gathering feedback from relevant stakeholders/SMEs/customers and prioritising the backlog
- Create transparency around backlog item progress, blockers, impediments and dependencies
- A core member of the squad, 100% dedicated to the role
- Ensure the voices of clients and relevant stakeholders are represented
- Work with the Scrum Master and squad members to build an empowered, high-performing team in a psychologically safe environment
- Ensure all artefacts and assurance deliverables meet the required standards and policies (e.g. nWOWs, CGP)
- Ensure regular engagement and management of process change, operational and delivery risk for their backlog (and all other relevant risk requirements)
- Lead between 5 and 8 squads

Strategy
- Manage and track execution progress of approved initiatives to drive the Transformation agenda
- Support the development of the Core Wealth Platform strategic direction and roadmap, in alignment with the business strategy and investment appetite
- Work with WM Hive leads to ensure project deliveries are effectively implemented across geographies

Business
- Maintain strong stakeholder engagement with WM Business, COO/Operations, T&I, Risk & Compliance and Group Internal Audit to ensure alignment across stakeholder groups in support of the Hive deliverables
- Ensure appropriate representation across the stakeholder groups in delivery forums
- Escalate appropriately to ensure key stakeholders such as the Cluster Lead, Hive Lead, Hive Tech Lead and Chief Product Owner are updated and able to intervene as required

Processes
- Execute the strategy and identify opportunities to streamline operational processes through automation, OpEx and other initiatives
- Continuously improve productivity and efficiency of operations and drive the standardisation agenda for the WM Core Wealth Platform Hive, maintaining rigorous cost and investment discipline across the business
- Ensure appropriate and insightful data and analytics that can drive business decisions

People & Talent
- Demonstrate and act as a role model of the Group's values and culture in the region
- Lead and support a change in mindset, building a culture of client centricity, agility and accountability through standardised metrics and measurement
- Set effective metrics and standards, transparently communicating them to squads
- Ensure squad capacity is reviewed to enable delivery of client outcomes

Risk Management
- Risk control and governance: ensure oversight and drive improvement in the control and resilience agenda
- Develop a forward-looking, end-to-end view across the Wealth Management environment, proactively identifying and escalating issues and sharing themes and lessons learnt

Governance
- Adhere to policies and control standards, ensuring compliance and operation within risk tolerance and risk appetite
- Maintain awareness and understanding of the regulatory framework in which the Bank operates, and the regulatory requirements and expectations relevant to the role
- Deliver effective governance within the deliverables, with the ability to constructively challenge relevant stakeholders and teams
- Work through details with relevant control functions in an open and collaborative manner to achieve the desired governance outcome within the Bank's risk appetite
- Work with global teams in Risk, Compliance and the COO Office to ensure adherence to the Bank's risk framework in the identification, assessment, mitigation, control and monitoring of risk

Regulatory & Business Conduct
- Display exemplary conduct and live by the Group's Values and Code of Conduct
- Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank; this includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines and the Group Code of Conduct
- Effectively and collaboratively identify, escalate, mitigate and resolve risk, conduct and compliance matters

Key stakeholders
- Investment Advisors, Team Heads, Relationship Managers (PvB and Retail)
- Global WM Product Teams
- Global and Country Technology teams
- Product Owners across Digital and Client Journey
- Country WM Product Heads
- Group, Regional and Country WM COO teams

Qualifications
- Certified Scrum Product Owner or comparable Product Owner certifications

Skills and Experience
- Understanding customer needs
- WM products and processes
- Data architecture
- Business process improvement
- Agile project management

About Standard Chartered
We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents, and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion.

Together We
- Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do
- Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well
- Are better together: we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term

What We Offer
In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing.
- Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations
- Time off including annual leave, parental/maternity leave (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holidays, which combine to a minimum of 30 days
- Flexible working options based around home and office locations, with flexible working patterns
- Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders and a range of self-help toolkits
- A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning
- Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity across our teams, business functions and geographies, where everyone feels respected and can realise their full potential

Posted 3 days ago

Apply

2.0 years

0 Lacs

Gurugram, Haryana, India

On-site

At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

How will you make an impact in this role?
The position is in Global Contact and Capacity Management (GCCM). GCCM is responsible for all chat volume forecasting, capacity/staff planning, operational expense management, configuration, and real-time performance management and monitoring for GSG across various markets globally. The group executes plans built by the Forecasting & Business Planning teams and manages 24/7 real-time performance in the voice and digital channels. The group ensures that robust schedules are designed to meet the demand of daily operations. The schedules are aligned to intraday/intraweek chat volume distributions for all markets and lines of business. The incumbent will be part of the workforce optimization pillar within the Global Capacity & Contact Optimization team supporting digital markets. Primary responsibilities include short-term planning, scheduling, reporting, and managing key performance indicators such as wait times, abandon rates, CHT, shrinkage and staffing optimization (see the staffing sketch after this listing).

Key Deliverables:
- Interface with Analysts, Team Leaders, and other members of management
- Manage, update and report real-time activities in the department
- Monitor Real Time Adherence (RTA) and communicate staffing discrepancies to Team Leaders
- Record and maintain a count of productive FTEs
- Capacity management for sub-processes
- Work with the Short Term Forecasting Team on IDPs and staffing
- Leave cap formulation; provide advisory support on release of FTEs from the process
- Communicate systems, voice response and telecommunication issues to the department
- Real-time adherence, monitoring and communication; raise awareness of RTA issues that are impacting service level and aging objectives
- Proactively identify improvement opportunities in areas such as shift mix and hours of operation
- Analyze and define, at regular intervals, the best time to contact Card Members to improve total contacts in the process
- In-bound chat pattern analysis, trending and staff alignment
- Maintain strong relationships with Team Leaders and SDLs to improve overall understanding and awareness of daily/weekly business impacts
- Feedback, huddle timings, training schedules and other off-the-phone activities

Minimum Qualifications

Functional skills:
- Bachelor's degree (Mathematics/Statistics/Data Analytics); MBA or equivalent is a plus
- 2+ years of relevant experience in workforce planning/operations/MIS analytics preferred
- Proficiency in workforce management tools such as Avaya, eWFM and Genesys, as well as an understanding of call center volume drivers and forecasting/workforce planning processes, would be an added advantage
- Strong written and verbal communication skills, with demonstrated success in creating and conducting presentations to large/senior/challenging audiences, a plus
- Strong organizational and project management skills
- Proven ability to manage multiple priorities effectively, with a track record of driving results while meeting deadlines
- Strong relationship and collaboration skills, including the ability to work in a highly matrixed environment

Behavioral skills/capabilities:
- Delivers high-quality work with direction and oversight
- Understands work goals and seeks to understand their importance to the BU and/or the Blue Box
- Comfortable making decisions and taking calculated risks based on facts and intuition
- Flexible to quickly adjust to shifting priorities, multiple demands, ambiguity, and rapid change
- Maintains a positive attitude when presented with a barrier
- Demonstrated ability to challenge the status quo and build consensus

Technical skills/knowledge of platforms:
- Proficiency with Microsoft Office, especially Excel and PowerPoint
- Working experience with Power BI is needed
- Project management skills and experience successfully leading projects, a plus
- Ability to handle large data sets; prior programming experience in SAS, SQL, Python and/or HQL (Hive Query Language) to write code independently and efficiently will be useful
- Knowledge of machine learning will be an added advantage
- Exposure to big data platforms such as Cornerstone, and visualization tools like Tableau, a nice-to-have

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law.
Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
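Purely illustrative (the volumes, handle time, concurrency and shrinkage figures are invented, not Amex's method): a minimal version of the short-term staffing arithmetic this role manages, converting an intraday chat forecast into required agents per interval.

```python
# Toy intraday staffing sketch: forecast chats -> required FTEs per interval.
import pandas as pd

forecast = pd.DataFrame({
    "interval": ["09:00", "09:30", "10:00", "10:30"],
    "chats": [120, 150, 180, 160],   # forecast chat arrivals per 30-minute slot
})

AHT_MIN = 12        # average handle time per chat, minutes (assumed)
CONCURRENCY = 2.0   # chats an agent handles simultaneously (assumed)
SHRINKAGE = 0.30    # breaks, training, other off-task time (assumed)
INTERVAL_MIN = 30   # planning interval length, minutes

# Agent-minutes of work per interval, adjusted for concurrency.
workload_agent_min = forecast["chats"] * AHT_MIN / CONCURRENCY
base_agents = workload_agent_min / INTERVAL_MIN
forecast["required_fte"] = (base_agents / (1 - SHRINKAGE)).round(1)
print(forecast)
```

A production planner would layer queueing effects (e.g., Erlang-style service-level targets) on top of this workload-based floor.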

Posted 3 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Accellor is looking for a Data Engineer with extensive experience developing ETL processes using PySpark Notebooks and Microsoft Fabric, and supporting existing legacy SQL Server environments. The ideal candidate will possess a strong background in Spark-based development, demonstrate high proficiency in SQL, and be comfortable working independently, collaboratively within a team, or leading other developers when required.

Responsibilities
- Design, develop, and maintain ETL pipelines using PySpark Notebooks and Microsoft Fabric
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver efficient data solutions
- Migrate and integrate data from legacy SQL Server environments into modern data platforms (see the sketch after this listing)
- Optimize data pipelines and workflows for scalability, efficiency, and reliability
- Provide technical leadership and mentorship to junior developers and other team members
- Troubleshoot and resolve complex data engineering issues related to performance, data quality, and system scalability
- Develop, maintain, and enforce data engineering best practices, coding standards, and documentation
- Conduct code reviews and provide constructive feedback to improve team productivity and code quality
- Support data-driven decision-making processes by ensuring data integrity, availability, and consistency across platforms

Requirements
- Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field
- Experience with Microsoft Fabric or similar cloud-based data integration platforms is a must
- Minimum 3 years of experience in data engineering, with a strong focus on ETL development using PySpark or other Spark-based tools
- Proficiency in SQL with extensive experience in complex queries, performance tuning, and data modeling
- Strong knowledge of data warehousing concepts, ETL frameworks, and big data processing
- Familiarity with other data processing technologies (e.g., Hadoop, Hive, Kafka) is an advantage
- Experience working with both structured and unstructured data sources
- Excellent problem-solving skills and the ability to troubleshoot complex data engineering issues
- Proven ability to work independently, as part of a team, and in leadership roles
- Strong communication skills with the ability to translate complex technical concepts into business terms

Mandatory Skills
- Experience with data lakes, data warehouses, and Delta Lake
- Experience with Azure Data Services, including Azure Data Factory, Azure Synapse, or similar tools
- Knowledge of scripting languages (e.g., Python, Scala) for data manipulation and automation
- Familiarity with DevOps practices, CI/CD pipelines, and containerization (Docker, Kubernetes) is a plus

Benefits
- Exciting projects: We focus on industries like high-tech, communication, media, healthcare, retail and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
- Collaborative environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers.
- Work-life balance: Accellor prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
- Professional development: Our dedicated Learning & Development team regularly organizes communication skills training, stress management programs, professional certifications, and technical and soft-skill training.
- Excellent benefits: We provide our employees with competitive salaries, family medical insurance, Personal Accident Insurance, periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
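One plausible shape for the SQL Server migration task above, sketched as a PySpark notebook cell (connection details, credentials handling and table names are placeholders, not details from the posting):

```python
# Illustrative legacy migration: SQL Server table (via JDBC) -> Delta table.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sqlserver-to-delta").getOrCreate()

orders = (spark.read
          .format("jdbc")
          .option("url", "jdbc:sqlserver://legacy-db:1433;databaseName=sales")
          .option("dbtable", "dbo.orders")
          .option("user", "etl_user")
          .option("password", "***")  # use a secret scope / Key Vault in practice
          .load())

(orders.write
 .format("delta")        # lands as a Delta table in the lakehouse
 .mode("overwrite")
 .saveAsTable("lakehouse.orders"))
```

For large tables, the JDBC read is typically parallelized with `partitionColumn`, `lowerBound`, `upperBound` and `numPartitions` options rather than pulled through a single connection.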

Posted 3 days ago

Apply

8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Our Company
Changing the world through digital experiences is what Adobe's all about. We give everyone, from emerging artists to global brands, everything they need to design and deliver exceptional digital experiences! We're passionate about empowering people to create beautiful and powerful images, videos, and apps, and transform how companies interact with customers across every screen. We're on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours!

The Opportunity
Our engineering team develops the Adobe Experience Platform, offering innovative data management and analytics. Developing a reliable, resilient system at large scale is crucial. We use big data and open-source technology for Adobe's services. Our support for large enterprise products spans geographies, requiring us to manage disparate data sources and ingestion mechanisms. The data must be easily accessible at very low latency to support various scenarios and use cases (see the caching sketch after this listing). We seek candidates with deep expertise in building low-latency services at high scale who can lead us in accomplishing our vision.

What you will need to succeed
- 8+ years in design and development of data-driven large distributed systems
- 3+ years as an architect building large-scale data-intensive distributed systems and services
- Relevant experience building application layers on top of Apache Spark
- Strong experience with Hive SQL and Presto DB
- Experience leading architecture designs to approval while collaborating with multiple collaborators, dependencies, and internal/external customer requirements
- In-depth work experience with open-source technologies like Apache Kafka, Apache Spark, Kubernetes, etc.
- Experience with big data technologies on public clouds such as Azure, AWS, or Google Cloud Platform
- Experience with in-memory distributed caches like Redis, Memcached, etc.
- Strong coding (design patterns) and design proficiency, setting an example for others; contributions to open source are highly desirable
- Proficiency in data structures and algorithms
- Cost consciousness around computation and memory requirements
- Strong verbal and written communication skills
- BTech/MTech/MS in Computer Science

What you'll do
- Lead the technical design and implementation strategy for major systems and components of the Adobe Experience Platform
- Evaluate and drive the architecture and technology choices for major systems/components
- Design, build, and deploy products with outstanding quality
- Innovate on the current system to improve robustness, ease, and convenience
- Articulate design and code choices to cross-functional teams
- Mentor and guide a high-performing team
- Review and provide feedback on features, technology, architecture, design, time and budget estimates, and test strategies
- Engage in creative problem-solving
- Develop and evolve engineering standard methodologies to improve the team's efficiency
- Partner with other teams across Adobe to achieve common goals

Discover what makes Adobe a great place to work: Life @ Adobe

Adobe is proud to be an Equal Employment Opportunity employer. We do not discriminate based on gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. Learn more.

Adobe aims to make Adobe.com accessible to any and all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015.
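Since the posting asks for experience with in-memory caches like Redis, here is a hedged sketch of the standard cache-aside pattern behind low-latency reads. The key scheme, TTL and loader function are assumptions for illustration, not Adobe's design.

```python
# Cache-aside reads against Redis with a TTL: serve hits from memory,
# fall back to the authoritative store on a miss and repopulate.
import json
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def load_profile_from_store(profile_id: str) -> dict:
    # Stand-in for the authoritative (slower) store lookup.
    return {"id": profile_id, "segments": ["a", "b"]}

def get_profile(profile_id: str, ttl_seconds: int = 300) -> dict:
    key = f"profile:{profile_id}"
    cached = r.get(key)
    if cached is not None:
        return json.loads(cached)                        # hit: sub-millisecond path
    profile = load_profile_from_store(profile_id)        # miss: slow path
    r.setex(key, ttl_seconds, json.dumps(profile))       # repopulate with expiry
    return profile

print(get_profile("u123"))
```

The TTL bounds staleness; shorter TTLs trade cache hit rate for fresher data.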

Posted 3 days ago

Apply

2.0 - 6.0 years

0 Lacs

pune, maharashtra

On-site

The Applications Development Supervisor role is an intermediate management position where you will lead and direct a team of employees to establish and implement new or revised application systems and programs, in coordination with the Technology team. Your main objective will be to oversee applications systems analysis and programming activities.

Your responsibilities will include managing an Applications Development team, recommending new work procedures for process efficiencies, resolving issues by identifying solutions based on technical experience, developing comprehensive knowledge of how your area integrates within apps development, ensuring the quality of tasks delivered by the team, acting as a backup to the Applications Development Manager, and serving as an advisor to junior developers and analysts. You will also need to appropriately assess risk in business decisions, safeguarding Citigroup's reputation and assets by driving compliance with laws and regulations, adhering to policy, applying ethical judgment, and effectively supervising the activity of others.

To qualify for this role, you should have 2-4 years of relevant experience; proficiency in Big Data, Spark, Hive, Hadoop, Python, and Java; experience managing and implementing successful projects; the ability to make technical decisions on software development projects; and knowledge of dependency management, change management, continuous integration testing tools, audit/compliance requirements, software engineering, and object-oriented design. Demonstrated leadership, management skills, and clear communication are essential. A Bachelor's degree or equivalent experience is required for this position.

Please note that this job description provides an overview of the work performed, and other job-related duties may be assigned as necessary. If you require a reasonable accommodation due to a disability to use our search tools or apply for a career opportunity, please review Accessibility at Citi. You can also view Citi's EEO Policy Statement and the Know Your Rights poster.

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

You will be joining our team as a Senior Data Scientist with expertise in Artificial Intelligence (AI) and Machine Learning (ML). The ideal candidate possesses 5-7 years of experience in data science, focused on AI/ML applications, with a strong background in a range of ML algorithms; programming languages such as Python, R, or Scala; and data processing frameworks like Apache Spark. Proficiency in data visualization tools and experience in model deployment using Docker, Kubernetes, and cloud services are essential for this role.

Your responsibilities will include end-to-end AI/ML project delivery, from data processing to model deployment (see the sketch after this listing). You should have a good understanding of statistics, probability, and the mathematical concepts used in AI/ML. Additionally, familiarity with big data tools, natural language processing techniques, time-series analysis, and MLOps will be advantageous.

As a Senior Data Scientist, you are expected to lead cross-functional project teams and manage data science projects in a production setting. Your problem-solving skills, communication skills, and curiosity to stay updated with the latest advancements in AI and ML are crucial for success in this role. You should be able to convey technical insights clearly to diverse audiences and quickly adapt to new technologies. If you are an innovative, analytical, and collaborative team player with a proven track record in AI/ML project delivery, we invite you to apply for this exciting opportunity.
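To make the "data processing to model deployment" loop concrete (dataset, model choice and artifact name are illustrative, not from the posting), a minimal end-to-end sketch might be:

```python
# Train, evaluate, and persist a model artifact that a containerized
# service (e.g., behind Docker/Kubernetes) would later load and serve.
import joblib
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)  # stand-in dataset
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

model = GradientBoostingClassifier(random_state=42).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"holdout AUC: {auc:.3f}")

joblib.dump(model, "model.joblib")  # artifact baked into the serving image
```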

Posted 3 days ago

Apply

10.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Our Company
Changing the world through digital experiences is what Adobe’s all about. We give everyone—from emerging artists to global brands—everything they need to design and deliver exceptional digital experiences! We’re passionate about empowering people to create beautiful and powerful images, videos, and apps, and transform how companies interact with customers across every screen. We’re on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours!

About Connect
Adobe Connect, within the Adobe DALP BU, is one of the best online webinar and training delivery platforms. The product has a huge customer base that has been using it for many years, and it has evolved impressively over time, ensuring it stays on top of the latest tech stack. It offers the opportunity to work with a plethora of technologies on both the client and server side.

What You’ll Do:
Work as a hands-on Machine Learning Engineer who will release models in production.
Develop classifiers, predictive models, and multi-variate optimization algorithms on large-scale datasets using advanced statistical modeling, machine learning, and data mining.
Focus especially on R&D: building predictive models for conversion optimization, bidding algorithms for pacing and optimization, reinforcement learning problems, and forecasting.
Collaborate with Product Management to bring AI-based assistive experiences to life. Socialize what’s possible now or in the near future to inform the roadmap.
Drive all aspects of ML product development: ML modeling, data/ML pipelines, quality evaluations, productization, and ML Ops.
Create and instill a team culture that focuses on sound scientific processes and encourages deep engagement with our customers.
Handle project scope and risks with data, analytics, and creative problem-solving.

What you require:
Solid foundation in machine learning, classifiers, statistical modeling, and multivariate optimization techniques.
Experience with control systems, reinforcement learning problems, and contextual bandit algorithms (a minimal sketch follows this listing).
Experience with DNN frameworks like TensorFlow or PyTorch on large-scale datasets, plus tools such as TensorFlow, R, scikit-learn, and pandas.
Proficiency in one or more of: Python, Java/Scala, SQL, Hive, Spark.
GenAI and RAG pipelines are a must-have; Git, Docker, and Kubernetes are good to have, as are cloud-based solutions.
General understanding of data structures, algorithms, multi-threaded programming, and distributed computing concepts.
Ability to be a self-starter and work closely with other data scientists and software engineers to design, test, and build production-ready ML and optimization models and distributed algorithms running on large-scale datasets.

Ideal Candidate Profile:
A total of 10+ years of experience, including at least 5 years in technical roles involving Data Science, Machine Learning, or Statistics.
Master’s or B.Tech in Computer Science/Statistics.
Comfort with ambiguity, adaptability to evolving priorities, and the ability to lead a team while working autonomously.
Proven management experience with highly diverse and global teams.
Demonstrated ability to influence technical and non-technical stakeholders.
Proven ability to effectively manage in a high-growth, matrixed organization.
Track record of delivering cloud-scale, data-driven products and services that are widely adopted with large customer bases.
An ability to think strategically, look around corners, and create a vision for the current quarter, the year, and five years down the road.
A relentless pursuit of great customer experiences and continuous improvements to the product.

Adobe is proud to be an Equal Employment Opportunity employer. We do not discriminate based on gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. Learn more. Adobe aims to make Adobe.com accessible to any and all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015.
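Since the listing asks for experience with contextual bandit algorithms, here is a minimal LinUCB sketch in Python; the arm count, feature dimension, and reward simulation are assumptions for illustration, not Adobe's production approach:

```python
# Minimal LinUCB contextual bandit sketch (hypothetical dimensions; illustrative only).
import numpy as np

class LinUCB:
    def __init__(self, n_arms: int, dim: int, alpha: float = 1.0):
        self.alpha = alpha
        # Per-arm ridge-regression state: A = X^T X + I, b = X^T r.
        self.A = [np.eye(dim) for _ in range(n_arms)]
        self.b = [np.zeros(dim) for _ in range(n_arms)]

    def select(self, context: np.ndarray) -> int:
        scores = []
        for A, b in zip(self.A, self.b):
            A_inv = np.linalg.inv(A)
            theta = A_inv @ b
            # Mean reward estimate plus an upper-confidence bonus.
            ucb = theta @ context + self.alpha * np.sqrt(context @ A_inv @ context)
            scores.append(ucb)
        return int(np.argmax(scores))

    def update(self, arm: int, context: np.ndarray, reward: float) -> None:
        self.A[arm] += np.outer(context, context)
        self.b[arm] += reward * context

# Toy usage: 3 candidate bids, 5 context features, simulated rewards.
bandit = LinUCB(n_arms=3, dim=5)
rng = np.random.default_rng(0)
for _ in range(100):
    ctx = rng.normal(size=5)
    arm = bandit.select(ctx)
    reward = float(ctx[0] > 0) if arm == 0 else rng.random() * 0.3  # fake reward model
    bandit.update(arm, ctx, reward)
```

The same select/update loop is the shape a pacing or bid-optimization bandit would take, with real auction features and observed conversions in place of the simulated ones.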

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

Haryana

On-site

As a Data Analyst at our organization, you will play a crucial role in analyzing data to identify trends, patterns, and insights that inform business decisions. Your responsibilities will include setting up robust automated dashboards for performance management, developing and maintaining databases, and preparing reports for management that highlight trends, patterns, and predictions based on relevant data.

To succeed in this role, you should possess strong problem-solving skills, advanced analytical skills with expertise in Excel, SQL, and Hive, and experience in handling large-scale datasets efficiently. Additionally, you should have excellent communication and project management abilities, along with the capability to interact with and influence business stakeholders effectively. Experience with web analytics platforms is a plus, and the ideal candidate will have 3-6 years of professional experience.

Joining our team will provide you with the opportunity to be part of the largest fintech lending play in India. You will work in a fun, energetic, and once-in-a-lifetime environment that fosters your career growth and enables you to achieve your best possible outcome. With over 500 million registered users and 21 million merchants in our ecosystem, we are uniquely positioned to democratize credit for deserving consumers and merchants. You will be part of India's largest digital lending story and contribute to our commitment to this mission. Seize this opportunity to be a key player in our story!
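As a rough illustration of the Hive-based reporting work described above, the sketch below pulls a 30-day trend into pandas for a dashboard refresh; the host, table, and column names are hypothetical placeholders:

```python
# Sketch: pulling a daily trend from Hive into pandas for dashboarding.
# Host, table, and column names are hypothetical placeholders.
import pandas as pd
from pyhive import hive

conn = hive.Connection(host="hive-gateway.example.com", port=10000, username="analyst")

query = """
    SELECT txn_date, COUNT(*) AS txns, SUM(amount) AS gmv
    FROM payments.transactions
    WHERE txn_date >= date_sub(current_date, 30)
    GROUP BY txn_date
    ORDER BY txn_date
"""
df = pd.read_sql(query, conn)  # result frame feeds an automated dashboard refresh
print(df.tail())
```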

Posted 3 days ago

Apply

5.0 years

0 Lacs

India

Remote

Where you’ll work: India (Remote)

Engineering at GoTo
We’re the trailblazers of remote work technology. We build powerful, flexible work software that empowers everyone to live their best life, at work and beyond. And blaze even more trails along the way. There’s ample room for growth – so you can blaze your own trail here too. When you join a GoTo product team, you’ll take on a key role in this process and see your work be used by millions of users worldwide.

Your Day to Day
As a Senior Data Engineer, you would:
Design and Develop Pipelines: Build robust, scalable, and efficient ETL/ELT data pipelines to process structured data from diverse sources.
Big Data Processing: Develop and optimize large-scale data workflows using Apache Spark, with strong hands-on experience in building ETL pipelines.
Cloud-Native Data Solutions: Architect and implement data solutions using AWS services such as S3, EMR, Lambda, and EKS.
Data Governance: Manage and govern data using catalogs like Hive or Unity Catalog; ensure strong data lineage, access controls, and metadata management.
Workflow Orchestration: Schedule, monitor, and orchestrate workflows using Apache Airflow or similar tools.
Data Quality & Monitoring: Implement quality checks, logging, monitoring, and alerting to ensure pipeline reliability and visibility (a minimal Spark sketch follows this listing).
Cross-Functional Collaboration: Partner with analysts, data scientists, and business stakeholders to deliver high-quality data for applications and enable self-service BI.
Compliance & Security: Uphold best practices in data governance, security, and compliance across the data ecosystem.
Mentorship & Standards: Mentor junior engineers and help evolve engineering practices including CI/CD, testing, and documentation.

What We’re Looking For
As a Senior Data Engineer, your background will look like:
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
5+ years of experience in data engineering or software development, with a proven record of maintaining production-grade pipelines.
Proficient in Python and SQL for data transformation and analytics.
Strong expertise in Apache Spark, including data lake management, ACID transactions, schema enforcement/evolution, and time travel.
In-depth knowledge of AWS services—especially S3, EMR, Lambda, and EKS—with a solid grasp of cloud architecture and security best practices.
Solid data modeling skills (dimensional, normalized) and an understanding of data warehousing and lakehouse paradigms.
Experience with BI tools like Tableau or Power BI.
Familiar with setting up data quality, monitoring, and observability frameworks.
Excellent communication and collaboration skills, with the ability to thrive in an agile and multicultural team environment.

Nice to Have
Experience working on the Databricks Platform
Knowledge of Delta or Apache Iceberg file formats
Passion for Machine Learning and AI; enthusiasm to explore and apply intelligent systems.

What We Offer
At GoTo, we believe in supporting our employees with a comprehensive range of benefits designed to fit your life—at work and beyond.
Here are just some of the benefits and perks you can expect when you join our team:
Comprehensive health benefits, life and disability insurance, and fertility and family-forming support program
Generous paid time off, paid holidays, volunteer time off, and quarterly self-care days and no meeting days
Tuition and reading reimbursement programs to support your continuous learning and professional growth
Thrive Global Wellness Program, confidential Employee Assistance Program (EAP), as well as One to One Wellness Coaching
Employee programs—including Employee Resource Groups (ERGs), GoTo Gives, and our charitable matching program—to amplify your connection and impact
Registered Retirement Savings Plan (RRSP) to help you plan for your future
GoTo performance bonus program to celebrate your impact and contributions
Monthly remote work stipend to support your home office expenses

At GoTo, you’ll find the flexibility, resources, and support you need to thrive—at work, at home, and everywhere in between. You’ll work towards a shared goal with an open-minded, cohesive team that’s greater than the sum of its parts. We’re committed to creating an inclusive space for everyone, because we know unique perspectives make us a stronger company and community. Join us and be part of a company that invests in your future, where together we’ll Be Real, Think Big, Move Fast, Keep Growing, and stay Customer Obsessed. Learn more.
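As referenced in the listing above, a minimal PySpark batch job with a simple data-quality gate might look like this sketch; the paths, column names, and 1% threshold are illustrative assumptions, not GoTo's actual pipeline:

```python
# Sketch: small PySpark ETL job with a basic data-quality check.
# Source/target paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_etl").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/orders/2024-01-01/")

cleaned = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)
       .withColumn("order_date", F.to_date("created_at"))
)

# Data-quality gate: fail the run if more than 1% of rows have a null key,
# so a bad upstream extract never reaches the curated zone.
null_keys = cleaned.filter(F.col("customer_id").isNull()).count()
if null_keys > 0.01 * cleaned.count():
    raise ValueError(f"DQ check failed: {null_keys} rows with null customer_id")

cleaned.write.mode("overwrite").partitionBy("order_date") \
    .parquet("s3://example-bucket/curated/orders/")
```

In an orchestrated setup, a job like this would run as an Airflow task, with the raised exception surfacing as a failed DAG run for alerting.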

Posted 3 days ago

Apply

2.0 - 9.0 years

0 Lacs

Karnataka

On-site

We are seeking Data Architects, Senior Data Architects, and Principal Data Architects to join our team. In this role, you will combine hands-on contribution, customer engagement, and technical team management. As a Data Architect, your responsibilities will include designing, architecting, deploying, and maintaining solutions on the MS Azure platform using various Cloud & Big Data technologies. You will manage the full life-cycle of Data Lake / Big Data solutions, from requirement gathering and analysis to platform selection, architecture design, and deployment. You will be responsible for implementing scalable solutions on the Cloud and collaborating with a team of business domain experts, data scientists, and application developers to develop Big Data solutions. Moreover, you will be expected to explore and learn new technologies for creative problem solving and to mentor a team of Data Engineers.

The ideal candidate should possess strong hands-on experience in implementing Data Lakes with technologies such as Data Factory (ADF), ADLS, Databricks, Azure Synapse Analytics, Event Hub & Streaming Analytics, Cosmos DB, and Purview. Additionally, experience with big data technologies like Hadoop (CDH or HDP), Spark, Airflow, NiFi, Kafka, Hive, HBase, MongoDB, Neo4J, Elastic Search, Impala, Sqoop, etc., is required. Proficiency in programming and debugging in Python and Scala/Java is essential, with experience in building REST services considered beneficial. Candidates should also have experience supporting BI and Data Science teams in consuming data in a secure and governed manner, along with a good understanding of CI/CD with Git and Jenkins / Azure DevOps. Experience in setting up cloud-computing infrastructure solutions, hands-on experience/exposure to NoSQL databases, and data modelling in Hive are all highly valued. Applicants should have a minimum of 9 years of technical experience, with at least 5 years on MS Azure and 2 years on Hadoop (CDH/HDP).

Posted 3 days ago

Apply

9.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description
Tech Lead – Azure/Snowflake & AWS Migration

Key Responsibilities
Design and develop scalable data pipelines using Snowflake as the primary data platform, integrating with tools like Azure Data Factory, Synapse Analytics, and AWS services.
Build robust, efficient SQL and Python-based data transformations for cleansing, enrichment, and integration of large-scale datasets.
Lead migration initiatives from AWS-based data platforms to a Snowflake-centered architecture, including:
Rebuilding AWS Glue pipelines in Azure Data Factory or using Snowflake-native ELT approaches.
Migrating EMR Spark jobs to Snowflake SQL or Python-based pipelines.
Migrating Redshift workloads to Snowflake with schema conversion and performance optimization.
Transitioning S3-based data lakes (Hudi, Hive) to Snowflake external tables via ADLS Gen2 or Azure Blob Storage.
Redirecting Kinesis/MSK streaming data to Azure Event Hubs, followed by ingestion into Snowflake using Streams & Tasks or Snowpipe (a minimal sketch follows this listing).
Support database migrations from AWS RDS (Aurora PostgreSQL, MySQL, Oracle) to Snowflake, focusing on schema translation, compatibility handling, and data movement at scale.
Design modern Snowflake lakehouse-style architectures that incorporate raw, staging, and curated zones, with support for time travel, cloning, zero-copy restore, and data sharing.
Integrate Azure Functions or Logic Apps with Snowflake for orchestration and trigger-based automation.
Implement security best practices, including Azure Key Vault integration and Snowflake role-based access control, data masking, and network policies.
Optimize Snowflake performance and costs using clustering, multi-cluster warehouses, materialized views, and result caching.
Support CI/CD processes for Snowflake pipelines using Git, Azure DevOps or GitHub Actions, and SQL code versioning.
Maintain well-documented data engineering workflows, architecture diagrams, and technical documentation to support collaboration and long-term platform maintainability.

Required Qualifications
9+ years of data engineering experience, with 3+ years on the Microsoft Azure stack and hands-on Snowflake expertise.
Proficiency in: Python for scripting and ETL orchestration; SQL for complex data transformation and performance tuning in Snowflake; Azure Data Factory and Synapse Analytics (SQL Pools).
Experience in migrating workloads from AWS to Azure/Snowflake, including services such as Glue, EMR, Redshift, Lambda, Kinesis, S3, and MSK.
Strong understanding of cloud architecture and hybrid data environments across AWS and Azure.
Hands-on experience with database migration, schema conversion, and tuning in PostgreSQL, MySQL, and Oracle RDS.
Familiarity with Azure Event Hubs, Logic Apps, and Key Vault.
Working knowledge of CI/CD, version control (Git), and DevOps principles applied to data engineering workloads.

Preferred Qualifications
Extensive experience with Snowflake Streams, Tasks, Snowpipe, external tables, and data sharing.
Exposure to MSK-to-Event Hubs migration and streaming data integration into Snowflake.
Familiarity with Terraform or ARM templates for Infrastructure-as-Code (IaC) in Azure environments.
Certification such as SnowPro Core, Azure Data Engineer Associate, or equivalent.

Skills: Azure, AWS Redshift, Athena, Azure Data Lake
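As referenced in the listing above, the Streams & Tasks ingestion pattern might look like the following minimal sketch using the Snowflake Python connector; all object names, columns, and the schedule are placeholders rather than the project's real configuration:

```python
# Sketch: change-data ingestion with a Snowflake stream + scheduled task.
# All object and column names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
)
cur = conn.cursor()

# Capture new rows landing in the raw layer (e.g., fed by Snowpipe from Event Hubs).
cur.execute("CREATE STREAM IF NOT EXISTS raw_events_stream ON TABLE raw_events")

# A task that moves new changes into the curated layer every 5 minutes,
# running only when the stream actually has data.
cur.execute("""
CREATE TASK IF NOT EXISTS merge_events_task
  WAREHOUSE = ETL_WH
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('RAW_EVENTS_STREAM')
AS
  INSERT INTO curated.events (event_id, payload, ingested_at)
  SELECT event_id, payload, ingested_at
  FROM raw_events_stream
  WHERE METADATA$ACTION = 'INSERT'
""")
cur.execute("ALTER TASK merge_events_task RESUME")  # tasks are created suspended
```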

Posted 3 days ago

Apply

9.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description
Tech Lead – Azure/Snowflake & AWS Migration and Senior Data Engineer – Azure/Snowflake Migration

Both roles share the responsibilities and qualifications below.

Key Responsibilities
Design and develop scalable data pipelines using Snowflake as the primary data platform, integrating with tools like Azure Data Factory, Synapse Analytics, and AWS services.
Build robust, efficient SQL and Python-based data transformations for cleansing, enrichment, and integration of large-scale datasets.
Lead migration initiatives from AWS-based data platforms to a Snowflake-centered architecture, including:
Rebuilding AWS Glue pipelines in Azure Data Factory or using Snowflake-native ELT approaches.
Migrating EMR Spark jobs to Snowflake SQL or Python-based pipelines.
Migrating Redshift workloads to Snowflake with schema conversion and performance optimization.
Transitioning S3-based data lakes (Hudi, Hive) to Snowflake external tables via ADLS Gen2 or Azure Blob Storage.
Redirecting Kinesis/MSK streaming data to Azure Event Hubs, followed by ingestion into Snowflake using Streams & Tasks or Snowpipe.
Support database migrations from AWS RDS (Aurora PostgreSQL, MySQL, Oracle) to Snowflake, focusing on schema translation, compatibility handling, and data movement at scale.
Design modern Snowflake lakehouse-style architectures that incorporate raw, staging, and curated zones, with support for time travel, cloning, zero-copy restore, and data sharing.
Integrate Azure Functions or Logic Apps with Snowflake for orchestration and trigger-based automation.
Implement security best practices, including Azure Key Vault integration and Snowflake role-based access control, data masking, and network policies.
Optimize Snowflake performance and costs using clustering, multi-cluster warehouses, materialized views, and result caching.
Support CI/CD processes for Snowflake pipelines using Git, Azure DevOps or GitHub Actions, and SQL code versioning.
Maintain well-documented data engineering workflows, architecture diagrams, and technical documentation to support collaboration and long-term platform maintainability.

Required Qualifications
9+ years (Tech Lead) or 7+ years (Senior Data Engineer) of data engineering experience, with 3+ years on the Microsoft Azure stack and hands-on Snowflake expertise.
Proficiency in: Python for scripting and ETL orchestration; SQL for complex data transformation and performance tuning in Snowflake; Azure Data Factory and Synapse Analytics (SQL Pools).
Experience in migrating workloads from AWS to Azure/Snowflake, including services such as Glue, EMR, Redshift, Lambda, Kinesis, S3, and MSK.
Strong understanding of cloud architecture and hybrid data environments across AWS and Azure.
Hands-on experience with database migration, schema conversion, and tuning in PostgreSQL, MySQL, and Oracle RDS.
Familiarity with Azure Event Hubs, Logic Apps, and Key Vault.
Working knowledge of CI/CD, version control (Git), and DevOps principles applied to data engineering workloads.

Preferred Qualifications
Extensive experience with Snowflake Streams, Tasks, Snowpipe, external tables, and data sharing.
Exposure to MSK-to-Event Hubs migration and streaming data integration into Snowflake.
Familiarity with Terraform or ARM templates for Infrastructure-as-Code (IaC) in Azure environments.
Certification such as SnowPro Core, Azure Data Engineer Associate, or equivalent.

Skills: AWS, Azure Data Lake, Python

Posted 3 days ago

Apply

0.0 years

0 Lacs

Varthur, Bengaluru, Karnataka

On-site

Outer Ring Road, Devarabisanahalli Vlg, Varthur Hobli, Bldg 2A, Twr 3, Phs 1, Bangalore, IN, 560103

Job Description: Application Developer
Bangalore, Karnataka, India

AXA XL offers risk transfer and risk management solutions to clients globally. We offer worldwide capacity, flexible underwriting solutions, a wide variety of client-focused loss prevention services and a team-based account management approach. AXA XL recognizes data and information as critical business assets, both in terms of managing risk and enabling new business opportunities. This data should not only be high quality, but also actionable – enabling AXA XL’s executive leadership team to maximize benefits and facilitate sustained advantage.

What you’ll be DOING
What will your essential responsibilities include?

We are seeking an experienced ETL Developer to support and evolve our enterprise data integration workflows. The ideal candidate will have deep expertise in Informatica PowerCenter, strong hands-on experience with Azure Data Factory and Databricks, and a passion for building scalable, reliable ETL pipelines. This role is critical for both day-to-day operational reliability and long-term modernization of our data engineering stack in the Azure cloud.

Key Responsibilities:
Maintain, monitor, and troubleshoot existing Informatica PowerCenter ETL workflows to ensure operational reliability and data accuracy.
Enhance and extend ETL processes to support new data sources, updated business logic, and scalability improvements.
Develop and orchestrate PySpark notebooks in Azure Databricks for data transformation, cleansing, and enrichment.
Configure and manage Databricks clusters for performance optimization and cost efficiency.
Implement Delta Lake solutions that support ACID compliance, versioning, and time travel for reliable data lake operations (a minimal sketch follows this list).
Automate data workflows using Databricks Jobs and Azure Data Factory (ADF) pipelines.
Design and manage scalable ADF pipelines, including parameterized workflows and reusable integration patterns.
Integrate with Azure Blob Storage and ADLS Gen2 using Spark APIs for high-performance data ingestion and output.
Ensure data quality, consistency, and governance across legacy and cloud-based pipelines.
Collaborate with data analysts, engineers, and business teams to deliver clean, validated data for reporting and analytics.
Participate in the full Software Development Life Cycle (SDLC) from design through deployment, with an emphasis on maintainability and audit readiness.
Develop maintainable and efficient ETL logic and scripts following best practices in security and performance.
Troubleshoot pipeline issues across data infrastructure layers, identifying and resolving root causes to maintain reliability.
Create and maintain clear documentation of technical designs, workflows, and data processing logic for long-term maintainability and knowledge sharing.
Stay informed on emerging cloud and data engineering technologies to recommend improvements and drive innovation.
Follow internal controls, audit protocols, and secure data handling procedures to support compliance and operational standards.
Provide accurate time and effort estimates for assigned development tasks, accounting for complexity and risk.
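The Delta Lake bullet above (ACID upsert plus time travel) could look like this minimal PySpark sketch; it assumes a Databricks-style environment with delta-spark available, and the storage paths and columns are hypothetical:

```python
# Sketch: Delta Lake upsert (MERGE) and time travel read.
# Paths and columns are placeholders; assumes delta-spark is installed.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("policy_delta_demo").getOrCreate()

updates = spark.read.parquet("abfss://raw@example.dfs.core.windows.net/policies/")

target = DeltaTable.forPath(
    spark, "abfss://curated@example.dfs.core.windows.net/policies/"
)

# ACID upsert: update matched policies, insert new ones, atomically.
(target.alias("t")
    .merge(updates.alias("u"), "t.policy_id = u.policy_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute())

# Time travel: read the table as of an earlier version for audit or rollback.
previous = (spark.read.format("delta")
            .option("versionAsOf", 0)
            .load("abfss://curated@example.dfs.core.windows.net/policies/"))
```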
What you will BRING
We’re looking for someone who has these abilities and skills:

Advanced experience with Informatica PowerCenter, including mappings, workflows, session tuning, and parameterization.
Expertise in Azure Databricks + PySpark, including: notebook development; cluster configuration and tuning; Delta Lake (ACID, versioning, time travel); job orchestration via Databricks Jobs or ADF; and integration with Azure Blob Storage and ADLS Gen2 using Spark APIs.
Strong hands-on experience with Azure Data Factory: building and managing pipelines, parameterization and dynamic datasets, notebook integration, and pipeline monitoring.
Proficiency in SQL, PL/SQL, and scripting languages such as Python, Bash, or PowerShell.
Strong understanding of data warehousing, dimensional modeling, and data profiling.
Familiarity with Git, CI/CD pipelines, and modern DevOps practices.
Working knowledge of data governance, audit trails, metadata management, and compliance standards such as HIPAA and GDPR.
Effective problem-solving and troubleshooting skills with the ability to resolve performance bottlenecks and job failures.
Awareness of Azure Functions, App Services, API Management, and Application Insights.
Understanding of Azure Key Vault for secrets and credential management.
Familiarity with Spark-based big data ecosystems (e.g., Hive, Kafka) is a plus.

Who WE are
AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals and even some inspirational individuals we don’t just provide re/insurance, we reinvent it. How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business: property, casualty, professional, financial lines and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com

What we OFFER
Inclusion
AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture enables business growth and is critical to our success. That’s why we have made a strategic commitment to attract, develop, advance and retain the most inclusive workforce possible, and create a culture where everyone can bring their full selves to work and reach their highest potential. It’s about helping one another — and our business — to move forward and succeed.

Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability and inclusion with 20 Chapters around the globe
Robust support for Flexible Working Arrangements
Enhanced family-friendly leave benefits
Named to the Diversity Best Practices Index
Signatory to the UK Women in Finance Charter

Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer.

Total Rewards
AXA XL’s Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle and financial security. It provides competitive compensation and personalized, inclusive benefits that evolve as you do. We’re committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence.
Sustainability
At AXA XL, sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 sustainability strategy, called “Roots of resilience”, focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations.

Our Pillars:

Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems – the foundation of a sustainable planet and society – are essential to our future. We’re committed to protecting and restoring nature – from mangrove forests to the bees in our backyard – by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans.

Addressing climate change: The effects of a changing climate are far-reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption. We're building a net zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions.

Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We’re training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting.

AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL’s “Hearts in Action” programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day – the Global Day of Giving.

For more information, please see axaxl.com/sustainability.

Posted 3 days ago

Apply

0.0 - 5.0 years

0 Lacs

Pune, Maharashtra

Remote

R022242 Pune, Maharashtra, India Engineering Regular Location Details: Pune, India This is a hybrid position. You’ll divide your time between working remotely from your home and an office, so you should live within commuting distance. Hybrid teams may work in-office as much as a few times a week or as little as once a month or quarter, as decided by leadership. The hiring manager can share more about what hybrid work might look like for this team Join our Team Are you excited about building world-class software solutions that empower millions of customers globally? At GoDaddy, our engineers are at the forefront of developing innovative platforms that drive our core domain businesses. We are seeking a skilled Senior Machine Learning Scientist to join our Domain Search team, where you will design, build, and maintain the foundational systems and services powering GoDaddy’s search, ML, and GenAI platforms. In this role, you will develop and apply machine learning and LLM-based methods to improve our customers’ search experience and play a major part in improving the search page across all markets we serve. Whether you’re passionate about crafting highly scalable systems or developing seamless customer experiences, your work will be critical to ensuring performance, scalability, and reliability for our customers worldwide. Join us and help craft the future of software at GoDaddy! What you'll get to do... Work with the latest deep learning and search technologies to develop and optimize advanced machine learning models to improve our customers’ experience Be self-driven, understand the data we have, and provide data-driven insights to all of our challenges Mine datasets to develop features and models to improve search relevance and ranking algorithms Design and analyze experiments to test new product ideas Understand patterns and insights about what our users search for and purchase to help personalize our recommendations Your experience should include... 5 years of industry experience in deep learning and software development Skilled in machine learning, statistics, and natural language processing (NLP) Proficient with deep learning frameworks such as PyTorch and handling large datasets Experienced in programming languages like Python, Java, or similar Familiar with large-scale data analytics using Spark You might also have... Ph.D. in a related field preferred Experience with Amazon AWS, containerized solutions, and both SQL and NoSQL databases Strong understanding of software security standard processes Experience with Hadoop technologies such as Spark, Hive, and other big data tools; data analytics and machine learning experience are a plus Experience with Elastic Search and search technologies is a plus, with a passion for developing innovative solutions for real-world business problems We've got your back... We offer a range of total rewards that may include paid time off, retirement savings (e.g., 401k, pension schemes), bonus/incentive eligibility, equity grants, participation in our employee stock purchase plan, competitive health benefits, and other family-friendly benefits including parental leave. GoDaddy’s benefits vary based on individual role and location and can be reviewed in more detail during the interview process We also embrace our diverse culture and offer a range of Employee Resource Groups (Culture). Have a side hustle? No problem. We love entrepreneurs! Most importantly, come as you are and make your own way About us... 
GoDaddy is empowering everyday entrepreneurs around the world by providing the help and tools to succeed online, making opportunity more inclusive for all. GoDaddy is the place people come to name their idea, build a professional website, attract customers, sell their products and services, and manage their work. Our mission is to give our customers the tools, insights, and people to transform their ideas and personal initiative into success. To learn more about the company, visit About Us At GoDaddy, we know diverse teams build better products—period. Our people and culture reflect and celebrate that sense of diversity and inclusion in ideas, experiences and perspectives. But we also know that’s not enough to build true equity and belonging in our communities. That’s why we prioritize integrating diversity, equity, inclusion and belonging principles into the core of how we work every day—focusing not only on our employee experience, but also our customer experience and operations. It’s the best way to serve our mission of empowering entrepreneurs everywhere, and making opportunity more inclusive for all. To read more about these commitments, as well as our representation and pay equity data, check out our Diversity and Pay Parity annual report which can be found on our Diversity Careers page GoDaddy is proud to be an equal opportunity employer . GoDaddy will consider for employment qualified applicants with criminal histories in a manner consistent with local and federal requirements. Refer to our full EEO policy Our recruiting team is available to assist you in completing your application. If they could be helpful, please reach out to myrecruiter@godaddy.com GoDaddy doesn’t accept unsolicited resumes from recruiters or employment agencies
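As a toy illustration of the search-relevance and ranking work this role involves, the sketch below scores candidate suggestions against a query with TF-IDF cosine similarity; real systems would use learned ranking models over far richer features, and the data here is invented:

```python
# Sketch: toy relevance ranking of search suggestions via TF-IDF cosine similarity.
# Candidate list and query are illustrative, not production data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

candidates = [
    "handmade candles shop", "artisan candle studio",
    "best coffee beans", "candle making supplies",
]
query = ["handmade candle store"]

vec = TfidfVectorizer(ngram_range=(1, 2))
cand_matrix = vec.fit_transform(candidates)   # vocabulary fit on the candidates
query_vec = vec.transform(query)

scores = cosine_similarity(query_vec, cand_matrix).ravel()
for name, score in sorted(zip(candidates, scores), key=lambda x: -x[1]):
    print(f"{score:.3f}  {name}")
```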

Posted 3 days ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra

Remote

R022243 Pune, Maharashtra, India Engineering Regular Location Details: Pune, India This is a hybrid position. You’ll divide your time between working remotely from your home and an office, so you should live within commuting distance. Hybrid teams may work in-office as much as a few times a week or as little as once a month or quarter, as decided by leadership. The hiring manager can share more about what hybrid work might look like for this team Join our Team Are you excited about building world-class software solutions that power millions of customers globally? At GoDaddy, our engineers are at the forefront of crafting innovative platforms that drive our core domain businesses, and we’re looking for dedicated professionals to help us craft the future of software. Whether you’re passionate about developing highly scalable systems, seamless customer experiences, or advanced machine learning and LLM-based methods to improve the search experience, we have a place for you! As part of our Domain Search, Registrars, and Investors teams, you’ll work on impactful products like our domain name search engine, registration and management services, high-scale DNS, investor experience, and personalization through ML models. You’ll play a key role in improving the search page for customers worldwide, owning the design, code, and data quality of your products end-to-end. We value strong software engineers with experience in microservices, cloud computing, distributed systems, data processing, and customer focus—and we’re flexible regarding your technology background. Join a small, high-impact team of dedicated engineers as we build and iterate upon the world’s largest domain name registrar services and secondary marketplace What you'll get to do... Develop and maintain scalable, cloud-ready applications and APIs, contributing across the full technology stack, including persistence and service layers Leverage data analytics and ETL processes to transform, enrich, and improve product and customer experience in both batch and streaming scenarios Ensure high code quality through unit/integration testing, code reviews, and consistency with standard methodologies Lead technical projects through architecture, design, and implementation phases, solving end-to-end problems Collaborate effectively with distributed teams Your experience should include... 2+ years of industrial experience with a strong background in deep learning and software development Skilled in machine learning, statistics, and natural language processing (NLP) Hands-on experience with deep learning frameworks such as PyTorch and working with large datasets Proficient in programming languages such as Python or Java Familiar with large-scale data analytics using Spark You might also have... Experience with AWS and containerized solutions Proficient in both SQL and NoSQL databases Strong understanding of software security standard processes Experience with Hadoop technologies (e.g., Spark, Hive) and big data analytics; ML and search technologies (e.g., Elastic Search) are a plus We've got your back... We offer a range of total rewards that may include paid time off, retirement savings (e.g., 401k, pension schemes), bonus/incentive eligibility, equity grants, participation in our employee stock purchase plan, competitive health benefits, and other family-friendly benefits including parental leave. 
GoDaddy’s benefits vary based on individual role and location and can be reviewed in more detail during the interview process We also embrace our diverse culture and offer a range of Employee Resource Groups (Culture). Have a side hustle? No problem. We love entrepreneurs! Most importantly, come as you are and make your own way About us... GoDaddy is empowering everyday entrepreneurs around the world by providing the help and tools to succeed online, making opportunity more inclusive for all. GoDaddy is the place people come to name their idea, build a professional website, attract customers, sell their products and services, and manage their work. Our mission is to give our customers the tools, insights, and people to transform their ideas and personal initiative into success. To learn more about the company, visit About Us At GoDaddy, we know diverse teams build better products—period. Our people and culture reflect and celebrate that sense of diversity and inclusion in ideas, experiences and perspectives. But we also know that’s not enough to build true equity and belonging in our communities. That’s why we prioritize integrating diversity, equity, inclusion and belonging principles into the core of how we work every day—focusing not only on our employee experience, but also our customer experience and operations. It’s the best way to serve our mission of empowering entrepreneurs everywhere, and making opportunity more inclusive for all. To read more about these commitments, as well as our representation and pay equity data, check out our Diversity and Pay Parity annual report which can be found on our Diversity Careers page GoDaddy is proud to be an equal opportunity employer . GoDaddy will consider for employment qualified applicants with criminal histories in a manner consistent with local and federal requirements. Refer to our full EEO policy Our recruiting team is available to assist you in completing your application. If they could be helpful, please reach out to myrecruiter@godaddy.com GoDaddy doesn’t accept unsolicited resumes from recruiters or employment agencies

Posted 3 days ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Job Description: Application Developer Bangalore, Karnataka, India AXA XL offers risk transfer and risk management solutions to clients globally. We offer worldwide capacity, flexible underwriting solutions, a wide variety of client-focused loss prevention services and a team-based account management approach. AXA XL recognizes data and information as critical business assets, both in terms of managing risk and enabling new business opportunities. This data should not only be high quality, but also actionable – enabling AXA XL’s executive leadership team to maximize benefits and facilitate sustained advantage. What you’ll be DOING What will your essential responsibilities include? We are seeking an experienced ETL Developer to support and evolve our enterprise data integration workflows. The ideal candidate will have deep expertise in Informatica PowerCenter, strong hands-on experience with Azure Data Factory and Databricks, and a passion for building scalable, reliable ETL pipelines. This role is critical for both day-to-day operational reliability and long-term modernization of our data engineering stack in the Azure cloud. Key Responsibilities: Maintain, monitor, and troubleshoot existing Informatica PowerCenter ETL workflows to ensure operational reliability and data accuracy. Enhance and extend ETL processes to support new data sources, updated business logic, and scalability improvements. Develop and orchestrate PySpark notebooks in Azure Databricks for data transformation, cleansing, and enrichment. Configure and manage Databricks clusters for performance optimization and cost efficiency. Implement Delta Lake solutions that support ACID compliance, versioning, and time travel for reliable data lake operations. Automate data workflows using Databricks Jobs and Azure Data Factory (ADF) pipelines. Design and manage scalable ADF pipelines, including parameterized workflows and reusable integration patterns. Integrate with Azure Blob Storage and ADLS Gen2 using Spark APIs for high-performance data ingestion and output. Ensure data quality, consistency, and governance across legacy and cloud-based pipelines. Collaborate with data analysts, engineers, and business teams to deliver clean, validated data for reporting and analytics. Participate in the full Software Development Life Cycle (SDLC) from design through deployment, with an emphasis on maintainability and audit readiness. Develop maintainable and efficient ETL logic and scripts following best practices in security and performance. Troubleshoot pipeline issues across data infrastructure layers, identifying and resolving root causes to maintain reliability. Create and maintain clear documentation of technical designs, workflows, and data processing logic for long-term maintainability and knowledge sharing. Stay informed on emerging cloud and data engineering technologies to recommend improvements and drive innovation. Follow internal controls, audit protocols, and secure data handling procedures to support compliance and operational standards. Provide accurate time and effort estimates for assigned development tasks, accounting for complexity and risk. 
What you will BRING
We’re looking for someone who has these abilities and skills:

Advanced experience with Informatica PowerCenter, including mappings, workflows, session tuning, and parameterization.
Expertise in Azure Databricks + PySpark, including: notebook development; cluster configuration and tuning; Delta Lake (ACID, versioning, time travel); job orchestration via Databricks Jobs or ADF; and integration with Azure Blob Storage and ADLS Gen2 using Spark APIs.
Strong hands-on experience with Azure Data Factory: building and managing pipelines, parameterization and dynamic datasets, notebook integration, and pipeline monitoring.
Proficiency in SQL, PL/SQL, and scripting languages such as Python, Bash, or PowerShell.
Strong understanding of data warehousing, dimensional modeling, and data profiling.
Familiarity with Git, CI/CD pipelines, and modern DevOps practices.
Working knowledge of data governance, audit trails, metadata management, and compliance standards such as HIPAA and GDPR.
Effective problem-solving and troubleshooting skills with the ability to resolve performance bottlenecks and job failures.
Awareness of Azure Functions, App Services, API Management, and Application Insights.
Understanding of Azure Key Vault for secrets and credential management.
Familiarity with Spark-based big data ecosystems (e.g., Hive, Kafka) is a plus.

Who WE are
AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals and even some inspirational individuals we don’t just provide re/insurance, we reinvent it. How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business: property, casualty, professional, financial lines and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com

What we OFFER
Inclusion
AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture enables business growth and is critical to our success. That’s why we have made a strategic commitment to attract, develop, advance and retain the most inclusive workforce possible, and create a culture where everyone can bring their full selves to work and reach their highest potential. It’s about helping one another — and our business — to move forward and succeed.

Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability and inclusion with 20 Chapters around the globe
Robust support for Flexible Working Arrangements
Enhanced family-friendly leave benefits
Named to the Diversity Best Practices Index
Signatory to the UK Women in Finance Charter

Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer.

Total Rewards
AXA XL’s Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle and financial security. It provides competitive compensation and personalized, inclusive benefits that evolve as you do. We’re committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence.
Sustainability
At AXA XL, sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 sustainability strategy, called “Roots of resilience”, focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations.

Our Pillars:

Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems – the foundation of a sustainable planet and society – are essential to our future. We’re committed to protecting and restoring nature – from mangrove forests to the bees in our backyard – by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans.

Addressing climate change: The effects of a changing climate are far-reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption. We're building a net zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions.

Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We’re training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting.

AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL’s “Hearts in Action” programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day – the Global Day of Giving.

For more information, please see axaxl.com/sustainability.

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a member of the Enterprise Data Platform (EDP) team at Macquarie, you will play a crucial role in managing Macquarie's Corporate Data Platform. The businesses supported by the platform rely on it for use cases such as data science, self-service analytics, operational analytics, and reporting. At Macquarie, we believe in leveraging the strengths of diverse individuals and empowering them to explore endless possibilities. With a global presence in 31 markets and a history of 56 years of unbroken profitability, you will join a collaborative and supportive team where every member contributes ideas and drives positive outcomes.

In this role, your responsibilities will include delivering new platform capabilities that utilize AWS and Kubernetes to enhance resilience and redefine how the business leverages the platform. You will deploy tools, introduce new technologies, and automate processes to improve efficiency. Additionally, you will focus on improving CI/CD pipelines, supporting platform applications, and ensuring smooth operations.

To be successful in this role, you should have at least 3 years of experience in Cloud, DevOps, or Data Engineering with hands-on proficiency in AWS and Kubernetes. You should also possess expertise in Big Data technologies like Hive, Spark, and Presto, along with strong scripting skills in Python and Bash. A background in DevOps, Agile, Scrum, and Continuous Delivery environments is essential, along with excellent communication skills to collaborate effectively with cross-functional teams. Your passion for problem-solving, continuous learning, and keen interest in Big Data and Cloud technologies will be invaluable in this role.

At Macquarie, we value individuals who are enthusiastic about building a better future with us. If you are excited about this opportunity and working at Macquarie, we encourage you to apply. As part of Macquarie, you will have access to a wide range of benefits such as wellbeing leave, paid parental leave, company-subsidized childcare services, volunteer leave, comprehensive medical and life insurance cover, employee assistance programs, learning and development opportunities, and flexible working arrangements.

Technology plays a critical role at Macquarie, enabling every aspect of our operations and driving innovation in connecting people and data, building platforms, and designing future technology solutions. Our commitment to diversity, equity, and inclusion is unwavering, and we aim to provide reasonable adjustments to support individuals who may require assistance during the recruitment process and in their working arrangements. If you need additional support, please inform us during the application process.
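For a flavor of the Kubernetes-side platform automation this role describes, here is a minimal sketch using the official Python client; the namespace and the health-report logic are assumptions for illustration, not Macquarie's tooling:

```python
# Sketch: report deployment health in a platform namespace via the Kubernetes API.
# Namespace and deployment names are hypothetical placeholders.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() when run in-cluster
apps = client.AppsV1Api()

for dep in apps.list_namespaced_deployment(namespace="data-platform").items:
    ready = dep.status.ready_replicas or 0
    want = dep.spec.replicas or 0
    flag = "OK" if ready == want else "DEGRADED"
    print(f"{flag:9} {dep.metadata.name}: {ready}/{want} replicas ready")
```

A script like this could run as a CI/CD smoke test or a scheduled check feeding an alerting channel.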

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

We are seeking experienced and talented engineers to join our team. Your main responsibilities will include designing, building, and maintaining the software that drives the global logistics industry. WiseTech Global is a leading provider of software for the logistics sector, facilitating connectivity for major companies like DHL and FedEx within their supply chains. Our organization is product- and engineer-focused, with a strong commitment to enhancing the functionality and quality of our software through continuous innovation. Our primary Research and Development center in Bangalore plays a pivotal role in our growth strategy and product development roadmap.

As a Lead Software Engineer, you will serve as a mentor, a leader, and an expert in your field. You should be adept at communicating effectively with senior management while remaining hands-on with the code to deliver effective solutions. The technical environment you will work in includes C#, Java, C++, Python, Scala, Spring, Spring Boot, Apache Spark, Hadoop, Hive, Delta Lake, Kafka, Debezium, GKE (Kubernetes Engine), Composer (Airflow), DataProc, DataStreams, DataFlow, MySQL RDBMS, MongoDB NoSQL (Atlas), UIPath, Helm, Flyway, Sterling, EDI, Redis, Elastic Search, Grafana Dashboard, and Docker.

Before applying, please note that WiseTech Global may engage external service providers to assess applications. By submitting your application and personal information, you agree to WiseTech Global sharing this data with external service providers, who will handle it confidentially in compliance with privacy and data protection laws.

Posted 3 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies