
17230 Spark Jobs - Page 12

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

0 years

0 Lacs

India

Remote

CSQ326R35

Mission

The AI Forward Deployed Engineering (AI FDE) team is a highly specialized, customer-facing AI team at Databricks. We deliver professional services engagements to help our customers build and productionize first-of-their-kind AI applications. We work cross-functionally to shape long-term strategic priorities and initiatives alongside engineering, product, and developer relations, and we support internal subject matter expert (SME) teams. We view our team as an ensemble: we look for individuals with strong, unique specializations that improve the overall strength of the team. This team is the right fit for you if you love working with customers and teammates, and fueling your curiosity for the latest trends in GenAI, LLMOps, and ML more broadly. This role can be remote.

The Impact You Will Have

- Develop cutting-edge GenAI solutions, incorporating the latest techniques from our Mosaic AI research to solve customer problems
- Own production rollouts of consumer- and internally-facing GenAI applications
- Serve as a trusted technical advisor to customers across a variety of domains
- Present at conferences such as Data + AI Summit, recognized as a thought leader internally and externally
- Collaborate cross-functionally with the product and engineering teams to influence priorities and shape the product roadmap

What We Look For

- Experience building GenAI applications (RAG, multi-agent systems, Text2SQL, fine-tuning, etc.) with tools such as HuggingFace, LangChain, and DSPy
- Expertise in deploying production-grade GenAI applications, including evaluation and optimization
- Extensive hands-on industry data science experience with common machine learning and data science tools, e.g. pandas, scikit-learn, and PyTorch
- Experience building production-grade machine learning deployments on AWS, Azure, or GCP
- Graduate degree in a quantitative discipline (Computer Science, Engineering, Statistics, Operations Research, etc.) or equivalent practical experience
- Experience communicating and/or teaching technical concepts to technical and non-technical audiences alike
- Passion for collaboration, lifelong learning, and driving business value through AI
- [Preferred] Experience using the Databricks Intelligence Platform and Apache Spark™ to process large-scale distributed datasets

About Databricks

Databricks is the data and AI company. More than 10,000 organizations worldwide — including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 — rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics, and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of the Lakehouse, Apache Spark™, Delta Lake, and MLflow. To learn more, follow Databricks on Twitter, LinkedIn, and Facebook.

Benefits

At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please visit https://www.mybenefitsnow.com/databricks.

Our Commitment to Diversity and Inclusion

At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics.

Compliance

If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.
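For context, the RAG pattern named in this posting can be sketched in a few lines: retrieve the documents most similar to a query, then prepend them to the generation prompt. The sketch below is a toy illustration under invented assumptions (a bag-of-words "embedding" and a two-document corpus), not Databricks' implementation; a real system would use a neural embedding model, a vector index, and an LLM for the generation step.

```python
from collections import Counter
import math

def embed(text):
    # Toy "embedding": bag-of-words term counts (real systems use an embedding model).
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query, corpus, k=1):
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    return sorted(corpus, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query, corpus):
    # "Augment" generation by prepending retrieved context to the prompt.
    context = "\n".join(retrieve(query, corpus, k=1))
    return f"Context:\n{context}\n\nQuestion: {query}"

corpus = [
    "Delta Lake provides ACID transactions on data lakes.",
    "MLflow tracks machine learning experiments and models.",
]
prompt = build_prompt("How are ML experiments tracked?", corpus)
print(prompt.splitlines()[1])  # the retrieved context line
```

An LLM answering from this prompt is grounded in the retrieved document rather than its parametric memory, which is the whole point of the pattern.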

Posted 2 days ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming operations through tech and analytics.

Do You Dream Big? We Need You.

Job Description
Job Title: Senior Data Scientist
Location: Bangalore
Reporting to: Senior Manager Analytics

1) Purpose of the role

We are seeking a highly skilled Senior Machine Learning Engineer / Senior Data Scientist to design, develop, and deploy advanced machine learning models and systems. The ideal candidate will have deep expertise in machine learning algorithms, data processing, and model deployment, with a proven track record of delivering scalable AI solutions in production environments. This role requires strong technical leadership, collaboration with cross-functional teams, and a passion for solving complex problems.

2) Key tasks & accountabilities

- Model Development: Design, develop, and optimize machine learning models for various applications, including but not limited to natural language processing, computer vision, and predictive analytics.
- Data Pipeline Management: Build and maintain robust data pipelines for preprocessing, feature engineering, and data augmentation to support model training and evaluation.
- Model Deployment: Deploy machine learning models into production environments, ensuring scalability, reliability, and performance using tools like Docker, Kubernetes, or cloud platforms (preferably Azure).
- Research and Innovation: Stay updated on the latest advancements in machine learning and AI, incorporating state-of-the-art techniques into projects to improve performance and efficiency.
- Collaboration: Work closely with data scientists, software engineers, product managers, and other stakeholders to translate business requirements into technical solutions.
- Performance Optimization: Monitor and optimize model performance, addressing issues like model drift, bias, and scalability challenges.
- Code Quality: Write clean, maintainable, and well-documented code, adhering to best practices for software development and version control (e.g., Git).
- Mentorship: Provide technical guidance and mentorship to junior engineers, fostering a culture of learning and innovation within the team.

3) Qualifications, Experience, Skills

Level of educational attainment required:
- Bachelor's or Master's degree in Computer Science, Data Science, Machine Learning, or a related field. A PhD is a plus.

Previous work experience:
- 5+ years of experience in machine learning, data science, or a related field.
- Proven experience in designing, training, and deploying machine learning models in production.
- Hands-on experience with cloud platforms (AWS, GCP, Azure) and containerization technologies (Docker, Kubernetes).

Technical skills required:
- Proficiency in Python and libraries/frameworks such as TensorFlow, PyTorch, scikit-learn, or Hugging Face.
- Strong understanding of machine learning algorithms (e.g., regression, classification, clustering, deep learning, reinforcement learning, optimization).
- Experience with big data technologies (e.g., Hadoop, Spark, or similar) and data processing pipelines.
- Familiarity with MLOps practices, including model versioning, monitoring, and CI/CD for ML workflows.
- Knowledge of software engineering principles, including object-oriented programming, API development, and microservices architecture.

Other skills required:
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration abilities.
- Ability to work in a fast-paced, dynamic environment and manage multiple priorities.
- Experience with generative AI models or large language models (LLMs).
- Familiarity with distributed computing or high-performance computing environments.

And above all of this, an undying love for beer!
We dream big to create a future with more cheers.
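The "model drift" monitoring named in this role's accountabilities is often quantified with the Population Stability Index (PSI), which compares the binned distribution of a feature at training time against its live distribution. The sketch below is a minimal pure-Python version under illustrative assumptions; the bin count, sample values, and the 0.25 "significant drift" threshold are conventional rules of thumb, not anything specific to this employer.

```python
import math

def psi(expected, actual, bins=4):
    # Population Stability Index between a training-time sample ("expected")
    # and a live sample ("actual"), using equal-width bins over the training range.
    lo, hi = min(expected), max(expected)
    edges = [lo + (hi - lo) * i / bins for i in range(1, bins)]

    def frac(values):
        counts = [0] * bins
        for v in values:
            counts[sum(v > e for e in edges)] += 1
        return [max(c / len(values), 1e-6) for c in counts]  # floor avoids log(0)

    e, a = frac(expected), frac(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

train = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8]
live_same = train[:]                    # identical distribution: no drift
live_shift = [v + 0.5 for v in train]   # shifted distribution: drift

print(round(psi(train, live_same), 4))  # 0.0
print(psi(train, live_shift) > 0.25)    # True: exceeds a common drift threshold
```

A monitoring job would compute this per feature on each scoring batch and alert (or trigger retraining) when the index crosses the chosen threshold.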

Posted 2 days ago

Apply

5.0 - 7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

FactSet creates flexible, open data and software solutions for over 200,000 investment professionals worldwide, providing instant access to financial data and analytics that investors use to make crucial decisions. At FactSet, our values are the foundation of everything we do. They express how we act and operate, serve as a compass in our decision-making, and play a big role in how we treat each other, our clients, and our communities. We believe that the best ideas can come from anyone, anywhere, at any time, and that curiosity is the key to anticipating our clients' needs and exceeding their expectations.

Senior Software Engineer

Group Description

Data Solutions - Platforms and Environments is the industry-leading content delivery platform. Clients seamlessly access organized and connected content that is easily discoverable, explorable, and procured via the FactSet Marketplace. Data is delivered via a variety of technologies and formats that meet the needs of our clients' workflows. By enabling our clients to use their preferred industry-standard databases, programming languages, and data visualization tools, we empower them to focus on the core competencies needed to drive their business. The Data Solutions - Platforms and Environments portfolio includes Standard DataFeed, Data Exploration, OnDemand (API), Views, Cornerstone, Exchange DataFeed, Benchmark Feeds, the Open:FactSet Marketplace, DataDictionary, Navigator, and other non-workstation initiatives.

Job Description

The Data Solutions - Platforms and Environments team is looking for a talented, highly motivated Senior Software Engineer (Full Stack Engineer) to join our Navigator Application initiatives, an important part of one of FactSet's highest-profile and most strategic areas of investment and development. As the Full Stack Senior Software Engineer, you will design and develop application features including UI, API, and database frameworks as well as data engineering pipelines, help implement improvements to existing pipelines and infrastructure, and provide production support. You will collaborate closely with Product Developers/Business Analysts to capture technical requirements. FactSet is happy to set up an information session with an engineer working on this product to talk about the product, the team, and the interview process.

What You'll Do

- Implement new components and application features for a client-facing application as a full stack developer
- Maintain and resolve bugs in existing components
- Contribute new features, fixes, and refactors to the existing code
- Perform code reviews and coach engineers on best practices
- Work with other engineers following a test-driven methodology in an agile environment
- Collaborate with other engineers and Product Developers in a Scrum Agile environment using Jira and Confluence
- Work as part of a geographically diverse team
- Create and review documentation and test plans
- Estimate task sizes and regularly communicate progress in daily standups and biweekly Scrum meetings
- Coordinate with other teams across offices and departments

What We're Looking For

- Bachelor's degree in Engineering or a relevant field required
- 5 to 7 years of relevant experience
- Expert-level proficiency writing and optimizing code in Python
- Proficiency in frontend technologies such as Vue.js (preferred) or React, and experience with JavaScript, CSS, and HTML
- Good knowledge of REST API development, preferably Python Flask and OpenAPI
- Good knowledge of relational databases, preferably MSSQL or Postgres
- Good knowledge of GenAI and vector databases is a plus
- Good understanding of general database design and architecture principles
- A realistic, pragmatic approach; can deliver functional prototypes that can be enhanced and optimized in later phases
- Strong written and verbal communication skills
- Working experience with AWS services: Lambda, EC2, S3, AWS Glue, etc.
- Strong working experience with any container/PaaS technology (Docker or Heroku)
- ETL and data pipelines experience is a plus
- Working experience with Apache Spark, Apache Airflow, or GraphQL is a plus
- Experience developing event-driven, distributed, serverless infrastructure (AWS Lambda, SNS-SQS) is a plus
- Must be a voracious learner

What's In It For You

At FactSet, our people are our greatest asset, and our culture is our biggest competitive advantage. Being a FactSetter means:

- The opportunity to join an S&P 500 company with over 45 years of sustainable growth powered by the entrepreneurial spirit of a start-up
- Support for your total well-being, including health, life, and disability insurance, retirement savings plans, a discounted employee stock purchase program, and paid time off for holidays, family leave, and company-wide wellness days
- Flexible work accommodations: we value work/life harmony and offer our employees a range of accommodations to help them achieve success both at work and in their personal lives
- A global community dedicated to volunteerism and sustainability, where collaboration is always encouraged and individuality drives solutions
- Career progression planning with dedicated time each month for learning and development
- Business Resource Groups open to all employees that serve as a catalyst for connection, growth, and belonging

Learn More About Our Benefits Here. Salary is just one component of our compensation package and is based on several factors including but not limited to education, work experience, and certifications.

Company Overview

FactSet (NYSE:FDS | NASDAQ:FDS) helps the financial community to see more, think bigger, and work better. Our digital platform and enterprise solutions deliver financial data, analytics, and open technology to more than 8,200 global clients, including over 200,000 individual users. Clients across the buy-side and sell-side, as well as wealth managers, private equity firms, and corporations, achieve more every day with our comprehensive and connected content, flexible next-generation workflow solutions, and client-centric specialized support. As a member of the S&P 500, we are committed to sustainable growth and have been recognized among the Best Places to Work in 2023 by Glassdoor as a Glassdoor Employees' Choice Award winner. Learn more at www.factset.com and follow us on X and LinkedIn.

At FactSet, we celebrate difference of thought, experience, and perspective. Qualified applicants will be considered for employment without regard to characteristics protected by law.
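The event-driven SNS-SQS pattern listed among this role's nice-to-haves decouples producers from consumers through a queue. The in-process sketch below stands in for that architecture under stated assumptions: `queue.Queue` plays the role of SQS, and a plain function plays the role of a Lambda handler; none of this is AWS API code.

```python
import queue

# In-process stand-in for an SQS queue; in AWS the producer would publish to an
# SNS topic and a Lambda would consume from a subscribed SQS queue.
events = queue.Queue()

def publish(event):
    # Producer side: fire-and-forget, no knowledge of who consumes the event.
    events.put(event)

def handler(event):
    # Stand-in for a Lambda handler: process one event and return a result.
    return {"id": event["id"], "status": "processed"}

def drain():
    # Consumer side: process everything currently on the queue, in order.
    results = []
    while not events.empty():
        results.append(handler(events.get()))
    return results

publish({"id": 1})
publish({"id": 2})
print(drain())  # [{'id': 1, 'status': 'processed'}, {'id': 2, 'status': 'processed'}]
```

The value of the pattern is that producers and consumers scale and fail independently; the queue absorbs bursts that a synchronous REST call could not.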

Posted 2 days ago

Apply

4.0 - 7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Responsible for developing, optimizing, and maintaining business intelligence and data warehouse systems, ensuring secure, efficient data storage and retrieval, enabling self-service data exploration, and supporting stakeholders with insightful reporting and analysis.

Grade: T5

Please note that the job will close at 12am on the posting close date, so please submit your application prior to the close date.

Accountabilities

What your main responsibilities are:

- Data Pipeline - Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity
- Data Integration - Connect offline and online data to continuously improve overall understanding of customer behavior and journeys for personalization; data pre-processing including collecting, parsing, managing, analyzing, and visualizing large sets of data
- Data Quality Management - Cleanse the data and improve data quality and readiness for analysis; drive standards, define and implement/improve data governance strategies, and enforce best practices to scale data analysis across platforms
- Data Transformation - Process data by cleansing it and transforming it into the proper storage structure for querying and analysis using ETL and ELT processes
- Data Enablement - Ensure data is accessible and usable to the wider enterprise to enable a deeper and more timely understanding of operations

Qualifications & Specifications

- Master's/Bachelor's degree in Engineering, Computer Science, Math, Statistics, or equivalent
- Strong programming skills in Python/PySpark/SAS
- Proven experience with large data sets and related technologies: Hadoop, Hive, distributed computing systems, Spark optimization
- Experience on cloud platforms (preferably Azure) and their services: Azure Data Factory (ADF), ADLS Storage, Azure DevOps
- Hands-on experience with Databricks, Delta Lake, and Workflows
- Knowledge of DevOps processes and tools such as Docker, CI/CD, Kubernetes, Terraform, and Octopus
- Hands-on experience with SQL and data modeling to support the organization's data storage and analysis needs
- Experience with a BI tool like Power BI (good to have)
- Cloud migration experience (good to have)
- Cloud and data engineering certification (good to have)
- Experience working in an Agile environment
- 4-7 years of relevant work experience
- Experience with stakeholder management is an added advantage

What We Are Looking For

Education: Bachelor's degree or equivalent in Computer Science, MIS, Mathematics, Statistics, or a similar discipline. Master's degree or PhD preferred.

Knowledge, Skills and Abilities: fluency in English; analytical skills; accuracy and attention to detail; numerical skills; planning and organizing skills; presentation skills; data modeling and database design; ETL (Extract, Transform, Load) skills; programming skills.

FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment, and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances.

Our Company

FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World's Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding.

Our Philosophy

The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle, and return these profits back into the business, and invest back in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being, and value their contributions to the company.

Our Culture

Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today's global marketplace.
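The ETL/ELT responsibilities listed above (cleansing, transforming to a storage structure, loading for analysis) reduce to a three-stage pass. The sketch below is a toy end-to-end version in pure Python; the record fields (`order_id`, `amount`, `country`) and the drop-incomplete-rows rule are invented for illustration, and a real pipeline would run the same shape of logic in PySpark or ADF.

```python
# Toy ETL pass: extract raw records, cleanse/transform them, and "load" them
# into a keyed store standing in for a warehouse table.

raw = [
    {"order_id": "A1", "amount": " 120.50 ", "country": "in"},
    {"order_id": "A2", "amount": None, "country": "IN"},      # bad row: missing amount
    {"order_id": "A3", "amount": "75", "country": " In "},
]

def extract(rows):
    # In practice this would read from an API, a feed, or a landing zone.
    return list(rows)

def transform(rows):
    clean = []
    for r in rows:
        if r["amount"] is None:   # data-quality rule: drop incomplete rows
            continue
        clean.append({
            "order_id": r["order_id"],
            "amount": float(str(r["amount"]).strip()),   # normalize type
            "country": r["country"].strip().upper(),     # normalize casing
        })
    return clean

def load(rows):
    # Stand-in for a warehouse table keyed by primary key.
    return {r["order_id"]: r for r in rows}

warehouse = load(transform(extract(raw)))
print(sorted(warehouse))           # ['A1', 'A3']
print(warehouse["A3"]["country"])  # IN
```

In an ELT variant the raw rows would be loaded first and the `transform` step expressed as SQL inside the warehouse; the cleansing rules stay the same.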

Posted 2 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Reference # 321767BR
Job Type: Full Time

Your role

Do you have a curious mind, want to be involved in the latest technology trends, and like to solve problems that have a meaningful benefit to hundreds of users across the bank? Join our Tech Services - Group Chief Technology Office team and become a core contributor to the execution of the bank's global AI strategy, particularly helping the bank deploy AI models quickly and efficiently!

We are looking for an experienced Data Engineer or ML Engineer to drive the delivery of an innovative ecosystem of tools and services. In this AI-focused role, you will contribute to the development of an SDK for Data Producers across the firm to build high-quality autonomous Data Products for cross-divisional consumption, and for Data Consumers (e.g. data scientists, quantitative analysts, model developers, model validators, and AI agents) to easily discover and access data and build AI use-cases.

Responsibilities include:

- direct interaction with product owners and internal users to identify requirements, develop technical solutions, and execute
- developing an SDK (Software Development Kit) to automatically capture Data Product, Dataset, and AI/ML model metadata, and leveraging LLMs to generate descriptive information about assets
- integrating and publishing metadata into UBS's AI use-case inventory, model artifact registry, and Enterprise Data Mesh data product and dataset catalogue for discovery and regulatory compliance purposes
- designing and implementing services that seamlessly collect runtime evidence and operational information about a data product or model and publish it to appropriate visualization tools
- creating a collection of starters/templates that accelerate the creation of new data products by leveraging the latest tools and services and providing diverse and rich experiences to the Devpod ecosystem
- designing and implementing data contracts and fine-grained access mechanisms to enable data consumption on a 'need to know' basis

Your team

You will be part of the Data Product Framework team, a newly established function within Group Chief Technology Office. We provide solutions to help the firm embrace Artificial Intelligence and Machine Learning. We work with the divisions and functions of the firm to provide innovative solutions that integrate with their existing platforms to provide new and enhanced capabilities. One of our current aims is to help a data scientist get a model into production in an accelerated timeframe with the appropriate controls and security. We offer a number of key capabilities: data discovery that uses AI/ML to help users find data and obtain access in a secure and controlled manner; an AI inventory that describes the models that have been built, to help users build their own use cases and validate them with Model Risk Management; a containerized model development environment for users to experiment and produce their models; and a streamlined MLOps process that helps them track their experiments and promote their models.

Your expertise

- PhD or Master's degree in Computer Science or a related advanced quantitative discipline
- 5+ years of industry experience with Python/pandas, SQL/Spark, Azure fundamentals/Kubernetes, and GitLab
- additional experience with data engineering frameworks (Databricks, Kedro, Flyte), ML frameworks (MLflow, DVC), and agentic frameworks (LangChain, LangGraph, CrewAI) is a plus
- ability to produce secure and clean code that is stable, scalable, operational, and well-performing; up to date with the latest IT standards (security, best practices); an understanding of security principles in banking systems is a plus
- ability to work independently and manage individual project priorities, deadlines, and deliverables
- willingness to quickly learn and adopt various technologies
- excellent English written and verbal communication skills

About Us

UBS is the world's largest and the only truly global wealth manager. We operate through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management and the Investment Bank. Our global reach and the breadth of our expertise set us apart from our competitors. We have a presence in all major financial centers in more than 50 countries.

How We Hire

We may request you to complete one or more assessments during the application process. Learn more

Join us

At UBS, we know that it's our people, with their diverse skills, experiences and backgrounds, who drive our ongoing success. We're dedicated to our craft and passionate about putting our people first, with new challenges, a supportive team, opportunities to grow and flexible working options when possible. Our inclusive culture brings out the best in our employees, wherever they are on their career journey. We also recognize that great work is never done alone. That's why collaboration is at the heart of everything we do. Because together, we're more than ourselves. We're committed to disability inclusion and if you need reasonable accommodation/adjustments throughout our recruitment process, you can always contact us.

Disclaimer / Policy Statements

UBS is an Equal Opportunity Employer. We respect and seek to empower each individual and support the diverse cultures, perspectives, skills and experiences within our workforce.
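The SDK responsibility described above (automatically capturing metadata about data products and publishing it to an inventory) is commonly built around a decorator that wraps each dataset-producing function. The sketch below is a minimal in-memory illustration; the registry, field names, and the example `trades_daily` product are invented, and a real SDK would publish to a catalogue service rather than a module-level dict.

```python
import functools
import datetime

# Toy sketch of SDK-style metadata capture: a decorator records descriptive
# metadata about each dataset-producing function into an in-memory "inventory".
INVENTORY = {}

def data_product(name, owner):
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            result = fn(*args, **kwargs)
            # Capture metadata as a side effect of producing the dataset.
            INVENTORY[name] = {
                "owner": owner,
                "rows": len(result),
                "captured_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            }
            return result
        return inner
    return wrap

@data_product(name="trades_daily", owner="markets-team")
def build_trades():
    # Stand-in for a real pipeline step that materializes a dataset.
    return [{"trade_id": i} for i in range(3)]

build_trades()
print(INVENTORY["trades_daily"]["rows"])  # 3
```

The appeal of the decorator approach is that producers get cataloguing for free: metadata capture cannot drift out of sync with the code that builds the dataset, which matters for the regulatory-compliance use the posting describes.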

Posted 2 days ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Key Responsibilities:

- Design and develop high-performance backend services using Java (18/21) and Spring Boot
- Build scalable and distributed data pipelines using Apache Spark
- Develop and maintain microservices-based architectures
- Work on cloud-native deployments, preferably on AWS (EC2, S3, EMR, Lambda, etc.)
- Optimize data processing systems for performance, scalability, and reliability
- Collaborate with data engineers, architects, and product managers to translate business requirements into technical solutions
- Ensure code quality through unit testing, integration testing, and code reviews
- Troubleshoot and resolve issues in production and non-production environments

Required Skills and Experience:

- 5+ years of professional experience in software engineering
- Strong programming expertise in Core Java (18/21)
- Hands-on experience with Apache Spark and distributed data processing
- Proven experience with Spring Boot and RESTful API development
- Solid understanding of microservices architecture and patterns
- Proficiency in cloud platforms, especially AWS (preferred)
- Experience with SQL/NoSQL databases and data lake/storage systems
- Familiarity with CI/CD tools and containerization (Docker/Kubernetes is a plus)

What We Offer:

- A market-leading salary along with a comprehensive benefits package to support your well-being
- A hybrid or remote work setup that prioritizes work-life balance and personal well-being
- Investment in your career through continuous learning and internal growth opportunities
- A dynamic, inclusive, and vibrant workplace where your contributions are recognized and rewarded
- Straightforward policies, open communication, and a supportive work environment where everyone thrives

About the Company:

https://predigle.com/
https://www.espergroup.com/

Predigle, an EsperGroup company, focuses on building disruptive technology platforms to transform daily business operations. Predigle has expanded rapidly to offer various products and services. Predigle Intelligence (Pi) is a comprehensive portable AI platform that offers a low-code/no-code AI design solution for solving business problems.
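The distributed Spark pipelines this role builds boil down to a map-shuffle-reduce flow over data partitions. The role itself is Java-based, but the pattern is easiest to see in a short pure-Python sketch; the "partitions" and sample lines are invented for illustration, and in Spark each partition would be processed on a separate executor.

```python
from collections import Counter
from functools import reduce

lines = ["spark makes pipelines", "pipelines process data", "data data data"]
partitions = [lines[:2], lines[2:]]   # pretend these live on two executors

def map_partition(part):
    # Map side: count words locally within one partition, no cross-talk needed.
    return Counter(w for line in part for w in line.split())

local_counts = [map_partition(p) for p in partitions]

# Reduce side: merge the partial counts into global totals (Counter addition
# sums matching keys), mirroring Spark's shuffle + reduceByKey.
totals = reduce(lambda a, b: a + b, local_counts)

print(totals["data"])       # 4
print(totals["pipelines"])  # 2
```

The key property is that `map_partition` touches only local data and the merge is associative, which is what lets the real framework parallelize the map side freely and tree-reduce the results.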

Posted 2 days ago

Apply

4.0 - 7.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Responsible for developing, optimizing, and maintaining business intelligence and data warehouse systems, ensuring secure, efficient data storage and retrieval, enabling self-service data exploration, and supporting stakeholders with insightful reporting and analysis.

Grade: T5

Please note that the job will close at 12am on the posting close date, so please submit your application prior to the close date.

Accountabilities

What your main responsibilities are:

- Data Pipeline - Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity
- Data Integration - Connect offline and online data to continuously improve overall understanding of customer behavior and journeys for personalization; data pre-processing including collecting, parsing, managing, analyzing, and visualizing large sets of data
- Data Quality Management - Cleanse the data and improve data quality and readiness for analysis; drive standards, define and implement/improve data governance strategies, and enforce best practices to scale data analysis across platforms
- Data Transformation - Process data by cleansing it and transforming it into the proper storage structure for querying and analysis using ETL and ELT processes
- Data Enablement - Ensure data is accessible and usable to the wider enterprise to enable a deeper and more timely understanding of operations

Qualifications & Specifications

- Master's/Bachelor's degree in Engineering, Computer Science, Math, Statistics, or equivalent
- Strong programming skills in Python/PySpark/SAS
- Proven experience with large data sets and related technologies: Hadoop, Hive, distributed computing systems, Spark optimization
- Experience on cloud platforms (preferably Azure) and their services: Azure Data Factory (ADF), ADLS Storage, Azure DevOps
- Hands-on experience with Databricks, Delta Lake, and Workflows
- Knowledge of DevOps processes and tools such as Docker, CI/CD, Kubernetes, Terraform, and Octopus
- Hands-on experience with SQL and data modeling to support the organization's data storage and analysis needs
- Experience with a BI tool like Power BI (good to have)
- Cloud migration experience (good to have)
- Cloud and data engineering certification (good to have)
- Experience working in an Agile environment
- 4-7 years of relevant work experience
- Experience with stakeholder management is an added advantage

What We Are Looking For

Education: Bachelor's degree or equivalent in Computer Science, MIS, Mathematics, Statistics, or a similar discipline. Master's degree or PhD preferred.

Knowledge, Skills and Abilities: fluency in English; analytical skills; accuracy and attention to detail; numerical skills; planning and organizing skills; presentation skills; data modeling and database design; ETL (Extract, Transform, Load) skills; programming skills.

FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment, and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances.

Our Company

FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World's Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding.

Our Philosophy

The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle, and return these profits back into the business, and invest back in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being, and value their contributions to the company.

Our Culture

Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today's global marketplace.

Posted 2 days ago

Apply

8.0 years

0 Lacs

India

Remote

Job Title: GCP Data Engineer
Location: Remote (India only)
Employment Type: Contract, Long-Term
Start Date: Immediate
Time Zone Overlap: Must be available to work during EST hours (Canada)
Dual Employment: Not permitted; any existing employment must be terminated before starting.
About the Role: We are looking for a highly skilled GCP Data Engineer to join our international team. The ideal candidate will have strong experience with Google Cloud Platform's data tools, particularly Dataproc and BigQuery, and will be comfortable working in a remote, collaborative environment. You will play a key role in designing, building, and optimizing the data pipelines and infrastructure that drive business insights.
Key Responsibilities: Design, develop, and maintain scalable data pipelines and ETL processes on GCP. Leverage GCP Dataproc and BigQuery to process and analyze large volumes of data. Write efficient, maintainable code using Python and SQL. Develop Spark-based data workflows using PySpark. Collaborate with cross-functional teams in an international environment. Ensure data quality, integrity, and security. Participate in code reviews and optimize system performance.
Required Qualifications: 5-8 years of hands-on experience in data engineering. Proven expertise in GCP Dataproc and BigQuery. Strong programming skills in Python and SQL. Solid experience with PySpark for distributed data processing. Fluent English with excellent communication skills. Ability to work independently in a remote team environment. Comfortable working during the Canada EST time zone overlap.
Optional / Nice-to-Have Skills: Experience with additional GCP tools and services. Familiarity with CI/CD for data engineering workflows. Exposure to data governance and data security best practices.
Interview Process: 1) Technical test (online screening); 2) 15-minute HR interview; 3) Technical interview, 1-2 rounds.
Please apply at hiring@khey-digit.com only if you match the above JD.
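The pipeline work this posting describes centers on PySpark transforms over raw source data. As a rough illustration of the kind of cleaning step such a pipeline performs, here is a toy version in plain Python (so it runs without a Spark cluster); the record fields are invented, and in a real Dataproc job this logic would be expressed as DataFrame operations instead:

```python
# Toy sketch of a pipeline cleaning step: drop records without a key,
# deduplicate on that key, and normalize a text field. Field names
# ("id", "email") are hypothetical examples, not from the posting.
def clean_records(records):
    seen = set()
    out = []
    for rec in records:
        key = rec.get("id")
        if key is None or key in seen:
            continue  # skip keyless records and duplicates
        seen.add(key)
        out.append({**rec, "email": rec.get("email", "").strip().lower()})
    return out
```

In PySpark the same intent would typically be a `dropna`/`dropDuplicates` plus a column expression, but the shape of the logic is the same.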

Posted 2 days ago

Apply

200.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Job Description Are you looking for an exciting opportunity to join a dynamic and growing team in a fast-paced and challenging area? This is a unique opportunity for you to work in our team to partner with the Business to provide a comprehensive view. As a Wholesale Credit Portfolio Analytics Analyst in the Wholesale Credit Portfolio Analytics team, you will be responsible for creating valuable risk analytics solutions using advanced analytical frameworks and the firm's big data resources. The focus will be on leveraging data to improve the end-to-end credit risk process across the wholesale portfolio. Additionally, the role involves clearly and concisely communicating findings and insights to stakeholders. As part of Risk Management, you are at the center of keeping JPMorgan Chase strong and resilient. You help the firm grow its business in a responsible way by anticipating new and emerging risks, and using your expert judgement to solve real-world challenges that impact our company, customers and communities. Our culture in Risk Management and Compliance is all about thinking outside the box, challenging the status quo and striving to be best-in-class. Job Responsibilities Develop and maintain credit risk rating methodologies, tools, and frameworks to improve risk management processes, including counterparty rating models, exposure management, and credit approvals. Collaborate with internal model review and controls teams to ensure new methodologies are approved and compliant. Use data science techniques to derive insights and communicate findings. Contribute to generating new ideas to address both ad hoc and strategic projects. Present findings and recommendations to senior management through presentations. Required Qualifications, Skills And Capabilities Relevant analytics, model/methodology development or credit risk experience. Self-starter with creative problem-solving skills.
Ideal candidates have experience in quantitative method development and data analysis and are comfortable discovering and communicating ideas through data. Degree in an analytical field preferred (e.g., Data Science, Computer Science, Engineering, Mathematics, Statistics). Experience with modern analytic and data tools, particularly Python/Anaconda and/or R, TensorFlow and/or Keras/PyTorch, Spark, or SQL. Excellent problem-solving, communication, and teamwork skills. Financial services background preferred, but not required. Desire to use modern technologies as a disruptive influence within banking. ABOUT US JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world's most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation. About The Team J.P. Morgan's Commercial & Investment Bank is a global leader across banking, markets, securities services and payments.
Corporations, governments and institutions throughout the world entrust us with their business in more than 100 countries. The Commercial & Investment Bank provides strategic advice, raises capital, manages risk and extends liquidity in markets around the world.

Posted 2 days ago

Apply

3.0 years

0 Lacs

India

On-site

Lucidworks is leading digital transformation for some of the world's biggest retailers, financial services firms, manufacturers, and B2B commerce organizations. We believe that the core to a great digital experience starts with search and browse. Our Deep Learning technology captures user behavior and utilizes machine learning to connect people with the products, content, and information they need. Brands including American Airlines, Lenovo, Red Hat, and Cisco Systems rely on Lucidworks' suite of products to power commerce, customer service, and workplace applications that delight customers and empower employees. Lucidworks believes in the power of diversity and inclusion to help us do our best work. We are an Equal Opportunity employer and welcome talent across a full range of backgrounds, orientation, origin, and identity in an inclusive and non-discriminatory way. About the Team The technical support team leverages their extensive experience supporting large-scale Solr clusters and the Lucene/Solr ecosystem. Their day might include troubleshooting errors and attempting to fix or develop workarounds, diagnosing network and environmental issues, learning your customer's infrastructure and technologies, as well as reproducing bugs and opening Jira tickets for the engineering team. Their primary tasks are break/fix scenarios where the diagnostics quickly bring network assets back online and prevent future problems--which has a huge impact on our customers’ business. About the Role As a Search Engineer in Technical Support, you will play a critical role in helping our clients achieve success with our products. You will be responsible for assisting clients directly in resolving any technical issues they encounter, as well as answering questions about the product and feature functionality. 
You will work closely with internal teams such as Engineering and Customer Success to resolve a variety of issues, including product defects, performance issues, and feature requests. This role requires excellent problem-solving skills and attention to detail, strong communication abilities, and a deep understanding of search technology. Additionally, this role requires the ability to work independently and as part of a team, and comfort working with both technical and non-technical stakeholders. The successful candidate will demonstrate a passion for delivering an outstanding customer experience, balancing technical expertise with empathy for the customer's needs. This role is open to candidates in India and is expected to participate in weekend on-call rotations. Responsibilities Field incoming questions, help users configure Lucidworks Fusion and its components, and help them understand how to use the features of the product Troubleshoot complex search issues in and around Lucene/Solr Document solutions into knowledge base articles for use by our customer base in our knowledge center Identify opportunities to provide customers with additional value through follow-on products and/or services Communicate high-value use cases and customer feedback to our Product Development and Engineering teams Collaborate across teams internally to diagnose and resolve critical issues Participate in a 24/7/365 on-call rotation, which includes weekend and holiday shifts Skills & Qualifications 3+ years of hands-on experience with Lucene/Solr or other search technologies is required BS or higher in Engineering or Computer Science is preferred 3+ years of professional experience in a customer-facing level 2-3 tech support role Experience with technical support CRM systems (Salesforce, Zendesk, etc.) Ability to clearly communicate with customers by email and phone Proficiency with Java and one or more common scripting languages (Python, Perl, Ruby, etc.)
Proficiency with Unix/Linux systems (command line navigation, file system permissions, system logs and administration, scripting, networking, etc.) Exposure to other related open source projects (Mahout, Hadoop, Tika, etc.) and commercial search technologies Enterprise Search, eCommerce, and/or Business Intelligence experience Knowledge of data science and machine learning concepts Experience with cloud computing platforms (GCP, Azure, AWS, etc.) and Kubernetes Startup experience is preferred Our Stack Apache Lucene/Solr, ZooKeeper, Spark, Pulsar, Kafka, Grafana Java, Python, Linux, Kubernetes Zendesk, Jira
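Troubleshooting the Solr issues this role describes usually starts by reproducing a query against the `/select` handler. As a small sketch of how a support engineer might build such a request programmatically (the host and collection name "products" are hypothetical):

```python
from urllib.parse import urlencode

# Build a Solr /select query URL. Host and collection are made-up
# examples; q uses standard Lucene field:value syntax.
def solr_select_url(base, collection, query, rows=10):
    params = urlencode({"q": query, "rows": rows, "wt": "json"})
    return f"{base}/solr/{collection}/select?{params}"

url = solr_select_url("http://localhost:8983", "products", "title:laptop")
```

Fetching that URL (e.g. with `curl`) returns a JSON response whose `response.numFound` and `docs` fields are typically the first things checked when a customer reports missing or unexpected results.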

Posted 2 days ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
Title: Data Engineering Lead. Overall years of experience: 8 to 10. Relevant years of experience: 4+. The Data Engineering Lead is responsible for collaborating with the Data Architect to design and implement scalable data lake architecture and data pipelines.
Position Summary: Design and implement scalable data lake architectures using Azure Data Lake services. Develop and maintain data pipelines to ingest data from various sources. Optimize data storage and retrieval processes for efficiency and performance. Ensure data security and compliance with industry standards. Collaborate with data scientists and analysts to facilitate data accessibility. Monitor and troubleshoot data pipeline issues to ensure reliability. Document data lake designs, processes, and best practices. Experience with SQL and NoSQL databases, as well as familiarity with big data file formats like Parquet and Avro.
Essential Roles and Responsibilities
Must-Have Skills: Azure Data Lake; Azure Synapse Analytics; Azure Data Factory; Azure Databricks; Python (PySpark, NumPy, etc.); SQL; ETL; data warehousing; Azure DevOps; experience in developing streaming pipelines using Azure Event Hubs, Azure Stream Analytics, and Spark Streaming; experience in integration with business intelligence tools such as Power BI.
Good-to-Have Skills: Big data technologies (e.g., Hadoop, Spark); data security.
General Skills: Experience with Agile and DevOps methodologies and the software development lifecycle. Proactive and responsible for deliverables. Escalates dependencies and risks. Works with most DevOps tools with limited supervision. Completes assigned tasks on time with regular status reporting. Able to train new team members. Knowledge of cloud solutions such as Azure or AWS, with DevOps/cloud certifications, is desirable. Able to work with multicultural, global teams and to work virtually. Able to build strong relationships with project stakeholders.
EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
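The streaming-pipeline skill this posting asks for (Event Hubs / Stream Analytics / Spark Streaming) boils down to windowed aggregation over an event stream. As a toy, framework-free sketch of a tumbling-window count (the event shape, a `(timestamp_seconds, key)` pair, is invented for illustration):

```python
from collections import defaultdict

# Toy tumbling-window aggregation: count events per key per fixed,
# non-overlapping time window. A Spark Structured Streaming job would
# express this as groupBy(window(...), key).count() instead.
def tumbling_window_counts(events, window_seconds):
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, key)] += 1
    return dict(counts)
```

The floor-division trick assigns every event to the window that starts at the largest multiple of `window_seconds` not exceeding its timestamp, which is exactly what a tumbling (as opposed to sliding) window does.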

Posted 2 days ago

Apply

0 years

0 Lacs

India

On-site

Hadoop Admin
Location: Bangalore (first priority) / Pune / Chennai
Interview Mode: Level 1 or 2 will be a face-to-face discussion
Experience: 7+ years
Regular Shift: 9 AM to 6 PM
JOB SUMMARY:
1) Strong expertise in installing, configuring, and maintaining Hadoop ecosystem components (HDFS, YARN, Hive, HBase, Spark, Oozie, ZooKeeper, etc.).
2) Monitor cluster performance and capacity; troubleshoot and resolve issues proactively.
3) Manage cluster upgrades, patching, and security updates with minimal downtime.
4) Implement and maintain data security, authorization, and authentication (Kerberos, Ranger, or Sentry).
5) Configure and manage Hadoop high availability, disaster recovery, and backup strategies.
6) Automate cluster monitoring, alerting, and performance tuning.
7) Work closely with data engineering teams to ensure smooth data pipeline operations.
8) Perform root cause analysis for recurring system issues and implement permanent fixes.
9) Develop and maintain system documentation, including runbooks and SOPs.
10) Support integration with third-party tools (Sqoop, Flume, Kafka, Airflow, etc.).
11) Participate in on-call rotation and incident management for production support.
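A first-pass health check covering the monitoring duties above might look like the following runbook fragment. This is a hedged sketch, not a complete procedure; it assumes shell access to a cluster node with the Hadoop clients on the PATH (and, on a Kerberized cluster, a valid ticket from `kinit`):

```shell
# Quick HDFS/YARN health snapshot (run as an authorized admin user).
hdfs dfsadmin -report | head -n 20     # capacity, live/dead DataNodes
hdfs dfsadmin -safemode get            # confirm NameNode is not stuck in safe mode
hdfs fsck / | tail -n 20               # block-level integrity summary
yarn node -list -all                   # NodeManager states across the cluster
```

Outputs from these commands are the usual starting point for the root-cause analysis and capacity items in the summary; persistent dead DataNodes or under-replicated blocks in the `fsck` summary typically drive the next escalation step.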

Posted 2 days ago

Apply

0 years

0 Lacs

India

On-site

The ideal candidate will be responsible for developing high-quality applications. They will also be responsible for designing and implementing testable and scalable code. Responsibilities: Lead backend Python development for innovative healthcare technology solutions Oversee a backend team to achieve product and platform goals in the B2B HealthTech domain Design and implement scalable backend infrastructures with seamless API integration Ensure availability on immediate or short notice for efficient onboarding and project ramp-up Optimize existing backend systems based on real-time healthcare data requirements Collaborate with cross-functional teams to ensure alignment between tech and business goals Review and refine code for quality, scalability, and performance improvements Ideal Candidate: Experienced in building B2B software products using agile methodologies Strong proficiency in Python, with a deep understanding of backend system architecture Comfortable with fast-paced environments and quick onboarding cycles Strong communicator who fosters a culture of innovation, ownership, and collaboration Passionate about driving real-world healthcare impact through technology Skills Required: Primary: TypeScript, AWS, Python, RESTful APIs, Backend Architecture Additional: SQL/NoSQL databases, Docker/Kubernetes (preferred) Strongly Good to Have: Prior experience in Data Engineering, especially in healthcare or real-time analytics Familiarity with ETL pipelines, data lake/warehouse solutions, and stream processing frameworks (e.g., Apache Kafka, Spark, Airflow) Understanding of data privacy, compliance (e.g., HIPAA), and secure data handling practices Hiring Process: Profile Shortlisting, Tech Interview, Tech Interview, Culture Fit

Posted 2 days ago

Apply

5.0 years

0 Lacs

India

On-site

About Nacre Capital: Nacre Capital is a global venture builder specialized in creating, building and growing disruptive start-ups with deep technologies that significantly impact lives. We are an international team of entrepreneurs, business leaders and experts including - pioneering scientists, renowned technologists, researchers, growth experts and thought leaders - that together develop and transform ventures into world-class disruptive market-leading companies. About the Role: We're looking for a highly skilled Cold Calling Specialist to join our dynamic team. This isn't just a cold calling role; it's a mission-critical position for a true hunter who thrives on initiating conversations and opening doors. Your primary focus will be outbound cold calling to meticulously identified prospects, with a hint of research to refine your targeting and approach. If you possess an unparalleled ability to engage, qualify, and inspire interest from scratch, and you're driven by measurable success, we want to hear from you. This is an opportunity for a top performer to be the spearhead of our sales pipeline, solely dedicated to creating new opportunities and fueling our growth. 
Key Responsibilities: Own the Outbound: Execute high-volume cold calling campaigns to a targeted list of prospects, consistently exceeding daily and weekly call metrics Master the Art of the Opening: Skillfully navigate gatekeepers and objections to connect directly with decision-makers and key influencers Qualify with Precision: Conduct initial discovery calls to understand prospect needs, pain points, and current solutions, effectively qualifying leads based on established criteria Spark Interest: Articulate our value proposition clearly and concisely, generating genuine interest and securing next steps, such as discovery calls or demonstrations for our sales team Strategic Research: Conduct brief, targeted research on companies and contacts prior to calls to personalize your approach and increase engagement rates CRM Excellence: Accurately log all call activities, interactions, and relevant information in our CRM system (e.g., Salesforce, HubSpot) to maintain a clean and up-to-date pipeline Iterate & Improve: Actively participate in feedback sessions, sharing insights from calls to refine scripts, objection handling techniques, and overall strategy Collaborate for Success: Work closely with the sales team to ensure seamless handoffs of qualified opportunities Requirements Proven Cold Calling Prowess: 5+ years of extensive track record of success in purely cold calling roles, consistently hitting and exceeding targets. This isn't your first rodeo; you're a seasoned pro Exceptional Communication: Impeccable verbal communication skills with a clear, confident, and persuasive phone presence. You can think on your feet and adapt your message in real-time Resilience & Grit: A high tolerance for rejection and an unwavering determination to succeed. 
You see "no" as an opportunity to learn and refine your approach Active Listening: The ability to genuinely listen to prospects, uncover their needs, and tailor your conversation accordingly Self-Motivated & Independent: You thrive in an autonomous environment and are driven by personal and team achievements, requiring minimal oversight Quick Learner: Ability to rapidly grasp complex product/service information and articulate it effectively to diverse audiences CRM Savvy: Proficient with CRM software for logging activities, managing contacts, and tracking progress Bonus Points: Experience in a similar industry or with selling a comparable product/service Essential Skills & Traits: Exceptional Communication: Impeccable verbal communication skills with a clear, confident, and persuasive phone presence. You can think on your feet, adapt your message in real-time, and articulate complex technological benefits in an understandable way Clear & Understandable Accent: A neutral or easily understandable accent, ideally American English or a Western European English accent, is crucial for effective communication with our diverse international client base. Benefits Impactful Role: Be the frontline of our growth, directly contributing to our success by generating high-quality leads Performance-Driven Culture: Join a team that values results, offers clear objectives, and recognizes top performance Uncapped Potential: Opportunities for significant earning potential based on your success Focus on Your Strength: Dedicate yourself entirely to what you do best - cold calling - without the distractions of full-cycle sales

Posted 2 days ago

Apply

8.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
Title: Data Engineering Lead. Overall years of experience: 8 to 10. Relevant years of experience: 4+. The Data Engineering Lead is responsible for collaborating with the Data Architect to design and implement scalable data lake architecture and data pipelines.
Position Summary: Design and implement scalable data lake architectures using Azure Data Lake services. Develop and maintain data pipelines to ingest data from various sources. Optimize data storage and retrieval processes for efficiency and performance. Ensure data security and compliance with industry standards. Collaborate with data scientists and analysts to facilitate data accessibility. Monitor and troubleshoot data pipeline issues to ensure reliability. Document data lake designs, processes, and best practices. Experience with SQL and NoSQL databases, as well as familiarity with big data file formats like Parquet and Avro.
Essential Roles and Responsibilities
Must-Have Skills: Azure Data Lake; Azure Synapse Analytics; Azure Data Factory; Azure Databricks; Python (PySpark, NumPy, etc.); SQL; ETL; data warehousing; Azure DevOps; experience in developing streaming pipelines using Azure Event Hubs, Azure Stream Analytics, and Spark Streaming; experience in integration with business intelligence tools such as Power BI.
Good-to-Have Skills: Big data technologies (e.g., Hadoop, Spark); data security.
General Skills: Experience with Agile and DevOps methodologies and the software development lifecycle. Proactive and responsible for deliverables. Escalates dependencies and risks. Works with most DevOps tools with limited supervision. Completes assigned tasks on time with regular status reporting. Able to train new team members. Knowledge of cloud solutions such as Azure or AWS, with DevOps/cloud certifications, is desirable. Able to work with multicultural, global teams and to work virtually. Able to build strong relationships with project stakeholders.
EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 2 days ago

Apply

8.0 years

0 Lacs

Kochi, Kerala, India

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
Title: Data Engineering Lead. Overall years of experience: 8 to 10. Relevant years of experience: 4+. The Data Engineering Lead is responsible for collaborating with the Data Architect to design and implement scalable data lake architecture and data pipelines.
Position Summary: Design and implement scalable data lake architectures using Azure Data Lake services. Develop and maintain data pipelines to ingest data from various sources. Optimize data storage and retrieval processes for efficiency and performance. Ensure data security and compliance with industry standards. Collaborate with data scientists and analysts to facilitate data accessibility. Monitor and troubleshoot data pipeline issues to ensure reliability. Document data lake designs, processes, and best practices. Experience with SQL and NoSQL databases, as well as familiarity with big data file formats like Parquet and Avro.
Essential Roles and Responsibilities
Must-Have Skills: Azure Data Lake; Azure Synapse Analytics; Azure Data Factory; Azure Databricks; Python (PySpark, NumPy, etc.); SQL; ETL; data warehousing; Azure DevOps; experience in developing streaming pipelines using Azure Event Hubs, Azure Stream Analytics, and Spark Streaming; experience in integration with business intelligence tools such as Power BI.
Good-to-Have Skills: Big data technologies (e.g., Hadoop, Spark); data security.
General Skills: Experience with Agile and DevOps methodologies and the software development lifecycle. Proactive and responsible for deliverables. Escalates dependencies and risks. Works with most DevOps tools with limited supervision. Completes assigned tasks on time with regular status reporting. Able to train new team members. Knowledge of cloud solutions such as Azure or AWS, with DevOps/cloud certifications, is desirable. Able to work with multicultural, global teams and to work virtually. Able to build strong relationships with project stakeholders.
EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 2 days ago

Apply

0 years

0 Lacs

Haryana, India

On-site

About Adda Education At Adda Education, we are committed to transforming the way educational content is created and delivered. As one of India's leading EdTech platforms, we build engaging, curriculum-aligned learning experiences that help millions of students across the country. Our focus is on creating high-impact content: smartly structured, creatively delivered, and designed for scale across platforms. About The Role We're looking for a dynamic and driven Content Writer Intern who's excited about AI, education, and storytelling. We're seeking passionate AI content writers with a creative spark and an AI-first mindset to own and elevate our AI-driven content creation. This role is ideal for someone who wants to blend creativity with technology, using generative AI tools and prompt engineering to craft impactful educational content. Work with our team to scale impactful learning experiences and help shape the future of AI-powered education at Adda. What You'll Do: Own the AI Content Workflow: Take full ownership of the end-to-end process for generating, verifying, and optimizing AI-powered content using our proprietary AI Content Engine & Verification Platform. Write clear, compelling, and structured educational content across subjects. Leverage LLM tools (like ChatGPT, Claude, Gemini) to ideate and accelerate writing. Collaborate with content and AI teams to develop scripts, blogs, quiz-based content, and more. Experiment with prompt writing and contribute to our AI-content workflows. Assist in scaling content campaigns that engage and educate. Bridge Consumption and Generation Gaps: Develop AI-first content writing to scale content production 10x or more, drawing insights from industry benchmarks to fuel user growth and competitiveness. Qualifications: Creative Lens: Strong sense of what constitutes high-engagement content (e.g. on social media, blogs, videos).
Content Creation Experience: Demonstrated experience in content generation, either through professional work, personal projects, or as an active content creator/influencer on platforms like YouTube, Instagram, or LinkedIn. This ensures you deeply understand audience psychology, content virality, and storytelling. Analytical Thinker: Ability to spot trends, analyze gaps in content ecosystems, and apply insights to drive innovation. Basic understanding of generative AI tools and content automation. Familiarity or interest in prompt engineering and workflow tools like Notion/Google Docs. Creative thinking and a knack for presenting complex topics in simple language. Self-starters who are eager to learn, iterate, and innovate. Female candidates are highly welcome. Apply Now: https://docs.google.com/forms/d/1NYBrjuLbCAOw7_puEiEo8NROZvh1eqJuqfuCAJsGGQc/edit

Posted 2 days ago

Apply

3.0 - 6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

We’re looking for a Brand Manager who can craft and execute a compelling brand narrative across offline and online channels. You’ll lead a dynamic team of 5-6 professionals and own everything from large-scale campaigns to reputation management. If you thrive on building brands that spark curiosity and engagement, this role is for you. About the Role Key Responsibilities Brand Strategy & Execution: Develop and execute brand campaigns that enhance PST’s presence both online and offline. Online & Offline Campaigns: Drive integrated marketing campaigns across digital, print, and on-ground activations. Social Media & Digital Presence: Oversee content, engagement, and growth on key platforms. Social Listening & ORM: Monitor brand sentiment, respond to feedback, and manage online reputation. Seeding & Influencer Marketing: Identify and collaborate with industry voices, communities, and influencers. Event Management: Plan and execute industry events, webinars, and campus experiences. Public Relations: Drive media coverage, press releases, and thought leadership. Team Leadership: Mentor and guide a team of 5-6 marketers. Qualifications 3-6 years of experience in brand management, marketing, or a related field. Required Skills Proven track record in executing online and offline campaigns. Strong grasp of social media trends, community building, and ORM. Experience in PR, influencer marketing, and event execution. Data-driven mindset with creativity to craft compelling brand narratives. Excellent leadership, communication, and stakeholder management skills.

Posted 2 days ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

The Security Engineering team at Uber is focused on making the production and corporate environments secure by default to provide industry-leading solutions for Uber's production services and infrastructure. As a Senior Software Engineer in the Enterprise Application Security team, you will leverage your solid software engineering background in building solutions to ensure the protection of enterprise applications from evolving cyber threats. You will be responsible for designing, implementing, and maintaining advanced security solutions to strengthen Uber's security posture by securing various enterprise applications. What the Candidate Will Do: Design and implement secure architectures for enterprise applications, ensuring industry best practices. Hands-on coding and code reviews. Build, deploy, configure, and manage a variety of enterprise security solutions, including the Security Posture Management platform. Monitor, analyze, and remediate security risks across enterprise applications. Provide technical and engineering support to security and IT teams performing regular security assessments. Basic Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 6+ years of experience in software engineering. Strong hands-on technical skills, including proficiency in one or more programming languages (Go, Java, Python, C/C++) and code reviews. Deep understanding of software engineering fundamentals, including algorithms, data structures, system design, and architecture. Excellent analytical and problem-solving skills, with the ability to assess security risks and implement effective solutions. Passion for innovation. Preferred Qualifications: Knowledge of cybersecurity concepts, tools, and best practices.
Certifications in Security is a plus Experience with AI/ML technologies / frameworks and incorporating them into production systems Experience with SQL, Kafka and databases (including Spark, Hive, SQL, No-SQL etc) Experience with modern development practices (e.g., CI/CD, microservices, cloud platforms like AWS/GCP). Experience in building out integrations with open-source and vendor products Experience with automation and scripting (e.g., Python, Bash) for security operations
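The posting above mentions Python scripting for security operations. One common small task of that kind is scanning text for leaked credentials; a minimal sketch follows, where the regex heuristic and the sample input are illustrative assumptions, not anything from the posting or Uber's tooling.

```python
import re

# Heuristic: AWS access key IDs are 20 characters, a 4-letter prefix
# (AKIA for long-term keys, ASIA for temporary ones) followed by 16
# uppercase alphanumerics. This is a common pattern, not exhaustive.
AWS_KEY_RE = re.compile(r"\b(?:AKIA|ASIA)[0-9A-Z]{16}\b")

def find_suspected_keys(text: str) -> list[str]:
    """Return suspected AWS access key IDs found in the given text."""
    return [m.group(0) for m in AWS_KEY_RE.finditer(text)]

# The key below is AWS's published documentation example, not a real secret.
sample = "config: aws_access_key_id=AKIAIOSFODNN7EXAMPLE\npassword=hunter2"
hits = find_suspected_keys(sample)
```

In a real pipeline such a check would run over repositories or config stores and feed findings into the team's triage workflow rather than returning a list.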

Posted 2 days ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

ABOUT TRIBUTE TECHNOLOGY: At Tribute Technology, we make end-of-life celebrations memorable, meaningful, and effortless through thoughtful and innovative technology solutions. Our mission is to help communities around the world celebrate life and pay tribute to those we love. Our comprehensive platform brings together software and technology to provide a fully integrated experience for all users, whether that is a family, a funeral home, or an online publisher. We are the market leader in the US and Canada, with global expansion plans and a growing international team of more than 400 individuals in the US, Canada, Philippines, Ukraine and India.

ABOUT YOU: Tribute Technology is actively seeking a motivated and experienced Director, Data Engineering with expertise in AWS and GCP to join our Data & Analytics team. Our Data & Analytics team drives innovation and excellence by harnessing data to deliver actionable insights across the organization. As Director, you will play a crucial role in leading and enhancing our data pipelines, processes, and visualizations, as well as selecting fit-for-purpose tools to support our growing business needs. This is an exciting opportunity for a data enthusiast with a strong record of success in managing complex data initiatives. If you have a strong understanding of AWS and GCP and a passion for leading and developing a team, we want to hear from you. Join us and be a part of our innovative and collaborative environment where your contributions will have a significant impact.

KEY RESPONSIBILITIES:
- Lead and manage a team of data engineers to ensure timely delivery of data solutions.
- Collaborate with cross-functional teams to understand business requirements and translate them into data engineering solutions.
- Design and implement data pipelines and processes to integrate data from various sources into our data ecosystem.
- Improve data observability and ensure the team follows best practices.
- Ensure timely resolution of data pipeline issues by implementing a monitoring and incident response strategy.
- Implement best practices for data governance, ensuring high data quality, consistency, and security across the data pipelines.
- Foster a collaborative environment, ensuring the team works cross-functionally with business stakeholders to deliver projects effectively.
- Collaborate with other technology leaders to ensure the data environment is reliable and resilient, adapting over time to serve as a consistent backbone for data-driven solutions that support business objectives.
- Create and maintain documentation of the pipelines, data assets, and data products to ensure visibility and discoverability of our data products.
- Mentor and coach team members to enhance their technical skills and promote a culture of continuous learning and development.
- Review and weigh in on functional and technical requirements.
- Participate in strategic planning and budgeting for data engineering activities.

EDUCATION AND/OR EXPERIENCE:
- Bachelor’s degree in computer science, engineering, or a relevant field of study, with a minimum of 8 years of progressively responsible technology experience, including 3+ years as a data engineering lead.
- Prior experience in an eCommerce business with high-volume data (1B+ sessions annually) and integrating data across multiple business units.
- Strong knowledge of data engineering technologies such as Python, SQL, Spark, real-time streaming pipelines, and data warehouse solutions.
- Proven data engineering experience in cloud environments.
- Proven experience leading teams, fostering a high-performance culture, and managing distributed teams is preferred.
- Strong people management skills, including recruitment, performance management, and fostering growth and innovation within the team.
- Experience creating data products and/or a data product environment.

BENEFITS:
- Competitive salary
- Fully remote across India
- An outstanding collaborative work environment

Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions of the position.

Posted 2 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

At Seismic, we're proud of our engineering culture where technical excellence and innovation drive everything we do. We're a remote-first data engineering team responsible for the critical data pipeline that powers insights for over 2,300 customers worldwide. Our team manages all data ingestion processes, leveraging technologies like Apache Kafka, Spark, various C# microservices, and a shift-left data mesh architecture to transform diverse data streams into the valuable reporting models that our customers rely on daily to make data-driven decisions. Additionally, we're evolving our analytics platform to include AI-powered agentic workflows.

Who You Are
- Working knowledge of one OO language, preferably C#, but we won't hold your Java expertise against you (you're the type of person who's interested in learning and becoming an expert at new things). Additionally, we've been using Python more and more, and bonus points if you're familiar with Scala.
- Experience with architecturally complex distributed systems.
- Highly focused on operational excellence and quality: you have a passion for writing clean, well-tested code and believe in the testing pyramid.
- Outstanding verbal and written communication skills, with the ability to work with others at all levels, and effective at working with geographically remote and culturally diverse teams.
- You enjoy solving challenging problems, all while having a blast with equally passionate team members.
- Conversant in AI engineering: you've been experimenting with building AI solutions/integrations using LLMs, prompts, Copilots, Agentic ReAct workflows, etc.

At Seismic, we’re committed to providing benefits and perks for the whole self. To explore our benefits available in each country, please visit the Global Benefits page. Please be aware we have noticed an increase in hiring scams potentially targeting Seismic candidates. Read our full statement on our Careers page.

Seismic is the global leader in AI-powered enablement, empowering go-to-market leaders to drive strategic growth and deliver exceptional customer experiences at scale. The Seismic Enablement Cloud™ is the only unified AI-powered platform that prepares customer-facing teams with the skills, content, tools, and insights needed to maximize every buyer interaction and strengthen client relationships. Trusted by more than 2,000 organizations worldwide, Seismic helps businesses achieve measurable outcomes and accelerate revenue growth. Seismic is headquartered in San Diego with offices across North America, Europe, Asia and Australia. Learn more at seismic.com. Seismic is committed to building an inclusive workplace that ignites growth for our employees and creates a culture of belonging that allows all employees to be seen and valued for who they are. Learn more about DEI at Seismic here.

- Collaborating with experienced software engineers, data scientists, and product managers to rapidly build, test, and deploy code to create innovative solutions and add value to our customers' experience.
- Building large-scale platform infrastructure and REST APIs serving machine-learning-driven content recommendations to Seismic products.
- Leveraging the power of context in third-party applications such as CRMs to drive machine learning algorithms and models.
- Helping build next-gen agentic tooling for reporting and insights.
- Processing large amounts of internal and external system data for analytics, caching, modeling, and more.
- Identifying performance bottlenecks and implementing solutions for them.
- Participating in code reviews, system design reviews, agile ceremonies, bug triage, and on-call rotations.

- BS or MS in Computer Science, a similar technical field of study, or equivalent practical experience.
- 3+ years of software development experience within a SaaS business.
- Familiarity with .NET Core, C#, and related frameworks.
- Experience in data engineering: building and managing data pipelines and ETL processes, and familiarity with the technologies that drive them (Kafka; optionally Fivetran, Spark/Scala, etc.).
- Data warehouse experience with Snowflake or similar (AWS Redshift, Apache Iceberg, ClickHouse, etc.).
- Familiarity with RESTful microservice-based APIs.
- Experience with modern CI/CD pipelines and infrastructure (Jenkins, GitHub Actions, Terraform, Kubernetes, or equivalent) is a big plus.
- Experience with Scrum and the Agile development process.
- Familiarity developing in cloud-based environments.
- Optional: experience with third-party integrations.
- Optional: familiarity with meeting systems like Zoom, WebEx, MS Teams.
- Optional: familiarity with CRM systems like Salesforce, Microsoft Dynamics 365, HubSpot.

If you are an individual with a disability and would like to request a reasonable accommodation as part of the application or recruiting process, please click here. Headquartered in San Diego and with employees across the globe, Seismic is the global leader in sales enablement, backed by firms such as Permira, Ameriprise Financial, EDBI, Lightspeed Venture Partners, and T. Rowe Price. Seismic also expanded its team and product portfolio with the strategic acquisitions of SAVO, Percolate, Grapevine6, and Lessonly. Our board of directors is composed of several industry luminaries including John Thompson, former Chairman of the Board for Microsoft. Seismic is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to gender, age, race, religion, or any other classification which is protected by applicable law. Please note this job description is not designed to cover or contain a comprehensive listing of activities, duties or responsibilities that are required of the employee for this job. Duties, responsibilities and activities may change at any time with or without notice.

Posted 2 days ago

Apply

7.0 - 9.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Who We Are

Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures, and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive.

What You'll Do

As a part of BCG's X team, you will work closely with consulting teams on a diverse range of advanced topics. You will have the opportunity to leverage software development methodologies to deliver value to BCG's Consulting & X (case) teams, X Product teams, and Practice Areas (domains) by providing software development subject matter expertise and accelerated execution support. You will collaborate with teams to gather requirements, and to specify, design, develop, deliver, and support software solutions serving client needs. You will provide technical support through a deep understanding of relevant software solutions and processes to build high-quality, efficient technology solutions. Assignments will range from short-term proofs of concept/minimum viable products to long-term cases with enterprise-grade software development as a critical enabler.
YOU'RE GOOD AT
- Expertise in .NET Core, .NET Framework, C#, MVC, WebAPI, REST, and SQL Server.
- Good understanding of software architecture, object-oriented programming, and design patterns.
- Familiarity with agile development methodologies and version control systems (e.g., Git).
- Proficiency in database systems such as SQL Server.
- Excellent problem-solving skills and the ability to work well in a collaborative team environment.
- Strong knowledge of ORM frameworks; experience with advanced JavaScript frameworks such as Angular/React.
- Understanding of the fundamental design principles behind a scalable application.
- Ability to understand business requirements and translate them into technical requirements.
- Familiarity with Docker & Kubernetes.
- End-to-end ownership with excellent analytical and communication skills.

What You'll Bring
- Bachelor's or Master's degree in computer science, information technology, or a related field.
- 7-9 years of relevant experience in software engineering.
- Experience developing in multiple tech stacks.
- Experience creating and using web APIs.
- Experience developing in one or more of the AWS, Azure, and GCP cloud environments.
- Full-stack application development.
- Understanding of the nature of asynchronous programming and its quirks and workarounds.
- Outstanding interpersonal and communication skills to interact with internal and external stakeholders while working in a global collaborative team environment.

#BCGXjob

Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity/expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer. Click here for more information on E-Verify.

Posted 2 days ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Your Future Evolves Here

Evolent Health has a bold mission to change the health of the nation by changing the way health care is delivered. Our pursuit of this mission is the driving force that brings us to work each day. We believe in embracing new ideas, challenging ourselves and failing forward. We respect and celebrate individual talents and team wins. We have fun while working hard and Evolenteers often make a difference working in everything from scrubs to jeans. Are we growing? Absolutely, and globally. In 2021 we grew our teams by almost 50% and continue to grow even more in 2022. Are we a company that supports your career and growth, and a great place to work? Definitely. Evolent Health International (Pune, India) has been certified as a “Great Place to Work” in 2021. In 2020 and 2021, Evolent in the U.S. was both named to the Best Companies for Women to Advance list by Parity.org and earned a perfect score on the Human Rights Campaign (HRC) Foundation’s Corporate Equality Index (CEI). This index is the nation's foremost benchmarking survey and report measuring corporate policies and practices related to LGBTQ+ workplace equality. We recognize employees that live our values, give back to our communities each year, and are champions for bringing our whole selves to work each day. If you’re looking for a place where your work can be personally and professionally rewarding, don’t just join a company with a mission. Join a mission with a company behind it.

What You’ll Be Doing:

Job Summary
- Design and develop BI reporting and data platforms.
- Lead the development of user-facing data visualization and presentation tools, including Microsoft SQL Server Reporting Services (SSRS) reports, Power BI dashboards, MicroStrategy, and Excel PivotTables.
- Work on the development of data retrieval and data management for Evolent Health.
- Ensure that the data assets of the organization are aligned with its strategic goals; the architecture should cover databases, data integration, and the means to access the data.
- Help implement effective business analytics practices to enhance decision-making, efficiency, and performance.
- Assist with technology improvements to ensure continuous enhancement of the core BI platform.
- Data analysis: perform complex data analysis using advanced SQL skills and Excel to support internal/external clients' data requests and ad-hoc queries for business continuity and analytics.
- Communicate with non-technical business users to gather specific requirements for reports and BI solutions.
- Provide maintenance support for existing BI applications and reports.
- Present work when requested and participate in knowledge-sharing sessions with team members.

Required Qualifications
- 3-5 years of experience in the BI/data warehouse domain, developing BI solutions and performing data analysis tasks using the MSBI suite.
- Strong proficiency in Power BI: building reports, dashboards, DAX, and Power Query (M).
- Experience with Microsoft Fabric, including Lakehouse, Dataflows Gen2, and Direct Lake capabilities, plus Power Automate.
- Experience with Azure data services: Azure Data Factory, Azure Synapse, Azure Data Lake, or similar.
- Hands-on experience with SQL Server Reporting Services (SSRS) and SQL Server Integration Services (SSIS).
- Knowledge of advanced SQL for data manipulation and performance tuning.
- Experience implementing ETL/ELT pipelines.
- Ability to work with both relational and cloud-based data sources.

Preferred Qualifications
- Healthcare industry experience with exposure to authorizations/claims/eligibility and patient clinical data.
- Experience with Python, Spark, or Databricks for data engineering or transformation.
- Familiarity with DevOps/Git repos for BI, including deployment automation and CI/CD in Azure DevOps.
- Understanding of data governance, security models, and compliance.
- Experience with semantic modeling in Power BI and/or tabular models using Analysis Services.
- Exposure to AI and machine learning integrations within Microsoft Fabric or Azure.
- Experience with Power Apps and Microsoft Purview.

Mandatory Requirements: Employees must have a high-speed broadband internet connection with a minimum speed of 50 Mbps and the ability to set up a wired connection to their home network to ensure effective remote work. These requirements may be updated as needed by the business.

Evolent Health is an equal opportunity employer and considers all qualified applicants equally without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, or disability status.
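The role above leans on complex ad-hoc analysis with advanced SQL. A small sketch of the kind of query involved, a per-member running total via a window function, using Python's built-in sqlite3; the claims schema and numbers are invented for illustration, not Evolent data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE claims (member_id TEXT, claim_month TEXT, amount REAL);
INSERT INTO claims VALUES
  ('M1', '2024-01', 120.0), ('M1', '2024-02', 80.0),
  ('M2', '2024-01', 200.0), ('M2', '2024-02', 50.0);
""")

# Window function: running total of claim spend per member, ordered by month.
rows = conn.execute("""
    SELECT member_id, claim_month, amount,
           SUM(amount) OVER (PARTITION BY member_id ORDER BY claim_month)
               AS running_total
    FROM claims
    ORDER BY member_id, claim_month
""").fetchall()
```

The same `SUM(...) OVER (PARTITION BY ...)` shape carries over to SQL Server and the other engines the posting names; only the surrounding tooling changes.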

Posted 2 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Associate AIML Engineer – Global Data Analytics, Technology (Maersk)

This position will be based in India – Bangalore/Pune.

A.P. Moller – Maersk is the global leader in container shipping services. The business operates in 130 countries and employs 80,000 staff. An integrated container logistics company, Maersk aims to connect and simplify its customers’ supply chains. Today, we have more than 180 nationalities represented in our workforce across 131 countries, which means we have an elevated level of responsibility to continue to build an inclusive workforce that is truly representative of our customers, their customers, and our vendor partners too. We are responsible for moving 20% of global trade and are on a mission to become the Global Integrator of Container Logistics. To achieve this, we are transforming into an industrial digital giant by combining our assets across air, land, ocean, and ports with our growing portfolio of digital assets to connect and simplify our customers’ supply chains through global end-to-end solutions, all the while rethinking the way we engage with customers and partners.

The Brief

In this role as an Associate AIML Engineer on the Global Data and Analytics (GDA) team, you will support the development of strategic, visibility-driven recommendation systems that serve both internal stakeholders and external customers. This initiative aims to deliver actionable insights that enhance supply chain execution, support strategic decision-making, and enable innovative service offerings. Data AI/ML (Artificial Intelligence and Machine Learning) engineering involves the use of algorithms and statistical models to enable systems to analyse data, learn patterns, and make data-driven predictions or decisions without explicit human programming. AI/ML applications leverage vast amounts of data to identify insights, automate processes, and solve complex problems across a wide range of fields, including healthcare, finance, e-commerce, and more. AI/ML processes transform raw data into actionable intelligence, enabling automation, predictive analytics, and intelligent solutions. Data AI/ML combines advanced statistical modelling, computational power, and data engineering to build intelligent systems that can learn, adapt, and automate decisions.

What I'll be doing – your accountabilities
- Build and maintain machine learning models for various applications, such as natural language processing, computer vision, and recommendation systems.
- Perform exploratory data analysis (EDA) to identify patterns and trends in data.
- Clean, preprocess, and analyse large datasets, and perform hyperparameter tuning, to prepare data for AI/ML model training.
- Build, test, and optimize machine learning models, experimenting with algorithms and frameworks to improve model performance.
- Use programming languages, machine learning frameworks and libraries, algorithms, data structures, statistics, and databases to optimize and fine-tune machine learning models, ensuring scalability and efficiency.
- Learn to define user requirements and align solutions with business needs.
- Work on AI/ML engineering projects, perform feature engineering, and collaborate with teams to understand business problems.
- Learn best practices in data and AI/ML engineering and performance optimization.
- Contribute to research papers and technical documentation.
- Contribute to project documentation and maintain data quality standards.

Foundational Skills
- Programming skills beyond the fundamentals, demonstrable in most situations without guidance.
- Understanding beyond the fundamentals, demonstrable in most situations without guidance, of: AI & machine learning, data analysis, machine learning pipelines, and model deployment.

Specialized Skills
- Understanding beyond the fundamentals, demonstrable in most situations without guidance, of: deep learning, statistical analysis, data engineering, big data technologies, natural language processing (NLP), data architecture, and data processing frameworks.
- Proficiency in Python programming.
- Proficiency in Python-based statistical analysis and data visualization tools.
- Limited understanding of technical documentation, with a focus on growing this skill.

Qualifications & Requirements
- BSc/MSc/PhD in computer science, data science, or a related discipline, with 5+ years of industry experience building cloud-based ML solutions for production at scale, including solution architecture and solution design experience.
- Good problem-solving skills, for both technical and non-technical domains.
- Good broad understanding of ML and statistics, covering standard ML for regression and classification, forecasting and time-series modeling, and deep learning.
- 3+ years of hands-on experience building ML solutions in Python, including knowledge of common Python data science libraries (e.g., scikit-learn, PyTorch).
- Hands-on experience building end-to-end data products based on AI/ML technologies.
- Some experience with scenario simulations.
- Experience with a collaborative development workflow: version control (we use GitHub), code reviews, DevOps (including automated testing), CI/CD.
- Team player, eager to collaborate.

Preferred Experiences
In addition to the basic qualifications, it would be great if you have:
- Hands-on experience with common OR solvers such as Gurobi.
- Experience with a common dashboarding technology (we use Power BI) or a web-based frontend such as Dash, Streamlit, etc.
- Experience working in cross-functional product engineering teams following agile development methodologies (Scrum/Kanban/…).
- Experience with Spark and distributed computing.
- Strong hands-on experience with MLOps solutions, including open-source solutions.
- Experience with cloud-based orchestration technologies, e.g., Airflow, Kubeflow.
- Experience with containerization (Kubernetes & Docker).

As a performance-oriented company, we strive to always recruit the best person for the job – regardless of gender, age, nationality, sexual orientation or religious beliefs. We are proud of our diversity and see it as a genuine source of strength for building high-performing teams. Maersk is committed to a diverse and inclusive workplace, and we embrace different styles of thinking. Maersk is an equal opportunities employer and welcomes applicants without regard to race, colour, gender, sex, age, religion, creed, national origin, ancestry, citizenship, marital status, sexual orientation, physical or mental disability, medical condition, pregnancy or parental leave, veteran status, gender identity, genetic information, or any other characteristic protected by applicable law. We will consider qualified applicants with criminal histories in a manner consistent with all legal requirements. We are happy to support your need for any adjustments during the application and hiring process. If you need special assistance or an accommodation to use our website, apply for a position, or to perform a job, please contact us by emailing accommodationrequests@maersk.com.
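The accountabilities in this posting (EDA, preprocessing, hyperparameter tuning, model optimization) describe the standard supervised-learning loop. A minimal sketch with scikit-learn, which the posting itself lists; the dataset, model, and parameter grid below are illustrative assumptions, not Maersk's stack:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Toy dataset standing in for real supply-chain features.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Preprocessing and model in one pipeline, so scaling is fit only on
# the training folds during cross-validation (no leakage).
pipe = Pipeline([("scale", StandardScaler()),
                 ("clf", LogisticRegression(max_iter=1000))])

# Hyperparameter tuning via cross-validated grid search.
search = GridSearchCV(pipe, {"clf__C": [0.1, 1.0, 10.0]}, cv=5)
search.fit(X_train, y_train)

# Held-out evaluation of the tuned model.
test_accuracy = search.score(X_test, y_test)
```

Wrapping preprocessing in the pipeline, rather than scaling up front, is the detail interviewers for roles like this tend to probe: it keeps the tuning honest.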

Posted 2 days ago

Apply

8.0 - 15.0 years

0 Lacs

Gurugram, Haryana, India

On-site

All Levels – Comms & Media – Non-Networks. Join our team in Strategy & Consulting Global Network to find endless opportunities and solve our clients' toughest challenges, as you work with exceptional people, the latest tech, and leading companies across industries. Practice: Comms & Media, Industry Consulting, S&C Global Network (GN) | Areas of Work: Non-Networks | Level: Manager & Sr Manager | Location: Delhi, Gurgaon, Mumbai, Bangalore, Pune, Hyderabad | Years of Exp: 8-15 years. Explore an Exciting Career at Accenture. Are you an outcome-oriented problem solver? Do you enjoy working on transformation strategies for global clients? Does working in an inclusive and collaborative environment spark your interest? Then Accenture Strategy and Consulting is the right place for you to explore limitless possibilities. Comms & Media (C&M) is one of the industry practices within Accenture's S&C Global Network (GN) team. It focuses on serving clients across specific industries – Communications, Media & Entertainment. Communications focuses primarily on industries related to telecommunications and information & communication technology (ICT); this dynamic team serves most of the world's leading wireline, wireless, cable, and satellite communications and service providers. Media & Entertainment focuses on industries like broadcast, entertainment, print, and publishing. Globally, the Accenture Comms & Media practice works to develop value-growth strategies for its clients, who are top-notch organizations, and helps improve their offers and go-to-market performance and maximize organizational effectiveness. We work on end-to-end projects delivering management and technology consultancy to help our clients achieve greater profitability, quality, and reliability.
From multi-year major systems integration transformation deals to shorter, more agile engagements, we have a rapidly expanding portfolio of hyper-growth clients and an increasing footprint with next-gen technology and industry practices, with the following requirements:
- Deep expertise in one or more telecom domains such as Cloud BSS, Telco on Cloud, AI/GenAI, Customer Experience, SMB, and Order Management & Billing for B2B/B2C.
- Client-facing experience working directly or indirectly with North America ICT clients; international onsite experience preferred.
- Lead small to medium-size teams to deliver management consulting projects for North America clients.
- Lead innovation transformation programs and process enablement for our clients.
- Take responsibility within the Comms & Media industry group or across the Products group; help build the practice, track metrics, and so on.
- Develop assets and methodologies, points of view, research, or white papers for use by the team and the larger community.
- Support the North America sales team to identify and win potential opportunities within the practice; help draft proposals as a domain expert.
- Lead proposals and business development efforts, and coordinate with colleagues to create consensus-driven deliverables.
- Understand customer needs and translate them into business requirements, business process flows, and functional requirements.
- Engage with stakeholders independently.
- Execute a transformational change plan aligned with the client's business strategy and context for change; engage stakeholders in the change journey and build commitment for change.

Bring your best skills forward to excel in the role:

Skills in one or more Telecom areas
- Excellent knowledge of various BSS modules and telco journeys such as CRM, Order Management, Billing, Mediation, Provisioning, Collections, Channels, Customer Care, and Lead to Cash.
- Digital Transformation - proven experience in strategy, innovation, and digital initiatives across digital maturity models, CSP operating models, innovation barometers, intelligent operations for CSPs, and other related areas.
- Cloud BSS - determine the appropriate cloud deployment model and design BSS journey-to-cloud strategies engineered to accelerate ROI and performance. Good to have knowledge of platforms like AWS, Azure, SFDC, GCP, and ServiceNow.
- Business Strategy - lead/manage strategic initiatives: develop project plans, frame and conduct insightful analyses, identify solutions, and develop business cases and implementation plans for CSPs across the globe.
- Transformation & Project Governance - drive profitability and continued success through managing service quality, cost, and leadership of the people delivering services across projects/programs/portfolios of all scales.
- Understanding of lean concepts and hands-on experience delivering technology-driven business transformation projects using agile practices.
- Experience with agile tools like JIRA, Confluence, Rally, MS Project, or VersionOne.
- Professional certification in PSM/CSM/SAFe/ICP-ACC.

Skills in one or more roles
- Experience as a Functional Business Analyst, Product Owner, Process Designer, Service Designer, Scrum Master, or Program Delivery Manager.
- Business Analysis - gather requirements from the business and prepare requirement documents; propose solutions to the client based on gap analysis of existing telco platforms; analyse large datasets and present insights with visualisations.
- Process Improvement - understand issues with current processes that can be resolved through technology or process solutions, and design detailed to-be processes with all stakeholders.
- Value Architecting and Tracking - create value driver trees to break business objectives down into value components and value drivers.

Other Required Skills
- Communication and Presentation - plan and deliver well-structured oral and written communications.
- Structured Problem Solving - help identify and structure key client challenges into hypotheses and conduct analyses to address them.
- Stakeholder Management - manage mid-level to senior client leadership and lead conversations.
- Impeccable team management skills with an ability to engage effectively with multiple stakeholders.
- Strong program management skills.
- Cross-cultural competence with an ability to thrive in a dynamic environment.

Posted 2 days ago

Apply