Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
2.0 - 5.0 years
4 - 7 Lacs
Chennai
Hybrid
Work Mode: Hybrid | Interview Mode: Virtual (2 Rounds)
Key Skills & Responsibilities:
- Strong hands-on experience with Snowflake database design, coding, and documentation.
- Expertise in performance tuning for both Oracle and Snowflake.
- Experience as an Apps DBA, capable of coordinating with application teams.
- Proficiency in using OEM, Tuning Advisor, and analyzing AWR reports.
- Strong SQL skills with the ability to guide application teams on improvements.
- Efficient management of compute and storage in the Snowflake architecture.
- Execute administrative tasks, handle multiple Snowflake accounts, and apply best practices.
- Implement data governance via column-level security, dynamic masking, and RBAC (see the sketch below).
- Utilize Time Travel, Cloning, replication, and recovery methods.
- Manage DML/DDL operations, concurrency models, and security policies.
- Enable secure data sharing internally and externally.
Skills: Snowflake, Snowflake database design, Oracle, SQL, performance tuning, Apps DBA, OEM, Tuning Advisor, AWR report analysis, compute management, storage management, data governance, column-level security, dynamic masking, RBAC, Time Travel, cloning, replication, recovery methods, DML/DDL operations, concurrency models, security policies, secure data sharing, coding, documentation
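For illustration only, here is a minimal Python sketch of the governance and Time Travel tasks named above, using the Snowflake Python connector. The connection parameters, table, column, and role names are hypothetical placeholders rather than details from the posting, and masking policies assume a Snowflake edition that supports them (Enterprise or higher).

```python
import snowflake.connector

# Hypothetical connection details -- replace with a real account, credentials,
# and a role that actually holds the relevant governance privileges.
conn = snowflake.connector.connect(
    account="xy12345", user="dba_user", password="***",
    role="ACCOUNTADMIN", warehouse="ADMIN_WH", database="SALES", schema="PUBLIC",
)
cur = conn.cursor()

# Column-level security: mask email addresses for every role except PII_ADMIN.
cur.execute("""
    CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING) RETURNS STRING ->
        CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val ELSE '***MASKED***' END
""")
cur.execute("ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask")

# RBAC: grant read access on the table to an analyst role.
cur.execute("GRANT SELECT ON TABLE customers TO ROLE analyst")

# Time Travel: query the table as it looked one hour ago.
cur.execute("SELECT COUNT(*) FROM customers AT(OFFSET => -3600)")
print(cur.fetchone())

cur.close()
conn.close()
```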
Posted 1 day ago
2.0 - 5.0 years
4 - 7 Lacs
Bengaluru
Work from Office
Role 1: Snowflake Developer (Coding, Documentation)
Locations: Multiple locations (Bangalore, Hyderabad, Chennai, Kolkata, Mumbai, Pune, Gurugram)
Work Mode: Hybrid | Interview Mode: Virtual (2 Rounds) | Budget: 18L (max)
Key Skills & Responsibilities:
- Strong hands-on experience with Snowflake database design, coding, and documentation.
- Expertise in performance tuning for both Oracle and Snowflake.
- Experience as an Apps DBA, capable of coordinating with application teams.
- Proficiency in using OEM, Tuning Advisor, and analyzing AWR reports.
- Strong SQL skills with the ability to guide application teams on improvements.
- Efficient management of compute and storage in the Snowflake architecture.
- Execute administrative tasks, handle multiple Snowflake accounts, and apply best practices.
- Implement data governance via column-level security, dynamic masking, and RBAC.
- Utilize Time Travel, Cloning, replication, and recovery methods.
- Manage DML/DDL operations, concurrency models, and security policies.
- Enable secure data sharing internally and externally.
Skills: Snowflake, Snowflake database design, Oracle, SQL, performance tuning, Apps DBA, OEM, Tuning Advisor, AWR report analysis, compute management, storage management, data governance, column-level security, dynamic masking, RBAC, Time Travel, cloning, replication, recovery methods, DML/DDL operations, concurrency models, security policies, secure data sharing, coding, documentation
Posted 1 day ago
8.0 - 13.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Location: Bangalore, KA, IN
Company: Alstom

At Alstom, we understand transport networks and what moves people. From high-speed trains, metros, monorails, and trams to turnkey systems, services, infrastructure, signalling and digital mobility, we offer our diverse customers the broadest portfolio in the industry. Every day, 80,000 colleagues lead the way to greener and smarter mobility worldwide, connecting cities as we reduce carbon and replace cars.

Could you be the full-time **Data Solutions Manager** in **[Insert Location]** we're looking for? Take on a new challenge and apply your **data science and technical leadership** expertise in a cutting-edge field. You'll work alongside **collaborative and innovative** teammates. You'll play a pivotal role in shaping and sustaining advanced data solutions that drive our industrial programs. Day-to-day, you'll work closely with teams across the business (**engineering, IT, and program management**), **define and develop scalable data solutions**, and much more. You'll specifically take care of **designing production-grade, cyber-secure data solutions**, but also of **applying AI techniques to enhance data utilization for key indicators**.

We'll look to you for:
- Managing the team to ensure technical excellence and process adherence
- Designing scalable, multi-tenant data collectors and storage systems
- Building streaming and batch data processing pipelines
- Developing SQL and NoSQL data models
- Assessing and enhancing the quality of incoming data flows
- Applying advanced AI techniques and data management/security components
- Creating customizable analytical dashboards
- Evaluating opportunities presented by emerging technologies
- Implementing strong testing and quality assurance practices

All about you
We value passion and attitude over experience. That's why we don't expect you to have every single skill. Instead, we've listed some that we think will help you succeed and grow in this role:
- Engineering degree or equivalent
- 8+ years of experience in IT, digital companies, software, or startups
- Proficiency in data processing and software development using tools like QlikSense, PowerApps, Power BI, or Java/Scala
- Experience with Apache Spark and other data processing frameworks
- Strong statistical skills (e.g., probability theory, regression, hypothesis testing)
- Expertise in machine learning techniques and algorithms (e.g., Logistic Regression, Decision Trees, Clustering)
- Proficiency in data science methods (CRISP-DM, feature engineering, model evaluation)
- Experience with Python and R libraries (NumPy, Pandas, Scikit-learn)
- Deep knowledge of SQL database configuration (e.g., Postgres, MariaDB, MySQL)
- Familiarity with DevOps tools (e.g., Docker, Ansible) and version control (e.g., Git)
- Knowledge of cloud platforms (Azure, AWS, GCP) is a plus
- Understanding of network and security principles (e.g., SSL, certificates, IPSEC)
- Fluent in English; French is a plus

Things you'll enjoy
Join us on a life-long transformative journey: the rail industry is here to stay, so you can grow and develop new skills and experiences throughout your career.
You'll also:
- Enjoy stability, challenges, and a long-term career free from boring daily routines
- Work with cutting-edge security standards for rail data solutions
- Collaborate with transverse teams and supportive colleagues
- Contribute to innovative and impactful projects
- Utilise our **flexible and inclusive** working environment
- Steer your career in whatever direction you choose, across functions and countries
- Benefit from our investment in your development through award-winning learning opportunities
- Progress towards leadership roles in data science and digital transformation
- Benefit from a fair and dynamic reward package that recognises your performance and potential, plus comprehensive and competitive social coverage (life, medical, pension)

You don't need to be a train enthusiast to thrive with us. We guarantee that when you step onto one of our trains with your friends or family, you'll be proud. If you're up for the challenge, we'd love to hear from you!

As a global business, we're an equal-opportunity employer that celebrates diversity across the 63 countries we operate in. We're committed to creating an inclusive workplace for everyone.
Posted 1 day ago
12.0 - 17.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Date: 11 Feb 2025
Location: Bangalore, KA, IN
Company: Alstom

Leading societies to a low-carbon future, Alstom develops and markets mobility solutions that provide the sustainable foundations for the future of transportation. Our product portfolio ranges from high-speed trains, metros, monorail, and trams to integrated systems, customised services, infrastructure, signalling and digital mobility solutions. Joining us means joining a caring, responsible, and innovative company where more than 70,000 people lead the way to greener and smarter mobility, worldwide.

Qualifications & Skills:
- Process Manufacturing Engineering experience of 12+ years.
- Technical knowledge of Manufacturing Engineering, preparing manufacturing work instructions, line balancing and process routing.
- Experience working on DELMIA 16x and above for Bills of Materials, Routings and Work Instructions.
- Familiar with ENOVIA & CATIA V5/V6.
- Process FMEA, QRQC.

Key Responsibilities:
- Interface with key stakeholders from CDS / TDS sites.
- Understand the sites' WoW (Way of Working), methodologies and processes.
- Review all Technical Specifications & STD documents.
- Create / manage PLM templates and 3PL / part library within PLM.
- Create the Manufacturing BOM on the 3DExperience tool (in the Delmia environment).
- Map the product manufacturing process on the 3DExperience tool (in the Delmia environment).
- Create routings and work instructions on the 3DExperience tool (in the Delmia environment).
- Knowledge of time analysis (MTM, MEK, UAS) and technical drawings.
- Restructure / re-create manufacturing drawings with all required details and update title blocks as required.
- Ensure the final outputs (3D CAD, 2D drawings, engineering data and documents) are as per specifications.
- Be responsible for the Product Structure in 3DX.
- Generate / validate 2D and 3D in compliance with QCD commitments, Métier rules and processes.
- Verify consistency of the digital mock-up vs. legacy.
- Engineering documents and deliverables attached in the Product Structure.
- Product baseline and configuration.
- Deliver a weekly progress report to CDS / TDS and secure E-BOM validation at the end of reconstruction.

Alstom is the leading company in the mobility sector, solving the most interesting challenges for tomorrow's mobility. That's why we value inquisitive and innovative people who are passionate about working together to reinvent mobility, making it smarter and more sustainable. Day after day, we are building an agile, inclusive and responsible culture, where a diverse group of people are offered opportunities to learn, grow and advance in their careers, with options across functions and geographic locations. Are you ready to join a truly international community of great people on a challenging journey with a tangible impact and purpose?

Equal opportunity statement: Alstom is an equal opportunity employer committed to creating an inclusive working environment where all our employees are encouraged to reach their full potential, and individual differences are valued and respected. All qualified applicants are considered for employment without regard to race, colour, religion, gender, sexual orientation, gender identity, age, national origin, disability status, or any other characteristic protected by local law.
Posted 1 day ago
0 years
0 Lacs
Mysore, Karnataka, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role and Responsibilities
As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform.

Preferred Education: Master's Degree

Required Technical and Professional Expertise
- Strong and proven background in Information Technology and working knowledge of .NET Core, C#, REST API, LINQ, Entity Framework, and XUnit; troubleshooting issues related to code performance.
- Working knowledge of Angular 15 or later, TypeScript, the Jest framework, HTML 5 and CSS 3, and MS SQL databases; troubleshooting issues related to DB performance.
- Good understanding of CQRS, mediator, and repository patterns.
- Good understanding of CI/CD pipelines and SonarQube, plus messaging and reverse proxies.

Preferred Technical and Professional Experience
- Good understanding of AuthN and AuthZ techniques (Windows, Basic, JWT).
- Good understanding of Git and its processes, such as pull requests, merges, pulls, and commits.
- Methodology skills such as Agile, TDD, and UML.
Posted 1 day ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Our Company
Changing the world through digital experiences is what Adobe's all about. We give everyone—from emerging artists to global brands—everything they need to design and deliver exceptional digital experiences! We're passionate about empowering people to create beautiful and powerful images, videos, and apps, and transform how companies interact with customers across every screen. We're on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours!

The Opportunity
Use your expertise in data science engineering to drive the next stage of growth at Adobe. The Customer Analytics & GTM team is focused on using the power of data to deliver optimized experiences through personalization. This role will drive data engineering for large-scale data science initiatives across a wide variety of strategic projects. As a member of the Data Science Engineering team, you will have significant responsibility for helping to build a large-scale, cloud-based data and analytics platform with enterprise-wide consumers. This role is inherently multi-functional, and the ideal candidate will work across teams. The position requires the ability to take ownership, come up with innovative solutions, try new tools and technologies, and bring an entrepreneurial mindset. Come join us for a truly exciting career, great benefits and outstanding work-life balance.

What You Will Do
- Build fault-tolerant, scalable, quality data pipelines using multiple cloud-based tools.
- Develop analytical and personalization capabilities using pioneering technologies, bringing Adobe tools to bear.
- Build LLM agents to optimize and automate data pipelines following best engineering practices.
- Deliver end-to-end data pipelines to run machine learning models in a production platform.
- Deliver innovative solutions that help the broader organization take meaningful action quickly and efficiently.
- Contribute to data engineering and data science frameworks, tools, and processes.
- Implement outstanding data operations and standard methodologies to use resources optimally.
- Architect data ingestion, data transformation, data consumption, and data governance frameworks.
- Help build production-grade ML models and their integration with operational systems.
- This is a high-visibility role for a team on a critical mission to stop software piracy. A lot of collaboration with global multi-functional operations teams is required to onboard customers to genuine software.
- Work in a collaborative environment and contribute to the team's as well as the organization's success.

What You Will Need
Bachelor's degree in computer science or equivalent. Master's degree or equivalent experience is preferred.
- 5-8 years of consistent track record as a data engineer.
- At least 2+ years of demonstrable experience and a proven track record with the mobile data ecosystem is a must: App Store Optimization (ASO); third-party systems like Branch, RevenueCat, and Google and Apple APIs; building data pipelines for in-app purchases, paywall impressions and tracking, app crashes, etc.
- 5+ years of validated ability in distributed data technologies, e.g., Hadoop, Hive, Presto, Spark, etc.
- 3+ years of experience with cloud-based technologies: Databricks, S3, Azure Blob Storage, Notebooks, AWS EMR, Athena, Glue, etc.
- Familiarity with and usage of different file formats in batch/streaming processing, i.e., Delta/Parquet/ORC, etc.
- 2+ years' experience with streaming data ingestion and transformation using Kafka, Kinesis, etc.
- Outstanding SQL experience; ability to write optimized SQL across platforms.
- Proven hands-on experience in Python/PySpark/Scala and the ability to manipulate data using Pandas, NumPy, Koalas, etc., and to transfer data using APIs.
- Experience working as an architect to design large-scale distributed data platforms.
- Experience with CI/CD tools, i.e., GitHub, Jenkins, etc.
- Working experience with open-source orchestration tools, i.e., Apache Airflow / Azkaban, etc.
- Teammate with excellent communication/teamwork skills when it comes to working closely with data scientists and machine learning engineers daily.
- Hands-on work experience with the Elastic Stack (Elasticsearch, Logstash, Kibana) and graph databases (Neo4j, Neptune, etc.) is highly desired.
- Work experience with ML algorithms and frameworks, i.e., Keras, TensorFlow, PyTorch, XGBoost, Linear Regression, Classification, Random Forest, Clustering, MLflow, etc.

Nice to have
- Showcase your work if you are an open-source contributor; passion for contributing to the open-source community is highly valued.
- Experience with data governance tools (e.g., Collibra) and collaboration tools (e.g., JIRA/Confluence).
- Familiarity with Adobe tools like Adobe Experience Platform, Adobe Analytics, Customer Journey Analytics, and Adobe Journey Optimizer is a plus.
- Experience with LLM models / agentic workflows using Copilot, Claude, LLAMA, Databricks Genie, etc. is highly preferred.

Adobe is proud to be an Equal Employment Opportunity and affirmative action employer. We do not discriminate based on gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. Learn more about our vision here. Adobe values a free and open marketplace for all employees and has policies in place to ensure that we do not enter into illegal agreements with other companies to not recruit or hire each other's employees. Adobe aims to make Adobe.com accessible to any and all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015.
Posted 1 day ago
7.0 - 10.0 years
0 Lacs
Gurugram, Haryana, India
Remote
Job Description

Introduction: A Career at HARMAN Digital Transformation Solutions (DTS)
We're a global, multi-disciplinary team that's putting the innovative power of technology to work and transforming tomorrow. At HARMAN DTS, you solve challenges by creating innovative solutions.
- Combine the physical and digital, making technology a more dynamic force to solve challenges and serve humanity's needs
- Work at the convergence of cross-channel UX, cloud, insightful data, IoT and mobility
- Empower companies to create new digital business models, enter new markets, and improve customer experiences

About the Role
This role requires big data system architecture skills to design and implement scalable processing and storage of large datasets.

What You Will Do
- Apply big data system architecture skills to design and implement scalable processing and storage of large datasets.
- Experience with Scala/Python programming languages.
- Experience working with AWS services: S3, Lambda functions and others.
- Understanding of Apache Spark fundamentals; experience working with Hive.
- Extensive knowledge of Spark internals, ensuring stability and troubleshooting performance issues.
- Knowledge of Elasticsearch queries via programmatic and console methods.
- Experience with streaming frameworks such as Apache Kafka, including maintenance and ensuring reliability of messaging queues.
- Containerization of applications using Docker; deployment and management of applications on Kubernetes.

What You Need
- Bachelor's or master's degree in computer science or a related field.
- 7-10 years of relevant and proven experience.
- Experience working in cross-functional teams and collaborating effectively with different stakeholders.
- Strong problem-solving and analytical skills.
- Excellent communication skills to document and present technical concepts clearly.

What Is Nice To Have
- Bachelor's or master's degree in computer science or a related field.
- 5-10 years of relevant and proven experience in OAM development.

What Makes You Eligible
- Any offer of employment is conditioned upon the successful completion of a background investigation and drug screen.
- Dedicated performer and team player with the ability to advocate appropriately for product quality.
- Relentless learner with a dedication to learning new technologies and test methods.
- Self-driven and innovative, to drive continuous improvements in the test process.
- Resourcefulness in triaging problems and coordinating with multiple teams for issue resolution.
- Strong written and verbal communication and interpersonal skills.

What We Offer
- Flexible work environment, allowing for full-time remote work globally for positions that can be performed outside a HARMAN or customer location
- Access to employee discounts on world-class HARMAN and Samsung products (JBL, Harman Kardon, AKG, etc.)
- Extensive training opportunities through our own HARMAN University
- Competitive wellness benefits
- Tuition reimbursement
- "Be Brilliant" employee recognition and rewards program
- An inclusive and diverse work environment that fosters and encourages professional and personal development

You Belong Here
HARMAN is committed to making every employee feel welcomed, valued, and empowered. No matter what role you play, we encourage you to share your ideas, voice your distinct perspective, and bring your whole self with you – all within a support-minded culture that celebrates what makes each of us unique. We also recognize that learning is a lifelong pursuit and want you to flourish.
We proudly offer added opportunities for training, development, and continuing education, further empowering you to live the career you want.

About HARMAN: Where Innovation Unleashes Next-Level Technology
Ever since the 1920s, we've been amplifying the sense of sound. Today, that legacy endures, with integrated technology platforms that make the world smarter, safer, and more connected. Across automotive, lifestyle, and digital transformation solutions, we create innovative technologies that turn ordinary moments into extraordinary experiences. Our renowned automotive and lifestyle solutions can be found everywhere, from the music we play in our cars and homes to venues that feature today's most sought-after performers, while our digital transformation solutions serve humanity by addressing the world's ever-evolving needs and demands. Marketing our award-winning portfolio under 16 iconic brands, such as JBL, Mark Levinson, and Revel, we set ourselves apart by exceeding the highest engineering and design standards for our customers, our partners and each other. If you're ready to innovate and do work that makes a lasting impact, join our talent community today!
Posted 1 day ago
6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Company Description
Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams. The team brings rich industry knowledge to Marketplace's global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel.

Job Description
The Role
We're hiring a Data Engineering Lead to help scale and guide a growing team of data engineers. This role is ideal for someone who enjoys solving technical challenges hands-on while also shaping engineering best practices, coaching others, and helping cross-functional teams deliver data products with clarity and speed. You'll manage a small team of ICs responsible for building and maintaining pipelines that support reporting, analytics, and machine learning use cases. You'll be expected to drive engineering excellence — from code quality to deployment hygiene — and play a key role in sprint planning, architectural discussions, and stakeholder collaboration. This is a critical leadership role as our data organization expands to meet growing demand across media performance, optimization, customer insights, and advanced analytics.

What You'll Do
- Lead and grow a team of data engineers working across ETL/ELT, data warehousing, and ML-enablement
- Own team delivery across sprints, including planning, prioritization, QA, and stakeholder communication
- Set and enforce strong engineering practices around code reviews, testing, observability, and documentation
- Collaborate cross-functionally with Analytics, BI, Revenue Operations, and business stakeholders in Marketing and Sales
- Guide technical architecture decisions for our pipelines on GCP (BigQuery, GCS, Composer)
- Model and transform data using dbt and SQL, supporting reporting, attribution, and optimization needs
- Ensure data security, compliance, and scalability — especially around first-party customer data
- Mentor junior engineers through code reviews, pairing, and technical roadmap discussions

What You Bring
- 6+ years of experience in data engineering, including 2+ years of people management or formal team leadership
- Strong technical background with Python, Spark, Kafka, and orchestration tools like Airflow
- Deep experience working in GCP, especially BigQuery, GCS, and Composer
- Strong SQL skills and familiarity with dbt for modeling and documentation
- Clear understanding of data privacy and governance, including how to safely manage and segment first-party data
- Experience working in agile environments, including sprint planning and ticket scoping
- Excellent communication skills and proven ability to work cross-functionally across global teams.
Nice to have
- Experience leading data engineering teams in digital media or performance marketing environments
- Familiarity with data from Google Ads, Meta, TikTok, Taboola, Outbrain, and Google Analytics (GA4)
- Exposure to BI tools like Tableau or Looker
- Experience collaborating with data scientists on ML workflows and experimentation platforms
- Knowledge of data contracts, schema versioning, or platform ownership patterns

Perks:
- Day off on the 3rd Friday of every month (one long weekend each month)
- Monthly Wellness Reimbursement Program to promote health and well-being
- Monthly Office Commutation Reimbursement Program
- Paid paternity and maternity leaves
Posted 1 day ago
15.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
dunnhumby is the global leader in Customer Data Science, empowering businesses everywhere to compete and thrive in the modern data-driven economy. We always put the Customer First. Our mission: to enable businesses to grow and reimagine themselves by becoming advocates and champions for their Customers. With deep heritage and expertise in retail – one of the world's most competitive markets, with a deluge of multi-dimensional data – dunnhumby today enables businesses all over the world, across industries, to be Customer First. dunnhumby employs nearly 2,500 experts in offices throughout Europe, Asia, Africa, and the Americas working for transformative, iconic brands such as Tesco, Coca-Cola, Meijer, Procter & Gamble and Metro.

We're looking for a Senior Engineering Manager who expects more from their career. It's a chance to extend and improve dunnhumby's Software Engineering Department. It's an opportunity to work with a market-leading business to explore new opportunities for us and influence global retailers. As a Senior Engineering Manager, you will be responsible for leading and inspiring multiple engineering teams to deliver high-quality, innovative software products that drive business growth. You will set the technical direction, build high-performing teams, and foster a culture of engineering excellence.

Required Skills
- 15+ years of experience in software engineering, with at least 3+ years leading global teams.
- Proven experience as an Engineering Manager (Senior), Lead Engineer, or Tech Manager, managing complex engineering projects.
- Strong expertise in distributed systems, cloud architecture (GCP & Azure), microservices, API design, and scalable platform engineering.
- In-depth knowledge and hands-on experience with .NET, Python, Spark, Git (GitLab), Docker, Kubernetes and cloud development (GCP & Azure).
- Experience working with JavaScript (React, Angular, etc.).
- Strong knowledge of DevOps, CI/CD pipelines, observability, and cloud security best practices.
- Ability to drive engineering strategy, process improvements, and high-velocity agile execution.
- Experience hiring, mentoring, and leading global teams across multiple time zones.
- Excellent stakeholder management, communication, and decision-making skills, working cross-functionally with PMs, UX, and Business Leaders.
- Passion for continuous learning, innovation, and staying ahead of technology trends.

What You Can Expect From Us
We won't just meet your expectations. We'll defy them. So you'll enjoy the comprehensive rewards package you'd expect from a leading technology company. But also, a degree of personal flexibility you might not expect. Plus, thoughtful perks, like flexible working hours and your birthday off. You'll also benefit from an investment in cutting-edge technology that reflects our global ambition. But with a nimble, small-business feel that gives you the freedom to play, experiment and learn. And we don't just talk about diversity and inclusion. We live it every day – with thriving networks including dh Gender Equality Network, dh Proud, dh Family, dh One, dh Enabled and dh Thrive as the living proof. We want everyone to have the opportunity to shine and perform at their best throughout our recruitment process. Please let us know how we can make this process work best for you.
Our approach to Flexible Working
At dunnhumby, we value and respect difference and are committed to building an inclusive culture by creating an environment where you can balance a successful career with your commitments and interests outside of work. We believe that you will do your best at work if you have a work / life balance. Some roles lend themselves to flexible options more than others, so if this is important to you please raise this with your recruiter, as we are open to discussing agile working opportunities during the hiring process. For further information about how we collect and use your personal information please see our Privacy Notice which can be found (here).
Posted 1 day ago
10.0 years
0 Lacs
Greater Kolkata Area
On-site
Join our Team

About this opportunity:
We are seeking a highly skilled, hands-on AI Architect - GenAI to lead the design and implementation of production-grade, cloud-native AI and NLP solutions that drive business value and enhance decision-making processes. The ideal candidate will have a robust background in machine learning, generative AI, and the architecture of scalable production systems. As an AI Architect, you will play a key role in shaping the direction of advanced AI technologies and leading teams in the development of cutting-edge solutions.

What you will do:
- Architect and design AI and NLP solutions to address complex business challenges and support strategic decision-making.
- Lead the design and development of scalable machine learning models and applications using Python, Spark, NoSQL databases, and other advanced technologies.
- Spearhead the integration of generative AI techniques in production systems to deliver innovative solutions such as chatbots, automated document generation, and workflow optimization.
- Guide teams in conducting comprehensive data analysis and exploration to extract actionable insights from large datasets, ensuring these findings are communicated effectively to stakeholders.
- Collaborate with cross-functional teams, including software engineers and data engineers, to integrate AI models into production environments, ensuring scalability, reliability, and performance.
- Stay at the forefront of advancements in AI, NLP, and generative AI, incorporating emerging methodologies into existing models and developing new algorithms to solve complex challenges.
- Provide thought leadership on best practices for AI model architecture, deployment, and continuous optimization.
- Ensure that AI solutions are built with scalability, reliability, and compliance in mind.

The skills you bring:
- Minimum of 10+ years of experience in AI, machine learning, or a similar role, with a proven track record of delivering AI-driven solutions.
- Hands-on experience in designing and implementing end-to-end GenAI-based solutions, particularly in chatbots, document generation, workflow automation, and other generative use cases (see the sketch below).
- Expertise in Python programming and extensive experience with AI frameworks and libraries such as TensorFlow, PyTorch, scikit-learn, and vector databases.
- Deep understanding of and experience with distributed data processing using Spark.
- Proven experience in architecting, deploying, and optimizing machine learning models in production environments at scale.
- Expertise in working with open-source generative AI models (e.g., GPT-4, Mistral, Code Llama, StarCoder) and applying them to real-world use cases.
- Expertise in designing cloud-native architectures and microservices for AI/ML applications.

Why join Ericsson?
At Ericsson, you'll have an outstanding opportunity. The chance to use your skills and imagination to push the boundaries of what's possible. To build solutions never seen before to some of the world's toughest problems. You'll be challenged, but you won't be alone. You'll be joining a team of diverse innovators, all driven to go beyond the status quo to craft what comes next.

What happens once you apply?
Click Here to find all you need to know about what our typical hiring process looks like.

Encouraging a diverse and inclusive organization is core to our values at Ericsson; that's why we champion it in everything we do. We truly believe that by collaborating with people with different experiences we drive innovation, which is essential for our future growth.
We encourage people from all backgrounds to apply and realize their full potential as part of our Ericsson team. Ericsson is proud to be an Equal Opportunity Employer. Learn more.

Primary country and city: India (IN) || Kolkata
Req ID: 763161
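As a loose illustration of the document-generation use case named in the skills list above, here is a minimal Hugging Face transformers sketch. The model (distilgpt2) is a small placeholder chosen only so the example runs locally; a production chatbot or document-generation solution of the kind described would pair a far more capable instruction-tuned open model with retrieval, guardrails, and evaluation.

```python
from transformers import pipeline

# Small placeholder model so the sketch runs on a laptop; swap in an
# instruction-tuned open model (e.g. a Mistral variant) for real use.
generator = pipeline("text-generation", model="distilgpt2")

prompt = "Summary of the incident report: the ingestion job failed because"
outputs = generator(prompt, max_new_tokens=40, num_return_sequences=1, do_sample=True)

# Each output is a dict containing the full generated text, prompt included.
print(outputs[0]["generated_text"])
```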
Posted 1 day ago
6.0 - 11.0 years
15 - 20 Lacs
Chennai, Bengaluru
Hybrid
Total Experience: 6+ years
- 3+ years of experience in data engineering, preferably with real-time systems.
- Proficient with Python, SQL, and distributed data systems (Kinesis, Spark, Flink, etc.).
- Strong understanding of event-driven architectures, data lakes, and message serialization.
- Experience with sensor data processing, telemetry ingestion, or mobility data is a plus.
- Familiarity with Docker, CI/CD, Kubernetes, and cloud-native architectures.
- Familiarity with building data pipelines and their workflows (e.g., Airflow; see the sketch below).
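For illustration, a minimal Airflow 2.x DAG sketch of the kind of ingest-then-transform workflow mentioned above. The DAG id, task names, and callables are hypothetical placeholders, and the task bodies are stubs rather than a real Kinesis/Spark integration.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_events(**context):
    # Stub: in a real pipeline this would pull a batch of telemetry records
    # from Kinesis/Kafka and land them in raw storage.
    print("ingesting telemetry batch for", context["ds"])


def transform_events(**context):
    # Stub: deserialize, validate, and write curated records into the data lake.
    print("transforming telemetry batch for", context["ds"])


with DAG(
    dag_id="telemetry_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    ingest = PythonOperator(task_id="ingest_events", python_callable=ingest_events)
    transform = PythonOperator(task_id="transform_events", python_callable=transform_events)

    ingest >> transform
```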
Posted 1 day ago
10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description
AWS Sales, Marketing, and Global Services (SMGS) is responsible for driving revenue, adoption, and growth from the largest and fastest growing small- and mid-market accounts to enterprise-level customers including public sector. Amazon has built a global reputation for being the most customer-centric company, a company that customers from all over the world recognize, value, and trust for both our products and services. Amazon has a fast-paced environment where we "Work Hard, Have Fun and Make History."

As an increasing number of enterprises move their critical systems to the cloud, AWS India is in need of highly efficient technical consulting talent to help our largest and strategically important customers navigate the operational challenges and complexities of AWS Cloud. We are looking for Technical Consultants to support our customers' creative and transformative spirit of innovation across all technologies, including Compute, Storage, Database, Data Analytics, Application Services, Networking, Serverless and more. This is not a sales role, but rather an opportunity to be the principal technical advisor for organizations ranging from start-ups to large enterprises.

As a Technical Account Manager, you will be the primary technical point of contact for one or more customers, helping to plan, debug, and oversee ongoing operations of business-critical applications. You will get your hands dirty troubleshooting application, network, database, and architectural challenges using a suite of internal AWS Cloud tools as well as your existing knowledge and toolkits. We are seeking individuals with strong backgrounds in IT consulting and in related areas such as solution design, application and system development, database management, big data and analytics, DevOps consulting, and media technologies. Knowledge of programming and scripting is beneficial to the role.

Key job responsibilities
Every day will bring new and exciting challenges on the job while you:
- Learn and use Cloud technologies.
- Interact with leading technologists around the world.
- Work on critical, highly complex customer problems that may span multiple AWS Cloud services.
- Apply advanced troubleshooting techniques to provide unique solutions to our customers' individual needs.
- Work directly with AWS Cloud subject matter experts to help reproduce and resolve customer issues.
- Write tutorials, how-to videos, and other technical articles for the customer community.
- Leverage your extensive customer support experience and provide feedback to internal AISPL teams on how to improve our services.
- Drive projects that improve support-related processes and our customers' technical support experience.
- Assist in design/architecture of AWS and hybrid cloud solutions.
- Help enterprises define IT and business processes that work well with cloud deployments.
- Be available outside of business hours to help coordinate the handling of urgent issues as needed.

A day in the life
A TAM's daily activities involve managing complex technical and critical service events while serving as the principal technical advisor for enterprise customers. They spend their time partnering with customers to optimize AWS usage, tracking operational issues, managing feature requests and launches, while also working directly with internal AWS teams to exceed customer expectations.
As a trusted advisor, they provide strategic technical guidance to help plan and build solutions using best practices, while keeping their customers' AWS environments operationally healthy.

About The Team
Diverse Experiences
AWS values diverse experiences. Even if you do not meet all of the preferred qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn't followed a traditional path, or includes alternative experiences, don't let it stop you from applying.

Why AWS?
Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating — that's why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses.

Inclusive Team Culture
Here at AWS, it's in our nature to learn and be curious. Our employee-led affinity groups foster a culture of inclusion that empowers us to be proud of our differences. Ongoing events and learning experiences, including our Conversations on Race and Ethnicity (CORE) and AmazeCon (diversity) conferences, inspire us to never stop embracing our uniqueness.

Mentorship & Career Growth
We're continuously raising our performance bar as we strive to become Earth's Best Employer. That's why you'll find endless knowledge-sharing, mentorship and other career-advancing resources here to help you develop into a better-rounded professional.

Work/Life Balance
We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there's nothing we can't achieve in the cloud.

Basic Qualifications
- Bachelor's Degree in Computer Science, IT, Math, or related discipline required, or equivalent work experience.
- 10+ years of hands-on Infrastructure / Troubleshooting / Systems Administration / Networking / DevOps / Applications Development experience in a distributed systems environment.
- External enterprise customer-facing experience as a technical lead, with strong oral and written communication skills, presenting to both large and small audiences.
- Be mobile and travel to client locations as needed.

Preferred Qualifications
- Experience in a 24x7 operational services or support environment.
- Advanced experience in one or more of the following areas: Software Design or Development, Content Distribution/CDN, Scripting/Automation, Database Architecture, Cloud Architecture, Cloud Migrations, IP Networking, IT Security, Big Data/Hadoop/Spark, Operations Management, Service Oriented Architecture, etc.
- Experience with AWS Cloud services and/or other Cloud offerings.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - AWS India - Delhi
Job ID: A2989844
Posted 1 day ago
12.0 - 15.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Our Company
Changing the world through digital experiences is what Adobe's all about. We give everyone—from emerging artists to global brands—everything they need to design and deliver exceptional digital experiences! We're passionate about empowering people to create beautiful and powerful images, videos, and apps, and transform how companies interact with customers across every screen. We're on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours!

The Opportunity
Join Adobe in the heart of Bangalore, where brand-new engineering meets outstanding innovation. As a Software Development Engineer, you will play a pivotal role in crafting the future of digital experiences. This is an outstanding opportunity to develop groundbreaking systems and services as part of a multifaceted and ambitious team made up of machine learning engineers, data engineers and front-end engineers. Your work will be instrumental in delivering powerful technology that empowers users globally. You will be an experienced backend engineer for the AI/ML, Data Platform, Search and Recommendations teams of the Adobe Learning Manager.

What You'll Do
- Build Java-based services to power APIs for search, recommendations, AI assistants, reporting and analytics.
- Build backend systems such as indexing pipelines for search and vector datastores.
- Build horizontally scalable data pipelines.
- Provide technical leadership for the design and architecture of systems which are a blend of data, ML and services stacks.
- Work closely with machine learning scientists, data engineers, UX designers and product managers to develop solutions across search, recommendations, AI assistants and data engineering.
- Integrate Natural Language Processing (NLP) capabilities into the stack.
- Perform analysis and present key findings, insights and concepts to key influencers and leaders, and contribute to building the product roadmap.
- Deliver highly reliable services with great quality and operational excellence.

What you need to succeed
- A Bachelor's degree in Computer Science or relevant streams.
- 12 to 15 years of relevant experience.
- At least 5 years of hands-on experience building micro-services and REST APIs using Java.
- At least 5 years of hands-on experience building data pipelines using big data technologies such as Hadoop, Spark or Storm.
- Strong hands-on experience with RDBMS and NoSQL databases.
- Strong grasp of the fundamentals of web services and distributed computing.
- Strong background in data engineering and hands-on experience with big data technologies.
- Strong analytical and problem-solving skills.
- Hands-on experience with Python, Elasticsearch, Spark and Kafka would be a plus.
- Hands-on experience rolling out AI- and ML-based solutions would be a plus.
- Enthusiastic about technological trends and eager to innovate.
- Ability to quickly ramp up on new technologies.
- Proven track record of engineering-generalist resourcefulness.

Adobe is proud to be an Equal Employment Opportunity employer. We do not discriminate based on gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. Learn more about our vision here. Adobe aims to make Adobe.com accessible to any and all users.
If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015.
Posted 1 day ago
2.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Us
InMobi is the leading provider of content, monetization, and marketing technologies that fuel growth for industries around the world. Our end-to-end advertising software platform, connected content, and commerce experiences activate audiences, drive real connections, and diversify revenue for businesses everywhere.

InMobi Advertising is an end-to-end advertising platform that helps advertisers drive real connections with consumers. We drive customer growth by helping businesses understand, engage, and acquire consumers effectively through data-driven media solutions. Learn more at advertising.inmobi.com.

Glance is a consumer technology company that operates disruptive digital platforms, including Glance, Roposo, and Nostra. Glance's smart lockscreen and TV experience inspires consumers to make the most of every moment by surfing relevant content without the need for searching and downloading apps. Glance is currently available on over 450 million smartphones and televisions worldwide. Learn more at glance.com.

Born in India, InMobi maintains a large presence in Bangalore and San Mateo, CA, and has operations in New York, Singapore, Delhi, Mumbai, Beijing, Shanghai, Jakarta, Manila, Kuala Lumpur, Sydney, Melbourne, Seoul, Tokyo, London, and Dubai. To learn more, visit inmobi.com.

What is the team like?
InMobi Exchange is one of the world's leading advertising platforms, handling ~2M ad requests per second and serving both publisher and advertiser needs end to end. The Ad-Serving team is responsible for building a cutting-edge ad machine which finds and serves the best-fitting ad to the end user. As a core member, your code and systems will directly impact revenue daily.

What do we expect from you?
- Experience: 2-5 years of development experience.
- Education: B.E. / B.Tech in Computer Science or equivalent.
- Strong development and coding experience in one or more programming languages like OO programming (Java), Scala, Spark, Python.
- Expertise in data structures, algorithms, and concurrency.
- Experience in microservices architecture, multi-threading, performance-oriented programming and design skills.
- Good organization, communication and interpersonal skills.
- Must be a proven performer and team player who enjoys challenging assignments in a high-energy, fast-growing, start-up workplace.
- Must be a self-starter who can work well with minimal guidance and in a fluid environment.
- Provides good attention to detail.
- Must be excited by the challenges surrounding the development of a highly scalable and distributed system for building audience-targeting capabilities.
- Agility and ability to adapt quickly to changing requirements, scope and priorities.

Nice To Have Skills
- Experience in the online advertising domain.
- Experience working on massively large-scale data systems in production environments.
- Experience in leveraging user data for behavioral targeting and ad relevance.
- Experience in the big data analytics domain.
- Experience building products that are powered by data and insights.
- Experience hosting and deploying applications on public clouds like Microsoft Azure, GCP, AWS.

The InMobi Culture
At InMobi, culture isn't a buzzword; it's an ethos woven by every InMobian, reflecting our diverse backgrounds and experiences. We thrive on challenges and seize every opportunity for growth. Our core values of thinking big, being passionate, showing accountability, and taking ownership with freedom guide us in every decision we make.
We believe in nurturing and investing in your development through continuous learning and career progression with our InMobi Live Your Potential program. InMobi is proud to be an Equal Employment Opportunity employer and we make reasonable accommodations for qualified individuals with disabilities. Visit https://www.inmobi.com/company/careers to better understand our benefits, values, and more!
Posted 1 day ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Description
Alimentation Couche-Tard Inc. (ACT) is a global Fortune 200 company and a leader in the convenience store and fuel space, with over 17,000 stores in 31 countries serving more than 6 million customers each day.

It is an exciting time to be a part of the growing Data Engineering team at Circle K. We are driving a well-supported cloud-first strategy to unlock the power of data across the company and help teams to discover, value and act on insights from data across the globe. With our strong data pipeline, this position will play a key role partnering with our Technical Development stakeholders to enable analytics for long-term success.

About The Role
We are looking for a Senior Data Engineer with a collaborative, "can-do" attitude who is committed and strives with determination and motivation to make their team successful: a Sr. Data Engineer who has experience architecting and implementing technical solutions as part of a greater data transformation strategy. This role is responsible for hands-on sourcing, manipulation, and delivery of data from enterprise business systems to the data lake and data warehouse. This role will help drive Circle K's next phase in the digital journey by modeling and transforming data to achieve actionable business outcomes. The Sr. Data Engineer will create, troubleshoot and support ETL pipelines and the cloud infrastructure involved in the process, and will be able to support the visualizations team.

Roles and Responsibilities
- Collaborate with business stakeholders and other technical team members to acquire and migrate data sources that are most relevant to business needs and goals.
- Demonstrate deep technical and domain knowledge of relational and non-relational databases, data warehouses, and data lakes, among other structured and unstructured storage options.
- Determine solutions that are best suited to develop a pipeline for a particular data source.
- Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development.
- Efficient ETL/ELT development using Azure cloud services and Snowflake, testing, and the operation/support process (RCA of production issues, code/data fix strategy, monitoring and maintenance).
- Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines for scalable analytics delivery.
- Provide clear documentation for delivered solutions and processes, integrating documentation with the appropriate corporate stakeholders.
- Identify and implement internal process improvements for data management (automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability).
- Stay current with and adopt new tools and applications to ensure high-quality and efficient solutions.
- Build a cross-platform data strategy to aggregate multiple sources and process development datasets.
- Be proactive in stakeholder communication; mentor/guide junior resources through regular KT/reverse KT, help them identify production bugs/issues if needed, and provide resolution recommendations.

Job Requirements
- Bachelor's Degree in Computer Engineering, Computer Science or related discipline; Master's Degree preferred.
- 5+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional data warehousing environment.
- 5+ years of experience with setting up and operating data pipelines using Python or SQL.
- 5+ years of advanced SQL programming: PL/SQL, T-SQL.
- 5+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization.
- Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads.
- 5+ years of strong and extensive hands-on experience in Azure, preferably data-heavy / analytics applications leveraging relational and NoSQL databases, Data Warehouse and Big Data.
- 5+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure Functions (a minimal pipeline sketch follows below).
- 5+ years of experience in defining and enabling data quality standards for auditing and monitoring.
- Strong analytical abilities and a strong intellectual curiosity.
- In-depth knowledge of relational database design, data warehousing and dimensional data modeling concepts.
- Understanding of REST and good API design.
- Experience working with Apache Iceberg, Delta tables and distributed computing frameworks.
- Strong collaboration and teamwork skills and excellent written and verbal communication skills.
- Self-starter and motivated, with the ability to work in a fast-paced development environment. Agile experience highly desirable.
- Proficiency in the development environment, including IDE, database server, Git, continuous integration, unit-testing tool, and defect management tools.

Knowledge
- Strong knowledge of Data Engineering concepts (data pipeline creation, data warehousing, data marts/cubes, data reconciliation and audit, data management).
- Strong working knowledge of Snowflake, including warehouse management, Snowflake SQL, and data sharing techniques.
- Experience building pipelines that source from or deliver data into Snowflake in combination with tools like ADF and Databricks.
- Working knowledge of DevOps processes (CI/CD), the Git/Jenkins version control tools, Master Data Management (MDM) and Data Quality tools.
- Strong experience in ETL/ELT development, QA and the operation/support process (RCA of production issues, code/data fix strategy, monitoring and maintenance).
- Hands-on experience with databases (Azure SQL DB, MySQL, Cosmos DB, etc.), file systems (Blob Storage), and Python/Unix shell scripting.
- ADF, Databricks and Azure certification is a plus.

Technologies we use: Databricks, Azure SQL DW/Synapse, Azure Tabular, Azure Data Factory, Azure Functions, Azure Containers, Docker, DevOps, Python, PySpark, Scripting (PowerShell, Bash), Git, Terraform, Power BI, Snowflake
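As a rough sketch of the kind of pipeline step described above (cleanse a raw extract and land it for a downstream load via ADF, Databricks, or a Snowflake COPY INTO), here is a minimal PySpark example. Paths and column names are hypothetical placeholders, not details from the posting.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales_daily_load").getOrCreate()

# Hypothetical landing path -- in practice this would be a Blob Storage / ADLS URI.
raw = spark.read.option("header", True).csv("/landing/sales/2024-06-01/*.csv")

cleaned = (
    raw.dropDuplicates(["transaction_id"])                      # de-dupe on business key
       .withColumn("amount", F.col("amount").cast("double"))    # enforce numeric type
       .withColumn("load_date", F.to_date(F.col("transaction_ts")))
       .filter(F.col("amount").isNotNull())                     # drop unparseable rows
)

# Land curated data partitioned by date; a downstream task (ADF copy activity or
# Snowflake COPY INTO) would pick it up from here.
cleaned.write.mode("overwrite").partitionBy("load_date").parquet("/curated/sales/")
spark.stop()
```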
Posted 1 day ago
3.0 years
0 Lacs
India
On-site
About Oportun
Oportun (Nasdaq: OPRT) is a mission-driven fintech that puts its 2.0 million members' financial goals within reach. With intelligent borrowing, savings, and budgeting capabilities, Oportun empowers members with the confidence to build a better financial future. Since inception, Oportun has provided more than $16.6 billion in responsible and affordable credit, saved its members more than $2.4 billion in interest and fees, and helped its members save an average of more than $1,800 annually. Oportun has been certified as a Community Development Financial Institution (CDFI) since 2009.
Working At Oportun
Working at Oportun means enjoying a differentiated experience of being part of a team that fosters a diverse, equitable and inclusive culture where we all feel a sense of belonging and are encouraged to share our perspectives. This inclusive culture is directly connected to our organization's performance and ability to fulfill our mission of delivering affordable credit to those left out of the financial mainstream. We celebrate and nurture our inclusive culture through our employee resource groups.
Position Overview
We are growing our world-class team of mission-driven, entrepreneurial Data Scientists who are passionate about broadening financial inclusion by untapping insights from non-traditional data. Be part of the team responsible for developing and enhancing Oportun's core intellectual property used in scoring risk for underbanked consumers who lack a traditional credit bureau score. In this role you will be on the cutting edge, working with large and diverse alternative data sets (i.e. data from dozens of sources including transactional, mobile, utility, and other financial services) and utilizing machine learning and statistical modeling to build scores and strategies for managing risk, collections/loss mitigation, and fraud. You will also drive growth and optimize marketing spend across channels by leveraging alternative data to help predict which consumers would likely be interested in Oportun's affordable, credit-building loan product.
Responsibilities
Develop data products and machine learning models used in Risk, Fraud, Collections, and portfolio management, and provide a frictionless customer experience for the various products and services Oportun provides.
Build accurate and automated monitoring tools which help us keep a close eye on the performance of models and rules.
Build a model deployment platform which can shorten the time to implement new models.
Build end-to-end reusable pipelines from data acquisition to model output delivery.
Lead initiatives to drive business value from start to finish, including project planning, communication, and stakeholder management.
Lead discussions with Compliance, Bank Partners, and Model Risk Management teams to facilitate Model Governance activities such as model validations and monitoring.
Qualifications
A relentless problem solver and out-of-the-box thinker with a proven track record of driving business results in a timely manner.
Master's degree or PhD in Statistics, Mathematics, Computer Science, Engineering, Economics or another quantitative discipline (Bachelor's degree with significant relevant experience will be considered).
Hands-on experience leveraging machine learning techniques such as Gradient Boosting, Logistic Regression and Neural Networks to solve real-world problems.
3+ years of hands-on experience with data extraction, cleaning, analysis and building reusable data pipelines; proficient in SQL, Spark SQL and/or Hive.
3+ years of experience leveraging modern machine learning toolsets and programming languages such as Python.
Excellent written and oral communication skills.
Strong stakeholder management and project management skills.
Comfortable in a high-growth, fast-paced, agile environment.
Experience working with AWS EMR, SageMaker or other cloud-based platforms is a plus.
Experience with HDFS, Hive, shell scripting and other big data tools is a plus.
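As a purely illustrative companion to the modeling work described above, the sketch below trains and evaluates a gradient boosting classifier on a binary risk outcome with scikit-learn. The data file, feature names, and target column are hypothetical placeholders, not anything specified by the posting.

```python
# Illustrative risk-scoring sketch: gradient boosting on tabular data.
# The CSV path, feature columns, and target column are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Load a prepared modeling dataset (one row per applicant, binary default label).
df = pd.read_csv("modeling_dataset.csv")
features = ["monthly_income", "avg_balance", "utility_payment_ratio", "tenure_months"]
X, y = df[features], df["defaulted"]

# Hold out a test set so the reported AUC reflects out-of-sample performance.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

model = GradientBoostingClassifier(n_estimators=300, learning_rate=0.05, max_depth=3)
model.fit(X_train, y_train)

# Score the holdout set and report discrimination (AUC), a common risk-model metric.
test_scores = model.predict_proba(X_test)[:, 1]
print(f"Holdout AUC: {roc_auc_score(y_test, test_scores):.3f}")
```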
Posted 1 day ago
50.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Gap Inc.
Our past is full of iconic moments — but our future is going to spark many more. Our brands — Gap, Banana Republic, Old Navy and Athleta — have dressed people from all walks of life and all kinds of families, all over the world, for every occasion for more than 50 years. But we're more than the clothes that we make. We know that business can and should be a force for good, and it's why we work hard to make product that makes people feel good, inside and out. It's why we're committed to giving back to the communities where we live and work. If you're one of the super-talented who thrive on change, aren't afraid to take risks and love to make a difference, come grow with us.
About The Role
The Manager of Supplier Management will lead the supplier relationship management function within the Accounts Payable (AP) team. This role is responsible for overseeing and managing the company's supplier base, ensuring timely and accurate vendor information, resolving supplier issues, and optimizing supplier payment processes. The ideal candidate will have a deep understanding of supplier management, AP processes, and strong leadership abilities.
What You'll Do
Supplier Relationship Management:
Develop and maintain strong relationships with key suppliers, ensuring open and effective communication.
Address and resolve supplier issues or disputes regarding invoicing, payments, and terms in a timely and professional manner.
Work closely with suppliers to understand their needs and improve the overall supplier experience.
Supplier Onboarding & Information Management:
Lead the supplier onboarding process, ensuring that all relevant supplier information is gathered, verified, and entered into the system accurately.
Regularly audit and update supplier information to ensure accuracy and compliance.
Collaborate with procurement and legal teams to ensure all contracts and supplier agreements are aligned with company policies.
Accounts Payable Collaboration:
Collaborate with the AP team to ensure seamless processing of supplier invoices and payments, optimizing cash flow and vendor satisfaction.
Oversee the resolution of any discrepancies between suppliers and internal teams (e.g., procurement, finance) to ensure timely payment.
Work closely with AP teams to address supplier inquiries, track payment status, and resolve issues related to invoice processing and payment cycles.
Process Improvement & Efficiency:
Continuously assess and improve supplier management and AP processes to enhance efficiency, reduce errors, and increase automation.
Implement and maintain best practices for managing supplier relationships, including effective communication, issue resolution, and performance metrics.
Identify opportunities for process optimization within the AP team to support a faster, more efficient payment cycle.
Supplier Performance Monitoring:
Develop and implement metrics and KPIs to measure supplier performance, ensuring timely deliveries, adherence to terms, and quality standards.
Track and report on supplier performance, escalating issues when necessary and working with vendors to improve outcomes.
Reporting & Analysis:
Generate regular reports on supplier activity, payment cycles, aging analysis, and discrepancies for senior leadership.
Provide data-driven insights and recommendations to improve supplier management and accounts payable processes.
Compliance & Risk Management:
Ensure all supplier management activities comply with internal controls, accounting standards, and regulatory requirements.
Identify potential risks in supplier relationships and take proactive steps to mitigate them.
Collaboration with Cross-Functional Teams:
Partner with procurement, legal, and treasury teams to ensure that supplier terms, contracts, and relationships align with corporate goals.
Support cross-functional projects that require supplier coordination, such as system upgrades or new process implementation.
Who You Are
Bachelor's degree in Business, Finance, Accounting, or related field.
7+ years of experience in supplier management, accounts payable, or procurement, with at least 3 years in a managerial or leadership role.
Strong knowledge of supplier relationship management, procurement processes, and accounts payable operations.
Experience with ERP systems (e.g., SAP, Oracle, or similar), supplier management software, and advanced Excel skills.
Excellent communication, negotiation, and interpersonal skills, with the ability to manage multiple stakeholder relationships effectively.
Strong analytical skills and the ability to assess and improve processes.
Demonstrated ability to manage a team, mentor and develop talent, and build cross-functional relationships.
Knowledge of compliance regulations, internal controls, and audit processes.
High attention to detail and the ability to work under pressure to meet deadlines in a fast-paced environment.
Benefits at Gap Inc.
One of the most competitive paid time off plans in the industry
Comprehensive health coverage for employees, same-sex partners and their families
Health and wellness program: free annual health check-ups, fitness center and Employee Assistance Program
Comprehensive benefits to support the journey of parenthood
Retirement planning assistance
See more of the benefits we offer.
Gap Inc. is an equal-opportunity employer and is committed to providing a workplace free from harassment and discrimination. We are committed to recruiting, hiring, training and promoting qualified people of all backgrounds, and make all employment decisions without regard to any protected status. We have received numerous awards for our long-held commitment to equality and will continue to foster a diverse and inclusive environment of belonging. In 2022, we were recognized by Forbes as one of the World's Best Employers and one of the Best Employers for Diversity.
Posted 1 day ago
3.0 - 8.0 years
5 - 15 Lacs
Hyderabad
Work from Office
Dear Candidates,
We are conducting a Walk-In Interview in Hyderabad for the position of Data Engineering on 20th/21st/22nd June 2025.
Position: Data Engineering
Job description:
Expert knowledge in AWS Data Lake implementation and support (S3, Glue, DMS, Athena, Lambda, API Gateway, Redshift)
Handling of data-related activities such as data parsing, cleansing, quality definition, data pipelines, storage and ETL scripts
Experience in programming languages Python/PySpark/SQL
Experience with data migration, with hands-on experience
Experience consuming REST APIs using various authentication options within AWS
Lambda architecture: orchestrate triggers, debug and schedule batch jobs using AWS Glue, Lambda and Step Functions
Understanding of AWS security features such as IAM roles and policies
Exposure to DevOps tools
AWS certification is highly preferred
Mandatory skills for Data Engineer: Python/PySpark, AWS Glue, Lambda, Redshift
Date: 20th June 2025 to 22nd June 2025
Time: 9.00 AM to 6.00 PM
Eligibility: Any Graduate
Experience: 2-10 Years
Gender: Any
Interested candidates can walk in directly. For any queries, please contact us at +91 7349369478 / 8555079906
Interview Venue Details:
Selectify Analytics
Address: Capital Park (Jain Sadguru Capital Park), Ayyappa Society, Silicon Valley, Madhapur, Hyderabad, Telangana 500081
Contact Person: Mr. Deepak/Saqeeb/Ravi Kumar
Interview Time: 9.00 AM to 6.00 PM
Contact Number: +91 7349369478 / 8555079906
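As a hedged illustration of the Glue/Lambda orchestration this posting mentions, the sketch below shows a small AWS Lambda handler that starts a Glue job via boto3 when triggered (for example by an S3 event or a Step Functions state). The Glue job name and argument keys are hypothetical placeholders.

```python
# Illustrative Lambda handler that kicks off an AWS Glue batch job via boto3.
# The Glue job name and arguments are hypothetical placeholders.
import json
import boto3

glue = boto3.client("glue")

def lambda_handler(event, context):
    # Derive a run date from the triggering event if present, else use a default.
    run_date = event.get("run_date", "2025-06-20")

    # Start the Glue job; Glue queues and runs it asynchronously.
    response = glue.start_job_run(
        JobName="raw-to-redshift-etl",        # placeholder job name
        Arguments={"--RUN_DATE": run_date},   # passed through to the Glue script
    )

    # Return the run id so Step Functions (or logs) can track the execution.
    return {
        "statusCode": 200,
        "body": json.dumps({"JobRunId": response["JobRunId"]}),
    }
```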
Posted 1 day ago
12.0 years
0 Lacs
Satara, Maharashtra, India
On-site
Join us as a Sourcing Manager in Satara, Maharashtra, to be responsible for managing and developing the local supplier base to support the factory's strategic needs. The role ensures cost-effective, timely, and high-quality supply of materials and services while aligning with regional, product group, and global sourcing strategies.
About The Job
At Alfa Laval, we always go that extra mile to overcome the toughest challenges. Our driving force is to accelerate success for our customers, people and planet. You can only achieve that by having dedicated people with a curious mind. Curiosity is the spark behind great ideas and great ideas drive progress. As a member of our team, you thrive in a truly diverse and inclusive workplace based on care and empowerment. You are here to make a difference. Constantly building bridges to the future with sustainable solutions that have an impact on our planet's most urgent problems. Making the world a better place every day.
About The Position
This position is located in Satara and will report to the Factory and Site Manager, Satara. In this role, the Sourcing Manager's focus is to strengthen and further develop the existing supplier base in line with future capacity, quality, sustainability, and innovation needs. This position will manage the sourcing for the GPHE, LA and WHE departments.
As a part of the team, you will:
Supplier Development & Management (existing supplier base):
Drive continuous development of existing local suppliers to improve performance, competitiveness, and capability.
Identify and implement opportunities for localization of materials or components in alignment with cost and lead-time reduction goals, in line with product group and global sourcing strategies.
Conduct regular supplier reviews and audits to ensure compliance with quality, safety, sustainability, and contractual requirements.
Collaboration and Alignment:
Act as the primary interface between the local factory and regional, product group, and global sourcing teams.
Ensure local sourcing activities align with global category strategies and product group roadmaps.
Participate in cross-functional sourcing and development projects, contributing local market insights and supplier capabilities.
Within the Product Groups, control, encourage, drive and push improvement for purchased material and suppliers (local and global).
Accountable for the Product Groups' handshake process to secure a pipeline of purchasing initiatives, right prioritization and follow-up of execution.
Drive supply optimization for Alfa Laval from the Product Groups' perspective.
Chair weekly Product Group purchasing improvement meetings (Pre-PIM meetings) and secure escalation of deviations to Global Purchasing (PIM) according to process.
Accountable for the Product Groups' requirements during the execution of purchasing projects (global and local).
Actively contribute to the sourcing strategy and commodity strategy to strive for alignment with the Product Groups.
Give input to the operational plans from a sourcing perspective.
Communicate significant changes of forecast to Global Purchasing.
Strategic Sourcing & Cost Management:
Lead local sourcing initiatives and support regional/global negotiations by providing data, supplier insights, and local market intelligence.
Support cost-reduction programs, make-buy analyses, and dual-sourcing strategies.
Monitor and manage local supplier risks and implement mitigation strategies where needed.
Operational Procurement Support:
Collaborate with planning, quality, engineering, and logistics to resolve supplier performance issues.
Ensure timely delivery of goods and services by coordinating closely with internal stakeholders and suppliers.
Full understanding of the sourcing strategy.
Full understanding of the supply chain needs and targets within a Product Group.
Full understanding of the products within the Product Group.
Good understanding of the supplier and material market situation (material prices, competition, risks).
Good understanding of the purchasing process and commercial deals.
Full understanding of material management.
Preferably trained in Green Belt and Supply Development.
What You Know
Bachelor's degree in mechanical or production engineering, supply chain, business administration or a related field.
Total 12+ years' experience, with a minimum of 5-7 years of experience in sourcing or procurement, ideally in a manufacturing or industrial setting.
Proven experience in supplier development and cross-functional collaboration.
Strong negotiation, communication, and analytical skills.
Ability to navigate complex stakeholder networks (local, regional, global).
Fluent in English.
Proactive, results-driven, and hands-on approach.
Strong interpersonal and intercultural communication skills.
Able to work independently while ensuring alignment with broader sourcing teams.
High integrity and commitment to compliance and sustainability standards.
Key Relationships
Product Group Sourcing Managers and the sourcing organisation within Product Groups
Local Supply Chain Managers
Global Sourcing and Commodity Managers (Global Purchasing organisation)
Regional Sourcing Manager
Factory Managers
Physical & Environmental Factors
Office environment with frequent attendance on the shop floor.
Safety equipment required when present on the shop floor: footwear, hearing, eyewear.
Environmental factors (hazardous materials, work location, work surfaces, exposure).
Why Should You Apply
We offer you an interesting and challenging position in an open and friendly environment where we help each other to develop and create value for our customers.
Exciting place to build a global network with different nationalities.
Your work will have a true impact on Alfa Laval's future success; you will be learning new things every day.
"We care about diversity, inclusion and equity in our recruitment processes. We also believe behavioural traits can provide important insights into a candidate's fit to a role. To help us achieve this we apply Pymetrics assessments, and upon application you will be invited to play the assessment games."
Posted 1 day ago
4.0 - 7.0 years
7 - 14 Lacs
Pune, Mumbai (All Areas)
Work from Office
Job Profile Description
Create and maintain highly scalable data pipelines across Azure Data Lake Storage and Azure Synapse using Data Factory, Databricks and Apache Spark/Scala.
Responsible for managing a growing cloud-based data ecosystem and the reliability of our corporate data lake and analytics data mart.
Contribute to the continued evolution of the Corporate Analytics Platform and integrated data model.
Be part of the Data Engineering team in all phases of work, including analysis, design and architecture, to develop and implement cutting-edge solutions.
Negotiate and influence changes outside of the team that continuously shape and improve the data strategy.
4+ years of experience implementing analytics data solutions leveraging Azure Data Factory, Databricks, Logic Apps, ML Studio, Data Lake and Synapse.
Working experience with Scala, Python or R.
Bachelor's degree or equivalent experience in Computer Science, Information Systems, or related disciplines.
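For illustration only, here is a minimal PySpark snippet of the kind of Databricks transformation such a pipeline might run between the data lake and an analytics mart. The storage path, columns, and target table name are hypothetical, and Delta Lake support is assumed to be available (as it is on Databricks).

```python
# Minimal Databricks-style transformation sketch: data lake -> analytics mart.
# Storage paths, columns, and the target table are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_mart_refresh").getOrCreate()

# Read raw order events previously landed in Azure Data Lake Storage by Data Factory.
orders = spark.read.parquet(
    "abfss://lake@examplelake.dfs.core.windows.net/raw/orders/"
)

# Aggregate to a daily, mart-friendly grain.
daily_orders = (
    orders.withColumn("order_date", F.to_date("order_timestamp"))
    .groupBy("order_date", "region")
    .agg(
        F.count("order_id").alias("order_count"),
        F.sum("order_amount").alias("total_amount"),
    )
)

# Persist as a Delta table that Synapse or BI tools can query downstream.
(
    daily_orders.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("analytics.daily_orders")
)
```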
Posted 1 day ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Description
Alimentation Couche-Tard Inc. (ACT) is a global Fortune 200 company. A leader in the convenience store and fuel space with over 17,000 stores in 31 countries, serving more than 6 million customers each day.
It is an exciting time to be a part of the growing Data Engineering team at Circle K. We are driving a well-supported cloud-first strategy to unlock the power of data across the company and help teams discover, value and act on insights from data across the globe. With our strong data pipeline, this position will play a key role partnering with our Technical Development stakeholders to enable analytics for long-term success.
About The Role
We are looking for a Data Engineer with a collaborative, "can-do" attitude who is committed and strives with determination and motivation to make their team successful. A Data Engineer who has experience implementing technical solutions as part of a greater data transformation strategy. This role is responsible for hands-on sourcing, manipulation, and delivery of data from enterprise business systems to the data lake and data warehouse. This role will help drive Circle K's next phase in the digital journey by transforming data to achieve actionable business outcomes.
Roles and Responsibilities
Collaborate with business stakeholders and other technical team members to acquire and migrate data sources that are most relevant to business needs and goals
Demonstrate technical and domain knowledge of relational and non-relational databases, Data Warehouses, data lakes and other structured and unstructured storage options
Determine solutions that are best suited to develop a pipeline for a particular data source
Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development
Efficient in ELT/ETL development using Azure cloud services and Snowflake, including testing and operational support (RCA, monitoring, maintenance)
Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines for scalable analytics delivery
Provide clear documentation for delivered solutions and processes, integrating documentation with the appropriate corporate stakeholders
Identify and implement internal process improvements for data management (automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability)
Stay current with and adopt new tools and applications to ensure high quality and efficient solutions
Build cross-platform data strategy to aggregate multiple sources and process development datasets
Proactive in stakeholder communication; mentor/guide junior resources through regular KT/reverse KT, help them identify production bugs/issues if needed and provide resolution recommendations
Job Requirements
Bachelor's degree in Computer Engineering, Computer Science or related discipline; Master's Degree preferred
3+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional Data Warehousing environment
3+ years of experience with setting up and operating data pipelines using Python or SQL
3+ years of advanced SQL programming: PL/SQL, T-SQL
3+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization
Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads
3+ years of strong and extensive hands-on experience in Azure, preferably data-heavy / analytics applications leveraging relational and NoSQL databases, Data Warehouse and Big Data
3+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure Functions
3+ years of experience in defining and enabling data quality standards for auditing and monitoring
Strong analytical abilities and a strong intellectual curiosity
In-depth knowledge of relational database design, data warehousing and dimensional data modeling concepts
Understanding of REST and good API design
Experience working with Apache Iceberg, Delta tables and distributed computing frameworks
Strong collaboration and teamwork skills, and excellent written and verbal communication skills
Self-starter and motivated with ability to work in a fast-paced development environment
Agile experience highly desirable
Proficiency in the development environment, including IDE, database server, Git, Continuous Integration, unit-testing tools, and defect management tools
Preferred Skills
Strong knowledge of Data Engineering concepts (data pipeline creation, Data Warehousing, Data Marts/Cubes, data reconciliation and audit, data management)
Strong working knowledge of Snowflake, including warehouse management, Snowflake SQL, and data sharing techniques
Experience building pipelines that source from or deliver data into Snowflake in combination with tools like ADF and Databricks
Working knowledge of DevOps processes (CI/CD), Git/Jenkins version control, Master Data Management (MDM) and Data Quality tools
Strong experience in ETL/ELT development, QA and operation/support processes (RCA of production issues, code/data fix strategy, monitoring and maintenance)
Hands-on experience in databases (Azure SQL DB, MySQL, Cosmos DB, etc.), file systems (Blob Storage), and Python/Unix shell scripting
ADF, Databricks and Azure certification is a plus
Technologies we use: Databricks, Azure SQL DW/Synapse, Azure Tabular, Azure Data Factory, Azure Functions, Azure Containers, Docker, DevOps, Python, PySpark, Scripting (PowerShell, Bash), Git, Terraform, Power BI, Snowflake
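As an illustrative complement to the data quality and auditing responsibilities listed above, the sketch below shows a simple row-count and null-rate reconciliation check in PySpark. The table names, column, and thresholds are hypothetical placeholders, not anything prescribed by the role.

```python
# Illustrative data quality / audit checks for a loaded table.
# Table names, columns, and thresholds are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_checks").getOrCreate()

source = spark.read.parquet("abfss://raw@examplestorage.dfs.core.windows.net/customers/")
target = spark.table("analytics.customers")

# Check 1: row-count reconciliation between the source extract and the loaded table.
source_count, target_count = source.count(), target.count()
if source_count != target_count:
    raise ValueError(f"Row count mismatch: source={source_count}, target={target_count}")

# Check 2: null rate on a key business column must stay under a threshold.
null_rate = (
    target.select(F.avg(F.col("customer_id").isNull().cast("double")).alias("rate"))
    .first()["rate"]
)
if null_rate > 0.01:  # 1% threshold, purely illustrative
    raise ValueError(f"customer_id null rate too high: {null_rate:.2%}")

print("Data quality checks passed")
```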
Posted 1 day ago
1.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job description
We are looking for a Senior Social Media Executive who will be responsible for managing, strategizing, and optimizing our social media presence across various platforms. The ideal candidate should have hands-on experience in content creation, community engagement, performance analysis, and campaign management to drive brand awareness and engagement.
Key Responsibilities:
Social Media Strategy & Execution:
Develop and execute social media strategies to enhance brand visibility and engagement.
Manage and optimize social media calendars, ensuring timely and engaging content.
Content Creation & Management:
Create, curate, and manage high-quality content (text, images, videos, and reels) tailored for each platform.
Collaborate with designers, copywriters, and video editors to produce engaging social media content.
Community Engagement:
Monitor and respond to audience comments, messages, and reviews to maintain a strong brand presence.
Engage with influencers, industry professionals, and relevant communities to enhance brand positioning.
Performance Tracking & Analytics:
Monitor key metrics (engagement, reach, impressions, follower growth, etc.) using tools like Meta Business Suite, Google Analytics, and other social media analytics platforms.
Provide insights and recommendations for content and campaign optimization based on data analysis.
Paid Social Media Campaigns:
Assist in strategizing and managing paid ad campaigns on Meta (Facebook & Instagram), LinkedIn, YouTube, and other platforms.
Coordinate with the performance marketing team to track campaign performance and suggest improvements.
Trend Monitoring & Innovation:
Stay updated with the latest social media trends, platform updates, and industry best practices.
Experiment with new content formats and features (Reels, Stories, Lives, Polls, etc.) to drive engagement.
Requirements & Qualifications:
Minimum 1 year of hands-on experience in social media management and execution.
Strong understanding of platforms like Facebook, Instagram, LinkedIn, Twitter, YouTube, and emerging channels.
Proficiency in social media tools like Hootsuite, Buffer, Canva, Later, and Meta Business Suite.
Basic knowledge of social media ads and paid campaigns.
Excellent written and verbal communication skills.
Creative mindset with a keen eye for design and aesthetics.
Ability to multitask, work under tight deadlines, and adapt to evolving trends.
Preferred Qualifications:
Experience in handling social media for brands in real estate, fashion, lifestyle, or B2B sectors is a plus.
Knowledge of SEO for social media content.
Basic video editing and graphic design skills (using Canva, Adobe Spark, or Photoshop).
Perks & Benefits:
Opportunity to work with a dynamic and creative team.
Growth opportunities within the organization.
Exposure to various industries and projects.
Job Types: Full-time, Permanent
Pay: ₹20,000.00 - ₹25,000.00 per month
Drop your resume at hr@osumare.in or WhatsApp your resume to 9604153943
Posted 1 day ago
2.0 years
0 Lacs
Gurugram, Haryana, India
Remote
IB English Faculty (DP – Grades 9 to 12)
📍 Location: Gurgaon (1st month onsite) → then Work From Home
💰 Salary: ₹7–8 LPA
📅 Work Days: 6 days/week
🕐 Experience: 1–2 years
🎓 Education: Must have BA & MA in English (Honours only)
Not Another English Class. A Sparkl-ing Experience.
Do you love teaching literature that makes teenagers think, not just memorize? Do you dream of taking students from Shakespeare to Arundhati Roy with purpose and passion? If yes, Sparkl is looking for you!
We're hiring an IB English Faculty for DP (Grades 9–12) — someone who brings strong academic grounding, school-teaching experience, and that extra spark that makes stories come alive.
Who We're Looking For:
✅ You must have taught English Literature in a formal school or tuition center (CBSE, ICSE, Cambridge, or IB preferred).
✅ You've handled school curriculum (not vocational/entrance prep like SAT, TOEFL, SSC, CAT, etc.).
✅ You have a Bachelor's + Master's degree in English Honours — no exceptions.
✅ You know how to explain literary devices, build essay-writing skills, and get teens talking about theme, tone, and character arcs.
✅ You're confident, clear, and love working with high-schoolers.
What You'll Be Doing:
📚 Teach IB DP English for Grades 9–12 (focus on literature, writing, comprehension).
📝 Guide students through critical analysis, essay structuring, and academic writing.
📖 Bring texts alive — from Shakespeare to modern prose — in ways students will remember.
🏢 Begin with 1 month of in-person training at our Gurgaon office, then shift to remote work.
Why Join Sparkl?
✨ Work with top mentors in the IB space
✨ Teach smart, curious, high-performing students
✨ Young, passionate team and a flexible work environment
✨ Real impact — real growth
Posted 1 day ago
2.0 years
0 Lacs
Gurugram, Haryana, India
Remote
IB Physics Faculty (MYP + DP)
📍 Location: Gurgaon (1st month onsite) → then Work From Home
💰 Salary: ₹7–8 LPA
🕒 6 days/week | Immediate Joiners Preferred
Physics = Fun. Who Knew? (You Did.)
If you can turn Newton's laws into a Netflix-worthy explanation, and you genuinely love helping teens get "the point" of Physics — then we want you at Sparkl.
We're looking for a young IB Physics Educator to teach both MYP & DP — someone who can go from talking atoms to astrophysics, and make it fun.
The Role Includes:
🔬 Teaching IB Physics to students in Grades 6–12 (MYP & DP)
🧲 Creating energy in the virtual classroom — minus the resistance
🧪 Using experiments, analogies, and storytelling to explain tough concepts
🏢 Starting your journey with 1 month of training in Gurgaon, then fully remote
You Should Be Someone Who:
✅ Has 1–2 years of teaching or tutoring experience (IB/IGCSE a plus)
✅ Holds a graduate/postgraduate degree in Physics
✅ Communicates clearly, creatively, and confidently in English
✅ Cares deeply about student learning (not just the syllabus)
Why Work With Sparkl?
⚡ Young and fun team, serious about learning
🌎 Teach ambitious, globally-minded students
🧠 Mentorship and training that actually helps you grow
🏡 Work-from-home flexibility after initial onboarding
🌟 Don't just teach Physics — spark a love for it. Apply today!
Posted 1 day ago
1.0 - 3.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Job Description
Job Title – PROJECT CO-ORDINATOR
__________________________________
About JLL:
We're JLL—a leading professional services and investment management firm specializing in real estate. We have operations in over 80 countries and a workforce of over 91,000 individuals around the world who help real estate owners, occupiers and investors achieve their business ambitions. As a global Fortune 500 company, we also have an inherent responsibility to drive sustainability and corporate social responsibility. That's why we're committed to our purpose to shape the future of real estate for a better world.
We're using the most advanced technology to create rewarding opportunities, amazing spaces and sustainable real estate solutions for our clients, our people and our communities. Our core values of teamwork, ethics and excellence are also fundamental to everything we do, and we're honored to be recognized with awards for our success by organizations both globally and locally.
Creating a diverse and inclusive culture where we all feel welcomed, valued and empowered to achieve our full potential is important to who we are today and where we're headed in the future. And we know that unique backgrounds, experiences, and perspectives help us think bigger, spark innovation and succeed together.
If this job description resonates with you, we encourage you to apply even if you don't meet all of the requirements below. We're interested in getting to know you and what you bring to the table!
__________________________________
Responsibilities:
Prepare project management reports and meeting minutes
Manage all project documentation including contracts, budgets and schedules
Maintain best-practices templates on the SharePoint site
Administrative duties including but not limited to: copying, coordinating travel arrangements, expense report preparation, organizing lunches, WebEx meetings, etc.
Manage accounts receivables according to the guidelines and requirements set by the Facilities Manager, Operations Manager, or project team
Ensure that all accounts receivables are maintained at a level not to exceed the planned working capital charge as set by corporate finance, the project team and/or the Regional Operations Manager
Assist the local team in meeting target financial numbers as determined on a yearly basis by the Management Executive Committee
Proactively manage project-related issues on an account or assignment
Demonstrate proficiency in the use and application of all project management
Prepare PowerPoint presentations, memos, responses to proposals and research
Actively collaborate with stakeholders and leverage platform support
Assist with client communication, conferences, and events
Maintain all files and documents related to the project assignment
Any and all other duties and tasks assigned
Requirements/Qualifications:
Bachelor's degree from an accredited institution required
1-3 years of experience working in a similar role
Detail-oriented and organized; must have the ability to proactively plan for multiple projects at a time
Strong communication skills, both written and oral
Proficient with Microsoft programs such as PowerPoint, Word, Outlook, etc.
Must be a self-starter, able to start and complete projects independently
Proactive – does not wait for tasks to be asked but always prompts to identify what else can be done
Customer Focus – dedicated to meeting the expectations and requirements of the external and internal customer; acts with the customer in mind; establishes and maintains effective relationships with customers and gains their trust and respect
Dealing with Ambiguity – can effectively cope with change, can shift gears comfortably, can decide and act without having the total picture
Interpersonal Savvy – relates well to all kinds of people, inside and outside the organization; uses diplomacy and tact
Posted 1 day ago
The demand for professionals with expertise in Spark is on the rise in India. Spark, an open-source distributed computing system, is widely used for big data processing and analytics. Job seekers in India looking to explore opportunities in Spark can find a variety of roles in different industries.
These cities have a high concentration of tech companies and startups actively hiring for Spark roles.
The average salary range for Spark professionals in India varies based on experience level:
- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-25 lakhs per annum
Salaries may vary based on the company, location, and specific job requirements.
In the field of Spark, a typical career progression may look like:
- Junior Developer
- Senior Developer
- Tech Lead
- Architect
Advancing in this career path often requires gaining experience, acquiring additional skills, and taking on more responsibilities.
Apart from proficiency in Spark, professionals in this field are often expected to have knowledge or experience in:
- Hadoop
- Java or Scala programming
- Data processing and analytics
- SQL databases
Having a combination of these skills can make a candidate more competitive in the job market.
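To make the Spark-plus-SQL combination above concrete, here is a small, self-contained PySpark example that registers a DataFrame as a temporary view and queries it with Spark SQL; the data is invented purely for illustration.

```python
# Tiny self-contained PySpark + Spark SQL example (illustrative data only).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("spark_sql_demo").getOrCreate()

# Create a small in-memory DataFrame.
sales = spark.createDataFrame(
    [("Bengaluru", 1200.0), ("Pune", 800.0), ("Bengaluru", 450.0), ("Chennai", 300.0)],
    ["city", "amount"],
)

# Register it as a temporary view so it can be queried with plain SQL.
sales.createOrReplaceTempView("sales")

# Use Spark SQL for the aggregation; the same engine runs DataFrame and SQL code.
totals = spark.sql(
    "SELECT city, SUM(amount) AS total_amount FROM sales GROUP BY city ORDER BY total_amount DESC"
)
totals.show()
```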
As you explore opportunities in Spark jobs in India, remember to prepare thoroughly for interviews and showcase your expertise confidently. With the right skills and knowledge, you can excel in this growing field and advance your career in the tech industry. Good luck with your job search!