5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Talent Worx is thrilled to announce an exciting opportunity for the roles of Snowflake and Spark Developers! Join us in revolutionizing the data analytics landscape as we partner with one of the Big 4 firms in India.

What impact will you make? Your contributions will play a vital role in shaping our clients' success stories by utilizing innovative technologies and frameworks. Envision a dynamic culture that supports inclusion, collaboration, and exceptional performance. With us, you will discover unrivaled opportunities to accelerate your career and achieve your goals.

The Team
In our Analytics & Cognitive (A&C) practice, you will find a dedicated team committed to unlocking the value hidden within large datasets. Our globally connected network ensures that our clients gain actionable insights that support fact-driven decision-making, leveraging advanced techniques including big data, cloud computing, cognitive capabilities, and machine learning.

Work you will do
As a key player in our organization, you will contribute directly to enhancing our clients' competitive positioning and performance with innovative and sustainable solutions. We expect you to collaborate closely with our teams and clients to deliver outstanding results across various projects.
Requirements
- 5+ years of relevant experience in Spark and Snowflake, with practical experience in at least one project implementation
- Strong experience in developing ETL pipelines and data processing workflows using Spark
- Expertise in Snowflake architecture, including data loading and unloading processes, table structures, and virtual warehouses
- Proficiency in writing complex SQL queries in Snowflake for data transformation and analysis
- Experience with data integration tools and techniques, ensuring the seamless ingestion of data
- Familiarity with building and monitoring data pipelines in a cloud environment
- Exposure to Agile methodology and tools like Jira and Confluence
- Strong analytical and problem-solving skills, with meticulous attention to detail
- Excellent communication and interpersonal skills to foster collaboration with clients and team members
- Ability to travel as required by project demands

Qualifications
- Snowflake certification or equivalent qualification is a plus
- Prior experience working with both Snowflake and Spark in a corporate setting
- Formal education in Computer Science, Information Technology, or a related field
- Proven track record of working with cross-functional teams

Benefits
- Work with one of the Big 4 firms in India
- Healthy work environment
- Work-life balance
Posted 5 days ago
7.0 - 10.0 years
0 Lacs
Chandigarh
On-site
bebo Technologies is a leading complete software solution provider. bebo stands for 'be extension be offshore'. We are a business partner of QASource, Inc., USA (www.QASource.com). We offer outstanding services in the areas of software development, sustenance engineering, quality assurance, and product support. bebo is dedicated to providing high-caliber offshore software services and solutions. Our goal is to 'Deliver in time, every time'. For more details, visit our website: www.bebotechnologies.com. For a 360-degree tour of our bebo premises, click the link below: https://www.youtube.com/watch?v=S1Bgm07dPmM

Key Required Skills:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- 7-10 years of industry experience, with at least 5 years in machine learning roles.
- Advanced proficiency in Python and common ML libraries: TensorFlow, PyTorch, Scikit-learn.
- Experience with distributed training, model optimization (quantization, pruning), and inference at scale.
- Hands-on experience with cloud ML platforms: AWS (SageMaker), GCP (Vertex AI), or Azure ML.
- Familiarity with MLOps tooling: MLflow, TFX, Airflow, or Kubeflow; and data engineering frameworks like Spark, dbt, or Apache Beam.
- Strong grasp of CI/CD for ML, model governance, and post-deployment monitoring (e.g., data drift, model decay).
- Excellent problem-solving, communication, and documentation skills.
Posted 5 days ago
5.0 years
0 Lacs
India
On-site
Job Title: Senior Machine Learning Engineer (Azure ML + Databricks + MLOps)
Experience: 5+ years in AI/ML Engineering
Employment Type: Full-Time

Job Summary: We are looking for a Senior Machine Learning Engineer with strong expertise in Azure Machine Learning and Databricks to lead the development and deployment of scalable AI/ML solutions. You'll work with cross-functional teams to design, build, and optimize machine learning pipelines that power critical business functions.

Key Responsibilities:
- Design, build, and deploy scalable machine learning models using Azure Machine Learning (Azure ML) and Databricks.
- Develop and maintain end-to-end ML pipelines for training, validation, and deployment.
- Collaborate with data engineers and architects to structure data pipelines on Azure Data Lake, Synapse, or Delta Lake.
- Integrate models into production environments using Azure ML endpoints, MLflow, or REST APIs.
- Monitor and maintain deployed models, ensuring performance and reliability over time.
- Use Databricks notebooks and PySpark to process and analyze large-scale datasets.
- Apply MLOps principles using tools like Azure DevOps, CI/CD pipelines, and MLflow for versioning and reproducibility.
- Ensure compliance with data governance, security, and responsible AI practices.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field.
- 5+ years of hands-on experience in machine learning or data science roles.
- Strong proficiency in Python, and experience with libraries like Scikit-learn, XGBoost, PyTorch, or TensorFlow.
- Deep experience with Azure Machine Learning services (e.g., workspaces, compute clusters, pipelines).
- Proficient in Databricks, including Spark (PySpark), notebooks, and Delta Lake.
- Strong understanding of MLOps, experiment tracking, model management, and deployment automation.
- Experience with data engineering tools (e.g., Azure Data Factory, Azure Data Lake, Azure Synapse).
Preferred Skills:
- Azure certifications (e.g., Azure AI Engineer Associate, Azure Data Scientist Associate).
- Familiarity with Kubernetes, Docker, and container-based deployments.
- Experience working with structured and unstructured data (NLP, time series, image data, etc.).
- Knowledge of cost optimization, security best practices, and scalability on Azure.
- Experience with A/B testing, monitoring model drift, and real-time inference.

Job Types: Full-time, Permanent
Benefits:
- Flexible schedule
- Paid sick time
- Paid time off
- Provident Fund
Work Location: In person
Posted 5 days ago
2.0 - 3.0 years
0 Lacs
Telangana
On-site
Role: ML Engineer (Associate / Senior)
Experience: 2-3 years (Associate); 4-5 years (Senior)
Mandatory Skills: Python / MLOps / Docker and Kubernetes / FastAPI or Flask / CI/CD / Jenkins / Spark / SQL / RDB / Cosmos / Kafka / ADLS / API / Databricks
Other Skills: Azure / LLMOps / ADF / ETL
Location: Bangalore
Notice Period: less than 60 days

Job Description: We are seeking a talented and passionate Machine Learning Engineer to join our team and play a pivotal role in developing and deploying cutting-edge machine learning solutions. You will work closely with other engineers and data scientists to bring machine learning models from proof of concept to production, ensuring they deliver real-world impact and solve critical business challenges.

- Collaborate with data scientists, model developers, software engineers, and other stakeholders to translate business needs into technical solutions.
- Experience deploying ML models to production.
- Create high-performance real-time inferencing APIs and batch inferencing pipelines to serve ML models to stakeholders.
- Integrate machine learning models seamlessly into existing production systems.
- Continuously monitor and evaluate model performance, and retrain models automatically or periodically.
- Streamline existing ML pipelines to increase throughput.
- Proactively identify and address security vulnerabilities in existing applications.
- Design, develop, and implement machine learning models, preferably for insurance-related applications.
- Well versed with the Azure ecosystem.
- Knowledge of NLP and Generative AI techniques; relevant experience will be a plus.
- Knowledge of machine learning algorithms and libraries (e.g., TensorFlow, PyTorch) will be a plus.
- Stay up to date on the latest advancements in machine learning and contribute to ongoing innovation within the team.
Posted 5 days ago
3.0 years
6 - 8 Lacs
Hyderābād
On-site
Welcome to Warner Bros. Discovery… the stuff dreams are made of.

Who We Are…
When we say, “the stuff dreams are made of,” we’re not just referring to the world of wizards, dragons and superheroes, or even to the wonders of Planet Earth. Behind WBD’s vast portfolio of iconic content and beloved brands, are the storytellers bringing our characters to life, the creators bringing them to your living rooms and the dreamers creating what’s next… From brilliant creatives to technology trailblazers across the globe, WBD offers career-defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best selves. Here you are supported, here you are celebrated, here you can thrive.

Your New Role:
As the Staff, Solutions Architect (AI/ML), you will play a pivotal role in driving the adoption of AI and Machine Learning solutions within one of the world’s largest Media & Entertainment companies. Based in India, you will lead the design and implementation of innovative AI/ML architectures that transform how we create, distribute, and monetize content across global markets. Your expertise will help bridge the gap between complex business challenges and advanced technical solutions, ensuring AI initiatives deliver real business value. This is a unique opportunity to work at the intersection of creativity and technology, where you’ll lead the development of intelligent solutions for personalized viewer experiences, optimized content workflows, and data-driven decision-making. If you’re passionate about leveraging AI/ML to revolutionize the Media & Entertainment industry and thrive in a dynamic, collaborative environment, this role is for you.

1. AI/ML Solution Design and Development
- Design and develop scalable AI/ML solutions tailored to address business challenges such as audience analytics, content personalization, and ad optimization.
- Lead the end-to-end architecture of AI/ML platforms, ensuring seamless integration with existing systems and data pipelines.
- Collaborate with data scientists, engineers, and business stakeholders to convert models into production-ready solutions.
- Evaluate and select appropriate AI/ML frameworks, technologies, and tools to meet project requirements.
- Ensure AI solutions are optimized for performance, scalability, and reliability across diverse use cases.

2. Technical Leadership and Innovation
- Provide technical leadership for AI/ML projects, guiding teams through solution architecture, development, and deployment.
- Stay abreast of emerging AI/ML trends and technologies to introduce innovative solutions and best practices.
- Lead proof-of-concept (POC) initiatives to validate new AI capabilities and demonstrate their potential impact.
- Promote a culture of innovation within the team, encouraging experimentation with cutting-edge AI/ML techniques.
- Mentor junior architects and engineers to build a strong pipeline of AI/ML talent.

3. Collaboration and Stakeholder Engagement
- Act as a key interface between business units, technical teams, and senior leadership to align AI/ML solutions with organizational goals.
- Translate business requirements into technical specifications, ensuring clarity and feasibility.
- Collaborate with cross-functional teams to prioritize and execute AI/ML projects that deliver the highest business impact.
- Communicate the value and progress of AI/ML initiatives to non-technical stakeholders through clear, compelling narratives.
- Foster strong relationships with external partners, including technology vendors and academic institutions, to drive innovation.

4. AI Governance and Risk Management
- Implement best practices for AI/ML governance, including model explainability, accountability, and ethical use.
- Ensure AI solutions comply with data privacy regulations and internal security protocols.
- Proactively identify and mitigate risks associated with AI/ML implementations, such as bias, overfitting, or data quality issues.
- Develop monitoring frameworks to track model performance and retrain models as necessary to maintain effectiveness.
- Establish guidelines and documentation for AI/ML processes, ensuring consistency and transparency.

5. Scalability and Continuous Improvement
- Architect solutions that are modular and scalable, capable of supporting future business growth and technological evolution.
- Regularly review and optimize existing AI/ML systems for improved performance and cost-efficiency.
- Establish feedback loops to capture learnings from deployed solutions and inform future enhancements.
- Identify opportunities for automation and operational efficiency using AI/ML.
- Lead initiatives to streamline workflows and reduce time-to-market for AI/ML projects.

Qualifications & Experiences:

Academic Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Data Science, Artificial Intelligence, or a related technical discipline.
- Specialized certifications in AI/ML (e.g., Google Cloud AI Engineer, AWS Machine Learning Specialty) are a plus.

Professional Experience:
- 8+ years of experience in AI/ML solution architecture, with at least 3 years in a leadership role.
- Proven track record of designing and deploying AI/ML solutions in enterprise-scale environments, preferably within Media & Entertainment or a similar industry.
- Hands-on experience with AI/ML frameworks (e.g., TensorFlow, PyTorch, Scikit-learn) and cloud platforms (AWS, Azure, GCP).
- Strong expertise in building and optimizing data pipelines, model deployment workflows, and MLOps practices.
- Experience in implementing AI use cases like recommendation systems, natural language processing (NLP), and computer vision.

Technical Skills:
- Proficiency in programming languages like Python, Java, or R.
- Expertise in big data technologies (e.g., Spark, Hadoop) and database systems (SQL, NoSQL).
- Solid understanding of microservices architecture and APIs for AI model integration.
- Advanced knowledge of AI model lifecycle management, from training to deployment and monitoring.
- Familiarity with visualization tools (e.g., Tableau, Power BI) to present AI-driven insights.

Soft Skills:
- Exceptional problem-solving and critical-thinking abilities.
- Strong communication skills with the ability to articulate technical concepts to non-technical audiences.
- Collaborative mindset with the ability to work effectively in cross-functional teams.
- Leadership qualities, including mentoring and team development.
- High adaptability to a fast-paced and dynamic work environment.

How We Get Things Done…
This last bit is probably the most important! Here at WBD, our guiding principles are the core values by which we operate and are central to how we get things done. You can find them at www.wbd.com/guiding-principles/ along with some insights from the team on what they mean and how they show up in their day to day. We hope they resonate with you and look forward to discussing them during your interview.

Championing Inclusion at WBD
Warner Bros. Discovery embraces the opportunity to build a workforce that reflects a wide array of perspectives, backgrounds and experiences. Being an equal opportunity employer means that we take seriously our responsibility to consider qualified candidates on the basis of merit, regardless of sex, gender identity, ethnicity, age, sexual orientation, religion or belief, marital status, pregnancy, parenthood, disability or any other category protected by law. If you’re a qualified candidate with a disability and you require adjustments or accommodations during the job application and/or recruitment process, please visit our accessibility page for instructions to submit your request.
Posted 5 days ago
12.0 - 15.0 years
2 - 4 Lacs
Hyderābād
Remote
Join Amgen's Mission to Serve Patients

If you feel like you’re part of something bigger, it’s because you are. At Amgen, our shared mission—to serve patients—drives all that we do. It is key to our becoming one of the world’s leading biotechnology companies. We are global collaborators who achieve together—researching, manufacturing, and delivering ever-better products that reach over 10 million patients worldwide. It’s time for a career you can be proud of.

Principal IS Architect

Live

What you will do
Let’s do this. Let’s change the world. In this vital role, we are seeking a visionary and technically exceptional Principal IS Architect to lead the design and development of enterprise-wide intelligent search solutions. This is a senior-level IT professional who designs and oversees the implementation of robust and scalable data and AI solutions, often utilizing the Java programming language and related technologies. The role requires a strong understanding of both data architecture principles and AI/ML concepts, along with expertise in Java development and cloud platforms. You’ll lead by example—mentoring engineers, setting standards, and driving the technical vision for our next-generation search capabilities. This person will also be responsible for defining the roadmap for products. They will work closely with development teams and act as a bridge between product owners and development teams to perform proofs of concept on proposed designs and technologies, develop reusable components, etc.
This is a senior role in the organization which, along with a team of other architects, will help design the future state of technology at Amgen India.

- Design and Strategy: Develop and maintain foundational architecture for data and AI initiatives, define the technical roadmap, and translate business requirements into technical specifications.
- Data Architecture: Design and implement data models, database designs, and ETL processes, and lead the design of scalable data architectures. The role also includes establishing best practices for data management and ensuring data security and compliance.
- AI Architecture and Implementation: Architect and oversee the implementation of AI/ML frameworks and solutions, potentially with a focus on generative AI models, and define processes for AI/ML development and MLOps.
- Develop end-to-end solution architectures for data-driven and AI-focused applications, ensuring alignment with business objectives and technology strategy.
- Lead architecture design efforts across data pipelines, machine learning models, AI applications, and analytics platforms in our Gap Data Platform area.
- Collaborate closely with business partners, product managers, data scientists, software engineers, and the broader Global Technology Solutions teams in vetting solution designs and delivering business value.
- Provide technical leadership and mentoring in data engineering and AI best practices.
- Evaluate and recommend emerging data technologies, AI techniques, and cloud services to enhance business capabilities.
- Ensure the scalability, performance, and security of data and AI architectures.
- Establish and maintain architectural standards, including patterns and guidelines for data and AI projects.
- Create architecture artifacts (concept, system, and data architecture) for data and AI projects and initiatives.
- Create and oversee an architecture center of excellence for the data and AI area to coach and mentor resources working in this area.
- Set technical direction, best practices, and coding standards for search engineering across the organization.
- Review designs, mentor senior and mid-level engineers, and champion architecture decisions aligned with product goals and compliance needs.
- Own performance, scalability, observability, and reliability of search services in production.
- Resolve technical problems as they arise.
- Provide technical guidance and mentorship to junior developers.
- Continually research current and emerging technologies and propose changes where needed.
- Assess the business impact of technical choices.
- Provide updates to stakeholders on product development processes, costs, and budgets.
- Work closely with Information Technology professionals within the company to ensure hardware is available for projects and working properly.
- Work closely with project management teams to successfully monitor progress of initiatives.
- Maintain a current understanding of best practices regarding system security measures.
- Bring a positive outlook in meeting challenges and working to a high level.
- Apply an advanced understanding of business analysis techniques and processes.
- Account for possible project challenges on constraints including risks, time, resources, and scope.
- Possess strong rapid prototyping skills and quickly translate concepts into working code.
- Take ownership of complex software projects from conception to deployment.
- Manage software delivery scope, risk, and timeline.
- Participate in both front-end and back-end development using cloud technology.
- Develop innovative solutions using generative AI technologies.
- Define and implement robust software architectures on the cloud, AWS preferred.
- Conduct code reviews to ensure code quality and alignment to best practices.
- Create and maintain documentation on software architecture, design, deployment, disaster recovery, and operations.
- Identify and resolve technical challenges effectively.
- Stay updated with the latest trends and advancements.
- Work closely with the product team, business team, and other key partners.

Basic Qualifications:
- Master’s degree in computer science & engineering preferred, with 12-15 years of software development experience; OR Bachelor’s degree in computer science & engineering preferred, with 11-15 years of software development experience.
- Minimum of 7 years of professional experience in technology, including at least 3 years in a data architecture and AI solution architect role.
- Strong expertise in cloud platforms, preferably Azure and GCP, and associated data and AI services.
- Proven experience in architecting and deploying scalable data solutions, including data lakes, warehouses, and streaming platforms.
- Working knowledge of tools/technologies like Azure Data Factory, Confluent Kafka, Spark, Databricks, BigQuery, and Vertex AI.
- Deep understanding of AI/ML frameworks and tools such as TensorFlow, PyTorch, Spark ML, or Azure ML.

Preferred Qualifications:
- Proficiency in multiple programming languages (e.g., Python, Java, or Scala) is essential.
- Experience with API integration, serverless, and microservices architecture.
- Proficiency with Azure Data Factory, Confluent Kafka, Spark, Databricks, BigQuery, and Vertex AI.
- Proficiency with AI/ML frameworks and tools such as TensorFlow, PyTorch, Spark ML, or Azure ML.
- Solid understanding of data governance, security, privacy, and compliance standards.
- Exceptional communication, presentation, and stakeholder management skills.
- Experience working in agile project environments.

Good to Have Skills:
- Willingness to work on AI applications.
- Experience with popular large language models.
- Experience with the LangChain or LlamaIndex frameworks for language models.
- Experience with prompt engineering and model fine-tuning.
- Knowledge of NLP techniques for text analysis and sentiment analysis.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, remote teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Strong presentation and public speaking skills.

Thrive

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for our teammates’ professional and personal growth and well-being. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now for a career that defies imagination
In our quest to serve patients above all else, Amgen is the first to imagine, and the last to doubt. Join us. careers.amgen.com

Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
Posted 5 days ago
0 years
4 - 10 Lacs
Hyderābād
On-site
India
Information Technology (IT)
Group Functions
Job Reference # 323207BR
City: Hyderabad
Job Type: Full Time

Your role
Are you innovative and passionate about building secure and reliable solutions? We are looking for senior hands-on tech engineers specializing in DevSecOps, Data Engineering, or Full-Stack web development to lead a team of engineers and provide technical direction in building firmwide Data Observability components on Azure. We are open to adapting the role to suit your career aspirations and skillset.

Responsibilities include:
- Design/document, develop, review, test, release, and support Data Observability components, platforms, and environments.
- Initiate engineering improvements.
- Contribute to agile ceremonies, e.g. daily stand-ups, backlog refinement, iteration planning, iteration reviews, and retrospectives.
- Comply with the firm’s applicable policies and processes.
- Collaborate with other teams and divisions using Data Observability services, related guilds, and other Data Mesh Services teams.
- Ensure delivery deadlines are met.
- Act as the point of escalation for the team to ensure the removal of blockers.
- Provide leadership and training.

Your team
You will be part of a diverse global team consisting of data scientists, data engineers, full-stack developers, DevSecOps engineers, and knowledge engineers within Group CTO, working primarily in a local team with some interactions with other teams and divisions. We are providing Data Observability services as part of our firmwide Data Mesh strategy to automate and scale data management, improve time-to-market for data, and reduce data downtime. We provide learning opportunities and a varied technology landscape. Technologies include Azure Cloud, AI (ML and GenAI models), web user interfaces (React), data storage (Postgres), REST APIs, Kafka, Great Expectations, and ontology models.

Your expertise
Experience in the following (or similar transferable skills):
- Learning and implementing industry best practices, e.g. finding ways to optimize solutions architecture.
- Track record of innovation, showcasing engineering or related improvements.
- Leading and creating effective teams.
- Hands-on delivery in any of the following (or related): full-stack web development (e.g. React, APIs), CI/CD pipelines, security risk mitigation, infrastructure as code (e.g. Terraform), monitoring, Azure development using AKS/ACI/ACA, data transformations, Spark, Python, and database design and development in any database.
- Agile software practices and tools, performance testing, and unit and integration testing.
- Identifying root causes and designing and implementing solutions.
- Managing infrastructure and environments, e.g. Kafka, Spark, web servers, databases, etc.
- Collaborating with other teams to achieve common goals.
- Learning and reskilling in new technologies.

About us
UBS is the world’s largest and the only truly global wealth manager. We operate through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management and the Investment Bank. Our global reach and the breadth of our expertise set us apart from our competitors. We have a presence in all major financial centers in more than 50 countries.

How we hire
We may request you to complete one or more assessments during the application process. Learn more

Join us
At UBS, we know that it's our people, with their diverse skills, experiences and backgrounds, who drive our ongoing success. We’re dedicated to our craft and passionate about putting our people first, with new challenges, a supportive team, opportunities to grow and flexible working options when possible. Our inclusive culture brings out the best in our employees, wherever they are on their career journey. We also recognize that great work is never done alone. That’s why collaboration is at the heart of everything we do. Because together, we’re more than ourselves.
We’re committed to disability inclusion and if you need reasonable accommodation/adjustments throughout our recruitment process, you can always contact us. Contact Details UBS Business Solutions SA UBS Recruiting Disclaimer / Policy statements UBS is an Equal Opportunity Employer. We respect and seek to empower each individual and support the diverse cultures, perspectives, skills and experiences within our workforce.
Posted 5 days ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
We are seeking a passionate and experienced Full Stack AI/ML Engineer with a strong background in machine learning and a drive for building intelligent systems. As a Full-Stack AI/ML Engineer on the Ford Pro Charging team, you will design, build, and ship intelligent services that power our global EV-charging platform. If you love turning data into real-world impact and thrive on end-to-end ownership—from research notebooks to production APIs—this is your playground.

Responsibilities
- Design & Develop AI Solutions: Lead the design, development, training, and evaluation of machine learning models and AI solutions across various domains to enhance our products and services.
- Identify AI Opportunities: Proactively identify and explore opportunities to apply data-driven solutions to improve existing products, optimize internal processes, and create new value propositions.
- Model Implementation & Optimization: Implement, optimize, and deploy various machine learning algorithms and deep learning architectures to solve complex problems.
- Data Management & Engineering: Collaborate with data engineers to ensure robust data collection, preprocessing, feature engineering, and pipeline development for effective model training and performance.
- Backend Integration: Design and implement robust APIs and services to integrate AI/ML models and solutions seamlessly into our existing backend infrastructure, ensuring scalability, reliability, and maintainability.
- Performance Monitoring & Improvement: Continuously monitor, evaluate, and fine-tune the performance, accuracy, and efficiency of deployed AI/ML models and systems.
- Research & Innovation: Stay abreast of the latest advancements in AI, ML, and relevant technologies, and propose innovative solutions to push the boundaries of our product capabilities.
- Testing & Deployment: Participate in the rigorous testing, deployment, and ongoing maintenance of AI/ML solutions in production environments.
Qualifications Required Skills & Qualifications: Experience: 2+ years of professional experience in Artificial Intelligence, Machine Learning, or Data Science roles, with a proven track record of delivering production-grade AI/ML solutions (or equivalent demonstrable expertise). Technical Expertise: Proficiency in Python and strong experience with core AI/ML libraries and frameworks (e.g., TensorFlow, PyTorch, scikit-learn, Hugging Face Transformers). Solid grasp of various machine learning algorithms (supervised, unsupervised, reinforcement learning) and deep learning architectures. Demonstrated experience applying machine learning to complex datasets, including structured and unstructured data. Proficient in API design (REST, GraphQL), microservices, and database design (SQL/NoSQL); production experience on at least one major cloud (AWS, Azure, or GCP). Practical knowledge of Docker, Kubernetes, and CI/CD pipelines (GitHub Actions, Argo, or similar). Problem-Solving: Excellent analytical and problem-solving skills, with proven ability to break down complex problems into iterative experiments and devise effective, scalable AI/ML solutions Enthusiasm & Learning: A genuine passion for technology, coupled with a self-driven commitment to continuous learning and mastery of new techniques. We value individuals who proactively identify challenges, conceptualize solutions, and lead ideation and innovation, beyond mere task execution Communication: Strong communication skills to articulate complex technical concepts to both technical and non-technical stakeholders. Education: Bachelor's or master's degree in computer science, Artificial Intelligence, Machine Learning, or a related quantitative field. Bonus Points: Domain expertise in EV charging, smart-grid, or energy-management systems. Experience with distributed data technologies (Spark, Flink, Kafka Streams). Contributions to open-source ML projects or peer-reviewed publications. 
Knowledge of ethical and responsible AI frameworks, including bias detection and model explainability.
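The "Performance Monitoring & Improvement" responsibility above usually starts with offline evaluation of a deployed classifier on a labelled sample. A minimal, library-free sketch of that step (function name and data are illustrative, not part of the Ford Pro stack):

```python
# Illustrative offline evaluation helper for monitoring a binary classifier.

def classification_metrics(y_true, y_pred, positive=1):
    """Compute precision, recall and F1 for a binary classifier."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return {"precision": precision, "recall": recall, "f1": f1}

if __name__ == "__main__":
    truth = [1, 0, 1, 1, 0, 1]
    preds = [1, 0, 0, 1, 1, 1]
    print(classification_metrics(truth, preds))
    # → {'precision': 0.75, 'recall': 0.75, 'f1': 0.75}
```

In production these numbers would be tracked over time per model version, so a drop flags the model for re-training.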
Posted 5 days ago
2.0 - 4.0 years
6 - 9 Lacs
Hyderābād
On-site
Summary As a Data Analyst, you will be responsible for designing, developing, and maintaining efficient and scalable data pipelines for data ingestion, transformation, and storage. About the Role Location – Hyderabad #LI-Hybrid As a Data Analyst, you will be responsible for designing, developing, and maintaining efficient and scalable data pipelines for data ingestion, transformation, and storage. Key Responsibilities: Design, develop, and maintain efficient and scalable data pipelines for data ingestion, transformation, and storage. Collaborate with cross-functional teams, including data analysts, business analysts, and BI, to understand data requirements and design appropriate solutions. Build and maintain data infrastructure in the cloud, ensuring high availability, scalability, and security. Write clean, efficient, and reusable code in scripting languages, such as Python or Scala, to automate data workflows and ETL processes. Implement real-time and batch data processing solutions using streaming technologies like Apache Kafka, Apache Flink, or Apache Spark. Perform data quality checks and ensure data integrity across different data sources and systems. Optimize data pipelines for performance and efficiency, identifying and resolving bottlenecks and performance issues. Collaborate with DevOps teams to deploy, automate, and maintain data platforms and tools. Stay up to date with industry trends, best practices, and emerging technologies in data engineering, scripting, streaming data, and cloud technologies. Essential Requirements: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field, with overall experience of 2-4 years. Proven experience as a Data Engineer or similar role, with a focus on scripting, streaming data pipelines, and cloud technologies like AWS, GCP, or Azure. Strong programming and scripting skills in languages like Python, Scala, or SQL.
Experience with cloud-based data technologies, such as AWS, Azure, or Google Cloud Platform. Hands-on experience with streaming technologies, such as AWS Streamsets, Apache Kafka, Apache Flink, or Apache Spark Streaming. Strong experience with Snowflake (Required). Proficiency in working with big data frameworks and tools, such as Hadoop, Hive, or HBase. Knowledge of SQL and experience with relational and NoSQL databases. Familiarity with data modelling and schema design principles. Strong problem-solving skills and the ability to work in a fast-paced, collaborative environment. Excellent communication and teamwork skills. Commitment to Diversity and Inclusion: Novartis is committed to building an outstanding, inclusive work environment and diverse teams representative of the patients and communities we serve. Accessibility and accommodation: Novartis is committed to working with and providing reasonable accommodation to individuals with disabilities. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the recruitment process, or in order to perform the essential functions of a position, please send an e-mail to diversityandincl.india@novartis.com and let us know the nature of your request and your contact information. Please include the job requisition number in your message. Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients’ lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture Join our Novartis Network: Not the right Novartis role for you?
Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network Benefits and Rewards: Read our handbook to learn about all the ways we’ll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards Division US Business Unit Universal Hierarchy Node Location India Site Hyderabad (Office) Company / Legal Entity IN10 (FCRS = IN010) Novartis Healthcare Private Limited Functional Area Marketing Job Type Full time Employment Type Regular Shift Work No
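The "data quality checks" responsibility in this posting can be pictured as a gate that partitions incoming records into valid and rejected sets before loading. A minimal sketch; the field names and rules are illustrative assumptions, not Novartis's pipeline:

```python
# Illustrative data-quality gate: records failing required-field or not-null
# rules are quarantined rather than loaded downstream.

def check_records(records, required=("id",), not_null=()):
    """Split dict records into (valid, rejected) by simple completeness rules."""
    valid, rejected = [], []
    for rec in records:
        missing = [f for f in required if f not in rec]             # field absent
        nulls = [f for f in not_null if rec.get(f) in (None, "")]   # present but empty
        if missing or nulls:
            rejected.append({"record": rec, "missing": missing, "nulls": nulls})
        else:
            valid.append(rec)
    return valid, rejected

if __name__ == "__main__":
    rows = [{"id": 1, "amount": 10}, {"amount": 5}, {"id": 3, "amount": None}]
    ok, bad = check_records(rows, required=("id", "amount"), not_null=("amount",))
    print(len(ok), len(bad))  # → 1 2
```

In a real pipeline the rejected set would be written to a quarantine table with the failure reasons for later reconciliation.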
Posted 5 days ago
3.0 years
6 - 7 Lacs
Hyderābād
On-site
Job Title: Data Engineer Total Experience: 3+ Years Location: Hyderabad Job Type: Contract Work Mode: On-site Notice Period: Immediate to 15 Days Work Timings: Monday to Friday, 10 am to 7 pm (IST) Interview Process Level 1: HR Screening (Personality Assessment) Level 2: Technical Round Level 3: Final Round (Note: The interview levels may vary) Company Overview Compileinfy Technology Solutions Pvt. Ltd. is a fast-growing IT services and consulting company delivering tailored digital solutions across industries. At Compileinfy, we promote a culture of ownership, critical thinking, and technological excellence. Job Summary We are seeking a highly motivated Data Engineer to join our expanding Data & AI team. This role offers the opportunity to design and develop robust, scalable data pipelines and infrastructure, ensuring the delivery of high-quality, timely, and accessible data throughout the organization. As a Data Engineer, you will collaborate across teams to build and optimize data solutions that support analytics, reporting, and business operations. The ideal candidate combines deep technical expertise, strong communication, and a drive for continuous improvement. Who You Are: Experienced in designing and building data pipelines for ingestion, transformation, and loading (ETL/ELT) of data from diverse sources to data warehouses or lakes. Proficient in SQL and at least one programming language, such as Python, Java, or Scala. Skilled at working with both relational databases (e.g., PostgreSQL, MySQL) and big data platforms (e.g., Hadoop, Spark, Hive, EMR). Competent in cloud environments (AWS, GCP, Azure), data lake, and data warehouse solutions. Comfortable optimizing and managing the quality, reliability, and timeliness of data flows. Ability to translate business requirements into technical specifications and collaborate effectively with stakeholders, including data scientists, analysts, and engineers. 
Detail-oriented, with strong documentation skills and a commitment to data governance, security, and compliance. Proactive, agile, and adaptable to a fast-paced environment with evolving business needs. What You Will Do: Design, build, and manage scalable ETL/ELT pipelines to ingest, transform, and deliver data efficiently from diverse sources to centralized repositories such as lakes or warehouses. Implement validation, monitoring, and cleansing procedures to ensure data consistency, integrity, and adherence to organizational standards. Develop and maintain efficient database architectures, optimize data storage, and streamline data integration flows for business intelligence and analytics. Work closely with data scientists, analysts, and business users to gather requirements and deliver tailored data solutions supporting business objectives. Document data models, dictionaries, pipeline architectures, and data flows to ensure transparency and knowledge sharing. Implement and enforce data security and privacy measures, ensuring compliance with regulatory requirements and best practices. Monitor, troubleshoot, and resolve issues in data pipelines and infrastructure to maintain high availability and performance. Preferred Qualifications: Bachelor’s or higher degree in Computer Science, Information Technology, Engineering, or a related field. 3-4 years of experience in data engineering, ETL development, or related areas. Strong SQL and data modeling expertise with hands-on experience in data warehousing or business intelligence projects. Familiarity with AWS data integration tools (e.g., Glue, Athena), messaging/streaming platforms (e.g., Kafka, AWS MSK), and big data tools (Spark, Databricks). Proficiency with version control, testing, and deployment tools for maintaining code and ensuring best practices. Experience in managing data security, quality, and operational support in a production environment.
What You Deliver Comprehensive data delivery documentation (data dictionary, mapping documents, models). Optimized, reliable data pipelines and infrastructure supporting the organization’s analytics and reporting needs. Operations support and timely resolution of data-related issues aligned with service level agreements. Interdependencies / Internal Engagement Actively engage with cross-functional teams to align on requirements, resolve issues, and drive improvements in data delivery, architecture, and business impact. Become a trusted partner in fostering a data-centric culture and ensuring the long-term scalability and integrity of our data ecosystem Why Join Us? At Compileinfy, we value innovation, collaboration, and professional growth. You'll have the opportunity to work on exciting, high-impact projects and be part of a team that embraces cutting-edge technologies. We provide continuous learning and career advancement opportunities in a dynamic, inclusive environment. Perks and Benefits Competitive salary and benefits package Flexible work environment Opportunities for professional development and training A supportive and collaborative team culture Application Process Submit your resume with the subject line: “Data Engineer Application – [Your Name]” to recruitmentdesk@compileinfy.com Job Types: Full-time, Contractual / Temporary Contract length: 12 months Pay: ₹600,000.00 - ₹700,000.00 per year Benefits: Health insurance Provident Fund Work Location: In person
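The ETL/ELT pipeline work this role centres on can be reduced to a toy extract-transform-load pass. In this sketch SQLite stands in for the warehouse, and the table schema and sample data are illustrative assumptions:

```python
# Toy ETL: extract CSV text, transform (cast + trim), load into a SQL table.
import csv
import io
import sqlite3

RAW = "order_id,amount\n1, 10.5 \n2,3.25\n"  # illustrative source extract

def run_pipeline(raw_csv, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL)")
    rows = [(int(r["order_id"]), float(r["amount"].strip()))   # transform: cast + trim
            for r in csv.DictReader(io.StringIO(raw_csv))]
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)  # load
    conn.commit()
    return conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()

if __name__ == "__main__":
    print(run_pipeline(RAW, sqlite3.connect(":memory:")))  # → (2, 13.75)
```

A production version adds the validation, monitoring, and documentation steps the posting lists, but the extract-transform-load skeleton is the same.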
Posted 5 days ago
0 years
7 - 9 Lacs
Hyderābād
On-site
India Information Technology (IT) Group Functions Job Reference # 322748BR City Hyderabad, Pune Job Type Full Time Your role Are you innovative and passionate about building secure and reliable solutions? We are looking for Tech Engineers specializing in either DevSecOps, Data Engineering or Full-Stack web development to join our team in building firmwide Data Observability Components on Azure. We are open to adapting the role suited to your career aspirations and skillset. Responsibilities include: Design/document, develop, review, test, release, support Data Observability components/platforms/environments. Contribute to agile ceremonies e.g. daily stand-ups, backlog refinement, iteration planning, iteration reviews, retrospectives. Comply with the firm’s applicable policies and processes. Collaborate with other teams and divisions using Data Observability services, related guilds and other Data Mesh Services teams. Ensure delivery deadlines are met. Your team You will be part of a diverse global team consisting of data scientists, data engineers, full-stack developers, DevSecOps engineers and knowledge engineers within Group CTO working primarily in a local team with some interactions with other teams and divisions. We are providing Data Observability services as part of our firmwide Data Mesh strategy to automate and scale data management to improve time-to-market for data and reduce data downtime. We provide learning opportunities and a varied technology landscape. Technologies include Azure Cloud, AI (ML and GenAI models), web user interface (React), data storage (Postgres, Azure), REST APIs, Kafka, Great Expectations, ontology models. Your expertise Experience in the following (or similar transferrable skills): Hands-on delivery in any of the following (or related): full-stack web development (e.g. 
React, APIs), data transformations, Spark, python, database design and development in any database, CI/CD pipelines, security risk mitigation, infrastructure as code (e.g. Terraform), monitoring, Azure development. Agile software practices and tools, performance testing, unit and integration testing. Identifying root-causes and designing and implementing the solution. Collaborating with other teams to achieve common goals. Learning and reskilling in new technologies. About us UBS is the world’s largest and the only truly global wealth manager. We operate through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management and the Investment Bank. Our global reach and the breadth of our expertise set us apart from our competitors. We have a presence in all major financial centers in more than 50 countries. How we hire We may request you to complete one or more assessments during the application process. Learn more Join us At UBS, we know that it's our people, with their diverse skills, experiences and backgrounds, who drive our ongoing success. We’re dedicated to our craft and passionate about putting our people first, with new challenges, a supportive team, opportunities to grow and flexible working options when possible. Our inclusive culture brings out the best in our employees, wherever they are on their career journey. We also recognize that great work is never done alone. That’s why collaboration is at the heart of everything we do. Because together, we’re more than ourselves. We’re committed to disability inclusion and if you need reasonable accommodation/adjustments throughout our recruitment process, you can always contact us. Disclaimer / Policy statements UBS is an Equal Opportunity Employer. We respect and seek to empower each individual and support the diverse cultures, perspectives, skills and experiences within our workforce.
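One concrete slice of the Data Observability goal described above ("reduce data downtime") is a per-dataset freshness check. A minimal sketch; the six-hour threshold and the function name are illustrative assumptions, not UBS's implementation:

```python
# Illustrative freshness probe: compare a dataset's last successful load time
# against an allowed lag and flag it STALE when the lag is exceeded.
from datetime import datetime, timedelta, timezone

def freshness_status(last_loaded_at, max_lag=timedelta(hours=6), now=None):
    """Return ('FRESH' or 'STALE', lag) for a dataset's last load timestamp."""
    now = now or datetime.now(timezone.utc)
    lag = now - last_loaded_at
    return ("STALE" if lag > max_lag else "FRESH", lag)

if __name__ == "__main__":
    now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
    loaded = datetime(2024, 1, 1, 3, 0, tzinfo=timezone.utc)
    print(freshness_status(loaded, now=now)[0])  # → STALE (9h lag > 6h allowed)
```

An observability platform would run a check like this on a schedule per dataset and alert on STALE, alongside volume and schema checks (the role's mention of Great Expectations covers the latter family).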
Posted 5 days ago
0 years
0 Lacs
Hyderābād
On-site
India Information Technology (IT) Group Functions Job Reference # 322747BR City Hyderabad Job Type Full Time Your role Are you innovative and passionate about building secure and reliable solutions? We are looking for Tech Engineers specializing in either DevSecOps, Data Engineering or Full-Stack web development to join our team in building firmwide Data Observability Components on Azure. We are open to adapting the role suited to your career aspirations and skillset. Responsibilities include: Design/document, develop, review, test, release, support Data Observability components/platforms/environments. Contribute to agile ceremonies e.g. daily stand-ups, backlog refinement, iteration planning, iteration reviews, retrospectives. Comply with the firm’s applicable policies and processes. Collaborate with other teams and divisions using Data Observability services, related guilds and other Data Mesh Services teams. Ensure delivery deadlines are met. Your team You will be part of a diverse global team consisting of data scientists, data engineers, full-stack developers, DevSecOps engineers and knowledge engineers within Group CTO working primarily in a local team with some interactions with other teams and divisions. We are providing Data Observability services as part of our firmwide Data Mesh strategy to automate and scale data management to improve time-to-market for data and reduce data downtime. We provide learning opportunities and a varied technology landscape. Technologies include Azure Cloud, AI (ML and GenAI models), web user interface (React), data storage (Postgres, Azure), REST APIs, Kafka, Great Expectations, ontology models. Your expertise Experience in the following (or similar transferrable skills): Hands-on delivery in any of the following (or related): full-stack web development (e.g. 
React, APIs), data transformations, Spark, python, database design and development in any database, CI/CD pipelines, security risk mitigation, infrastructure as code (e.g. Terraform), monitoring, Azure development. Agile software practices and tools, performance testing, unit and integration testing. Identifying root-causes and designing and implementing the solution. Collaborating with other teams to achieve common goals. Learning and reskilling in new technologies. About us UBS is the world’s largest and the only truly global wealth manager. We operate through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management and the Investment Bank. Our global reach and the breadth of our expertise set us apart from our competitors. We have a presence in all major financial centers in more than 50 countries. How we hire We may request you to complete one or more assessments during the application process. Learn more Join us At UBS, we know that it's our people, with their diverse skills, experiences and backgrounds, who drive our ongoing success. We’re dedicated to our craft and passionate about putting our people first, with new challenges, a supportive team, opportunities to grow and flexible working options when possible. Our inclusive culture brings out the best in our employees, wherever they are on their career journey. We also recognize that great work is never done alone. That’s why collaboration is at the heart of everything we do. Because together, we’re more than ourselves. We’re committed to disability inclusion and if you need reasonable accommodation/adjustments throughout our recruitment process, you can always contact us. Disclaimer / Policy statements UBS is an Equal Opportunity Employer. We respect and seek to empower each individual and support the diverse cultures, perspectives, skills and experiences within our workforce.
Posted 5 days ago
0 years
0 Lacs
Hyderābād
On-site
India Information Technology (IT) Group Functions Job Reference # 322746BR City Hyderabad, Pune Job Type Full Time Your role Are you innovative and passionate about building secure and reliable solutions? We are looking for Data Engineers and DevSecOps Engineers to join our team in building the Enterprise Data Mesh at UBS. We are open to adapting the role suited to your career aspirations and skillset. Responsibilities include: Design/document, develop, review, test, release, support Data Mesh components/platforms/environments. Contribute to agile ceremonies e.g. daily stand-ups, backlog refinement, iteration planning, iteration reviews, retrospectives. Comply with the firm’s applicable policies and processes. Collaborate with other teams and divisions using Data Mesh services, related guilds and other Data Mesh Services teams. Ensure delivery deadlines are met. Your team You will be part of a diverse global team consisting of data scientists, data engineers, full-stack developers, DevSecOps engineers and knowledge engineers within Group CTO working primarily in a local team with some interactions with other teams and divisions. We are providing many services as part of our Data Mesh strategy firmwide to automate and scale data management to improve time-to-market for data and reduce data downtime. We provide learning opportunities and a varied technology landscape. Technologies include Azure Cloud, AI (ML and GenAI models), web user interface (React), data storage (Postgres, Azure), REST APIs, Kafka, Great Expectations, ontology models. Your expertise Experience in the following (or similar transferrable skills): Hands-on delivery in any of the following (or related): data transformations, Spark, python, database design and development in any database, CI/CD pipelines, security risk mitigation, infrastructure as code (e.g. Terraform), monitoring, Azure development. Agile software practices and tools, performance testing, unit and integration testing. 
Identifying root-causes and designing and implementing the solution. Collaborating with other teams to achieve common goals. Learning and reskilling in new technologies. About us UBS is the world’s largest and the only truly global wealth manager. We operate through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management and the Investment Bank. Our global reach and the breadth of our expertise set us apart from our competitors. We have a presence in all major financial centers in more than 50 countries. How we hire We may request you to complete one or more assessments during the application process. Learn more Join us At UBS, we know that it's our people, with their diverse skills, experiences and backgrounds, who drive our ongoing success. We’re dedicated to our craft and passionate about putting our people first, with new challenges, a supportive team, opportunities to grow and flexible working options when possible. Our inclusive culture brings out the best in our employees, wherever they are on their career journey. We also recognize that great work is never done alone. That’s why collaboration is at the heart of everything we do. Because together, we’re more than ourselves. We’re committed to disability inclusion and if you need reasonable accommodation/adjustments throughout our recruitment process, you can always contact us. Disclaimer / Policy statements UBS is an Equal Opportunity Employer. We respect and seek to empower each individual and support the diverse cultures, perspectives, skills and experiences within our workforce.
Posted 5 days ago
1.0 - 4.0 years
6 - 9 Lacs
Hyderābād
On-site
Job description Some careers have more impact than others. If you’re looking for a career where you can make a real impression, join HSBC and discover how valued you’ll be. HSBC is one of the largest banking and financial services organizations in the world, with operations in 62 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of DECISION SCIENCE JUNIOR ANALYST Principal responsibilities To support the Business by providing vital input for strategic planning by the senior management which enables effective decision making along with addressing unforeseen challenges. The team leverages the best of data and analytics capabilities to enable smarter decisions and drive profitable growth. The team supports various domains ranging from Regulatory, Operations, Procurement, Human Resources, and Financial Crime Risk. It provides support to various business groups and the job involves data analysis, model and strategy development & implementation, Business Intelligence, reporting and data management. The team addresses a range of business problems which cover areas of business growth, improving customer experience, limiting risk exposure, capital quantification, enhancing internal business processes, etc. Proactively identify key emerging compliance risks across all RC categories and interface appropriately with other RC teams and senior management. To provide greater understanding of the potential impact and associated consequences / failings of significant new or emerging risks, and provide innovative and effective solutions based on SME knowledge that assist the Business / Function. Proposing, managing and tracking the resolution of subsequent risk management actions.
Lead cross-functional projects using advanced data modelling and analysis techniques to discover insights that will guide strategic decisions and uncover optimization opportunities. Against this period of considerable regulatory change and development, and as regulators develop their own understanding of compliance risk management, the role holder must maintain a strong knowledge and understanding of regulatory development and the evolution of the compliance risk framework, risk appetite and risk assessment methodology. Deliver repeatable and scalable analytics through the semi-automation of L1 Financial Crime Risk and Regulatory Compliance Risk Assurance controls testing. Here, Compliance Assurance will develop and run analytics on data sets which will contain personal information such as customer and employee data. Requirements Bachelor’s degree from a reputed university in statistics, economics or any other quantitative field. Freshers with an educational background relevant to Data Science, or certified in Data Science courses. 1-4 years of experience in the field of Automation & Analytics. Worked on a Proof of Concept or case study solving complex business problems using data. Strong analytical skills with business analysis experience or equivalent. Basic knowledge and understanding of financial services / banking operations is good to have. Delivery focused, demonstrating an ability to work under pressure and within tight deadlines. Basic knowledge of working in Python and other data science tools, and in visualization tools such as QlikSense or other visualization tools. Experience in SQL/ETL tools is an added advantage.
Understanding of big data tools (Teradata, Hadoop, etc.) and adopting cloud technologies like GCP/AWS/Azure is good to have. Experience in data science and other machine learning algorithms (e.g., Regression, Classification) is an added advantage. Basic knowledge of Data Engineering skills – building data pipelines using modern tools / libraries (Spark or similar). You’ll achieve more at HSBC. HSBC is an equal opportunity employer committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. We encourage applications from all suitably qualified persons irrespective of, but not limited to, their gender or genetic information, sexual orientation, ethnicity, religion, social status, medical care leave requirements, political affiliation, people with disabilities, color, national origin, veteran status, etc. We consider all applications based on merit and suitability to the role. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. ***Issued By HSBC Electronic Data Processing (India) Private LTD***
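The "Regression" item in the requirements above, in its simplest form, is ordinary least squares on a single feature. A stdlib-only sketch with illustrative data:

```python
# From-scratch simple linear regression (ordinary least squares, one feature).

def fit_line(xs, ys):
    """Fit y = a*x + b by least squares on paired samples; return (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)              # variance term
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))  # covariance term
    a = sxy / sxx
    return a, my - a * mx

if __name__ == "__main__":
    print(fit_line([1, 2, 3, 4], [3, 5, 7, 9]))  # data lies exactly on y = 2x + 1
```

Real analytics work would reach for a library implementation with diagnostics, but the closed-form slope/intercept above is the underlying computation.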
Posted 5 days ago
5.0 years
15 - 20 Lacs
Thiruvananthapuram
On-site
Job Description: Designation: Senior Full Stack Developer (Python + Angular + GCP/AWS/Azure) Qualification: Any UG / PG Degree / Computer / Engineering Graduates Experience: Min. 5+ Years Gender: Male / Female Job Location: Trivandrum / Kochi (KERALA) Job Type: Full Time | Day Shift | Permanent Job | Sat & Sun Week Off Working Time: 12:01 PM to 9:00 PM Project: European client | Shift: Mid Shift (12:01 PM to 9:00 PM) | WFO Salary: Rs.15,00,000 to 20,00,000 LPA Introduction We are looking for a Senior Full Stack (Python & Angular) Developer who will take ownership of building and maintaining complex backend systems, APIs, and applications using Python, with Angular on the frontend. Profiles with BFSI / payment-system integration experience are desired. Responsibilities include: Design, develop, and maintain backend applications, APIs, and services using Python. Write clean, maintainable, and scalable code following industry standards and best practices. Optimize application performance and ensure high availability and scalability. Review code and mentor junior developers to ensure code quality and foster knowledge sharing. Implement unit and integration tests to ensure application robustness. Set up and manage CI/CD pipelines using tools like Jenkins, GitLab CI, or CircleCI. Collaborate with DevOps to deploy applications on cloud platforms, preferably Google Cloud Platform (GCP). Design and build cloud-native applications using APIs, containers, and Kubernetes. Leverage GCP services to develop scalable and efficient solutions. Ensure application security, manage access controls, and comply with data privacy regulations. Work closely with frontend developers, DevOps engineers, and product managers for seamless project delivery. Design, manage, and optimize relational and NoSQL databases (PostgreSQL, MySQL, MongoDB). Monitor application performance using tools like Prometheus, Grafana, or Datadog. Build dynamic, responsive UIs using Angular and JavaScript.
Develop and maintain reusable Angular components in collaboration with UX/UI teams. Primary Skills: 5+ years of experience as a Python developer, with a focus on product development (BE + FE development). Hands-on experience in AngularJS. Proven experience in designing and deploying scalable applications and microservices. App integration experience is preferred. Python – FastAPI (Flask/Django) API Development (RESTful Services) Cloud Platforms – Google Cloud Platform (GCP) preferred. Familiarity with database management systems – PostgreSQL, MySQL, MongoDB – and ORMs (e.g., SQLAlchemy, Django ORM). Knowledge of CI/CD pipelines – Jenkins, GitLab CI, CircleCI Frontend Development – JavaScript, Angular Code Versioning – Git Testing – Unit & Integration Testing Strong understanding of security principles, authentication (OAuth2, JWT), and data protection. Secondary Skills: Monitoring Tools – Prometheus, Grafana, Datadog Security and Compliance Standards – GDPR, PCI, SOC 2 DevOps Collaboration UX/UI Collaboration for Angular components Experience with asynchronous programming (e.g., asyncio, aiohttp). Experience with big data technologies like Spark or Hadoop. Experience with machine learning libraries (e.g., TensorFlow, PyTorch) is a plus. Job Types: Full-time, Permanent Pay: ₹1,500,000.00 - ₹2,000,000.00 per year Benefits: Health insurance Leave encashment Paid sick time Paid time off Provident Fund Schedule: Day shift Monday to Friday Supplemental Pay: Performance bonus Yearly bonus Work Location: In person
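The authentication requirement in this posting (OAuth2, JWT) rests on signed tokens. The core sign/verify idea, sketched with the stdlib only (HMAC-SHA256, the same primitive JWT's HS256 uses); the secret and claims are illustrative, and a real service would use a vetted JWT library rather than hand-rolled tokens:

```python
# Minimal signed-token sketch: sign a claims payload with HMAC-SHA256 and
# verify it on each request, rejecting tampered tokens.
import base64
import hashlib
import hmac
import json

SECRET = b"demo-secret"  # illustrative; real secrets come from a vault/KMS

def sign(claims: dict) -> str:
    """Encode claims and append an HMAC-SHA256 signature."""
    body = base64.urlsafe_b64encode(json.dumps(claims, sort_keys=True).encode()).decode()
    sig = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    return f"{body}.{sig}"

def verify(token: str) -> dict:
    """Return the claims if the signature matches; raise ValueError otherwise."""
    body, sig = token.rsplit(".", 1)
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):  # constant-time comparison
        raise ValueError("bad signature")
    return json.loads(base64.urlsafe_b64decode(body))

if __name__ == "__main__":
    token = sign({"sub": "user-42", "role": "dev"})
    print(verify(token))
```

`hmac.compare_digest` is used instead of `==` so the comparison does not leak timing information about how many signature characters matched.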
Posted 5 days ago
7.0 years
6 - 9 Lacs
Thiruvananthapuram
On-site
7 - 9 Years | 2 Openings | Trivandrum

Role description: Senior Data Engineer – Azure/Snowflake Migration

Key Responsibilities
- Design and develop scalable data pipelines using Snowflake as the primary data platform, integrating with tools like Azure Data Factory, Synapse Analytics, and AWS services.
- Build robust, efficient SQL and Python-based data transformations for cleansing, enrichment, and integration of large-scale datasets.
- Lead migration initiatives from AWS-based data platforms to a Snowflake-centered architecture, including:
  - Rebuilding AWS Glue pipelines in Azure Data Factory or using Snowflake-native ELT approaches.
  - Migrating EMR Spark jobs to Snowflake SQL or Python-based pipelines.
  - Migrating Redshift workloads to Snowflake with schema conversion and performance optimization.
  - Transitioning S3-based data lakes (Hudi, Hive) to Snowflake external tables via ADLS Gen2 or Azure Blob Storage.
  - Redirecting Kinesis/MSK streaming data to Azure Event Hubs, followed by ingestion into Snowflake using Streams & Tasks or Snowpipe.
- Support database migrations from AWS RDS (Aurora PostgreSQL, MySQL, Oracle) to Snowflake, focusing on schema translation, compatibility handling, and data movement at scale.
- Design modern Snowflake lakehouse-style architectures that incorporate raw, staging, and curated zones, with support for time travel, cloning, zero-copy restore, and data sharing.
- Integrate Azure Functions or Logic Apps with Snowflake for orchestration and trigger-based automation.
- Implement security best practices, including Azure Key Vault integration and Snowflake role-based access control, data masking, and network policies.
- Optimize Snowflake performance and costs using clustering, multi-cluster warehouses, materialized views, and result caching.
- Support CI/CD processes for Snowflake pipelines using Git, Azure DevOps or GitHub Actions, and SQL code versioning.
- Maintain well-documented data engineering workflows, architecture diagrams, and technical documentation to support collaboration and long-term platform maintainability.

Required Qualifications
- 7+ years of data engineering experience, with 3+ years on the Microsoft Azure stack and hands-on Snowflake expertise.
- Proficiency in:
  - Python for scripting and ETL orchestration
  - SQL for complex data transformation and performance tuning in Snowflake
  - Azure Data Factory and Synapse Analytics (SQL Pools)
- Experience in migrating workloads from AWS to Azure/Snowflake, including services such as Glue, EMR, Redshift, Lambda, Kinesis, S3, and MSK.
- Strong understanding of cloud architecture and hybrid data environments across AWS and Azure.
- Hands-on experience with database migration, schema conversion, and tuning in PostgreSQL, MySQL, and Oracle RDS.
- Familiarity with Azure Event Hubs, Logic Apps, and Key Vault.
- Working knowledge of CI/CD, version control (Git), and DevOps principles applied to data engineering workloads.

Preferred Qualifications
- Extensive experience with Snowflake Streams, Tasks, Snowpipe, external tables, and data sharing.
- Exposure to MSK-to-Event Hubs migration and streaming data integration into Snowflake.
- Familiarity with Terraform or ARM templates for Infrastructure-as-Code (IaC) in Azure environments.
- Certification such as SnowPro Core, Azure Data Engineer Associate, or equivalent.

Skills: AWS, Azure Data Lake, Python

About UST
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world’s best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients’ organizations.
With over 30,000 employees in 30 countries, UST builds for boundless impact—touching billions of lives in the process.
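The "cleansing, enrichment, and integration" work described in this role can be illustrated with a small, self-contained Python sketch. All field names and rules below are hypothetical, not taken from the posting; in practice the same logic would run as a Snowflake SQL or Snowpark transformation:

```python
from datetime import datetime

# Hypothetical raw feed: duplicated keys, padded strings, bad values
RAW = [
    {"order_id": "A-1", "amount": " 120.50", "country": "in", "ts": "2024-05-01T10:00:00"},
    {"order_id": "A-1", "amount": " 120.50", "country": "in", "ts": "2024-05-01T10:00:00"},
    {"order_id": "A-2", "amount": "bad",     "country": "US", "ts": "2024-05-02T11:30:00"},
]

def cleanse(rows):
    seen, out = set(), []
    for r in rows:
        if r["order_id"] in seen:
            continue  # drop duplicates on the business key
        seen.add(r["order_id"])
        try:
            amount = float(r["amount"].strip())
        except ValueError:
            continue  # quarantine rows with unparseable amounts
        out.append({
            "order_id": r["order_id"],
            "amount": amount,
            "country": r["country"].upper(),  # standardise country codes
            "event_date": datetime.fromisoformat(r["ts"]).date().isoformat(),
        })
    return out
```

The curated output would then land in a staging zone before promotion to the curated layer described above.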
Posted 5 days ago
9.0 years
5 - 10 Lacs
Thiruvananthapuram
On-site
9 - 12 Years | 1 Opening | Trivandrum

Role description: Tech Lead – Azure/Snowflake & AWS Migration

Key Responsibilities
- Design and develop scalable data pipelines using Snowflake as the primary data platform, integrating with tools like Azure Data Factory, Synapse Analytics, and AWS services.
- Build robust, efficient SQL and Python-based data transformations for cleansing, enrichment, and integration of large-scale datasets.
- Lead migration initiatives from AWS-based data platforms to a Snowflake-centered architecture, including:
  - Rebuilding AWS Glue pipelines in Azure Data Factory or using Snowflake-native ELT approaches.
  - Migrating EMR Spark jobs to Snowflake SQL or Python-based pipelines.
  - Migrating Redshift workloads to Snowflake with schema conversion and performance optimization.
  - Transitioning S3-based data lakes (Hudi, Hive) to Snowflake external tables via ADLS Gen2 or Azure Blob Storage.
  - Redirecting Kinesis/MSK streaming data to Azure Event Hubs, followed by ingestion into Snowflake using Streams & Tasks or Snowpipe.
- Support database migrations from AWS RDS (Aurora PostgreSQL, MySQL, Oracle) to Snowflake, focusing on schema translation, compatibility handling, and data movement at scale.
- Design modern Snowflake lakehouse-style architectures that incorporate raw, staging, and curated zones, with support for time travel, cloning, zero-copy restore, and data sharing.
- Integrate Azure Functions or Logic Apps with Snowflake for orchestration and trigger-based automation.
- Implement security best practices, including Azure Key Vault integration and Snowflake role-based access control, data masking, and network policies.
- Optimize Snowflake performance and costs using clustering, multi-cluster warehouses, materialized views, and result caching.
- Support CI/CD processes for Snowflake pipelines using Git, Azure DevOps or GitHub Actions, and SQL code versioning.
- Maintain well-documented data engineering workflows, architecture diagrams, and technical documentation to support collaboration and long-term platform maintainability.

Required Qualifications
- 9+ years of data engineering experience, with 3+ years on the Microsoft Azure stack and hands-on Snowflake expertise.
- Proficiency in:
  - Python for scripting and ETL orchestration
  - SQL for complex data transformation and performance tuning in Snowflake
  - Azure Data Factory and Synapse Analytics (SQL Pools)
- Experience in migrating workloads from AWS to Azure/Snowflake, including services such as Glue, EMR, Redshift, Lambda, Kinesis, S3, and MSK.
- Strong understanding of cloud architecture and hybrid data environments across AWS and Azure.
- Hands-on experience with database migration, schema conversion, and tuning in PostgreSQL, MySQL, and Oracle RDS.
- Familiarity with Azure Event Hubs, Logic Apps, and Key Vault.
- Working knowledge of CI/CD, version control (Git), and DevOps principles applied to data engineering workloads.

Preferred Qualifications
- Extensive experience with Snowflake Streams, Tasks, Snowpipe, external tables, and data sharing.
- Exposure to MSK-to-Event Hubs migration and streaming data integration into Snowflake.
- Familiarity with Terraform or ARM templates for Infrastructure-as-Code (IaC) in Azure environments.
- Certification such as SnowPro Core, Azure Data Engineer Associate, or equivalent.

Skills: Azure, AWS Redshift, Athena, Azure Data Lake

About UST
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world’s best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients’ organizations.
With over 30,000 employees in 30 countries, UST builds for boundless impact—touching billions of lives in the process.
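The Redshift-to-Snowflake "schema conversion" mentioned in this role typically starts with a type-translation pass over the source DDL. A toy sketch, using a hypothetical and deliberately incomplete type map (real migrations need a far fuller mapping plus handling of constraints, encodings, and defaults):

```python
# Hypothetical subset of a Redshift -> Snowflake type map (illustrative only)
TYPE_MAP = {
    "SMALLINT": "NUMBER(5,0)",
    "INTEGER": "NUMBER(10,0)",
    "BIGINT": "NUMBER(19,0)",
    "DOUBLE PRECISION": "FLOAT",
    "CHARACTER VARYING": "VARCHAR",
    "TIMESTAMP WITHOUT TIME ZONE": "TIMESTAMP_NTZ",
}

def translate_column(col: str, rs_type: str) -> str:
    """Translate one Redshift column type into a Snowflake column definition."""
    base = rs_type.upper()
    suffix = ""
    if "(" in base:
        # preserve length/precision suffixes like VARCHAR(256)
        base, suffix = base[:base.index("(")].strip(), base[base.index("("):]
    sf_type = TYPE_MAP.get(base, base)  # pass through types that already match
    return f"{col} {sf_type}{suffix}"
```

A driver script would apply this over the output of Redshift's catalog views to emit candidate Snowflake DDL for review.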
Posted 5 days ago
5.0 years
12 - 18 Lacs
Thiruvananthapuram
Remote
Job Role: Sr Full Stack Developer (Python + Angular + GCP/AWS/Azure)
Location: 100% Remote
Experience: 5-7 Years
Job Type: Contract
Contract Duration: 6 months
Cost: Up to 90k per month
Working Hours: 12:00 PM to 09:00 PM IST

PRIMARY SKILLS:
➢ Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
➢ 5-7 years of experience as a Python developer, with a focus on product development (BE + FE development).
➢ Hands-on experience in AngularJS.
➢ Proven experience in designing and deploying scalable applications and microservices. App integration experience is preferred.
➢ Python – FastAPI (Flask/Django)
➢ API Development (RESTful services); Cloud Platforms – Google Cloud Platform (GCP) preferred; familiarity with database management systems – PostgreSQL, MySQL, MongoDB – and ORMs (e.g., SQLAlchemy, Django ORM); knowledge of CI/CD pipelines – Jenkins, GitLab CI, CircleCI; Frontend Development – JavaScript, Angular; Code Versioning – Git; Testing – unit & integration testing
➢ Strong understanding of security principles, authentication (OAuth2, JWT), and data protection.

SECONDARY SKILLS (IF ANY):
➢ Monitoring Tools – Prometheus, Grafana, Datadog; Security and Compliance Standards – GDPR, PCI, SOC 2; DevOps collaboration; UX/UI collaboration for Angular components
➢ Experience with asynchronous programming (e.g., asyncio, aiohttp).
➢ Experience with big data technologies like Spark or Hadoop.
➢ Experience with machine learning libraries (e.g., TensorFlow, PyTorch) is a plus.
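The asynchronous-programming skill listed above (asyncio, aiohttp) amounts to running I/O-bound work concurrently instead of sequentially. A minimal stdlib-only sketch, with `asyncio.sleep` standing in for real aiohttp requests and all names invented for illustration:

```python
import asyncio

async def fetch(name: str, delay: float) -> str:
    # stand-in for an aiohttp request to a hypothetical backend service
    await asyncio.sleep(delay)
    return f"{name}:ok"

async def main() -> list:
    # run the three "requests" concurrently; total time ~max(delay), not sum
    return list(await asyncio.gather(
        fetch("users", 0.02),
        fetch("orders", 0.01),
        fetch("stock", 0.03),
    ))

results = asyncio.run(main())
```

`asyncio.gather` preserves argument order in its results, which keeps downstream code deterministic even though completion order varies.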
Posted 5 days ago
0 years
3 - 4 Lacs
India
On-site
*Job Summary:*
We are seeking a passionate and knowledgeable *Data Science Trainer* to join our team. The ideal candidate will have strong expertise in Python, data science concepts, and modern technologies including AI, ML, NLP, and big data. This role requires delivering high-quality training, conducting workshops and bootcamps, and staying updated with industry trends.

---

### *Key Responsibilities:*

* Deliver engaging and practical training sessions on:
  * Python programming and frameworks like Django and Flask
  * REST APIs and web integration
  * SQL and database handling
  * Data Science fundamentals, Machine Learning & Deep Learning
  * Natural Language Processing (NLP) and Artificial Intelligence (AI)
  * Retrieval-Augmented Generation (RAG) and other advanced AI methods
* Design course materials, assignments, and real-world projects.
* Conduct interactive workshops, webinars, and student bootcamps.
* Mentor and guide students on capstone projects and portfolio development.
* Evaluate student performance and provide constructive feedback.
* Collaborate with the curriculum team to update training content based on industry trends.
* Use Git and GitHub to manage and demonstrate version control workflows.
* Be open and flexible to learn and integrate new tools and technologies as required.

---

### *Required Skills and Qualifications:*

* Proficiency in *Python*, including Django/Flask frameworks.
* Hands-on experience with *RESTful APIs* and *SQL*.
* Solid understanding of *Big Data* concepts and tools (e.g., Hadoop; Spark is a plus).
* In-depth knowledge of *Machine Learning*, *Deep Learning*, and *NLP* techniques.
* Familiarity with *Artificial Intelligence* systems and RAG pipelines.
* Comfortable using *version control tools* like Git and platforms like GitHub.
* Experience conducting *workshops*, *seminars*, or *student training programs*.
* Excellent communication and presentation skills.
* Strong problem-solving skills and a proactive learning mindset.

---

### *Preferred Qualifications:*

* Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
* Prior teaching, training, or mentorship experience is highly desirable.
* Certifications in data science, AI/ML, or related domains are a plus.

1. Training Delivery
Deliver classroom and/or live online sessions on data science topics such as: Python for Data Science; Data Wrangling with Pandas & NumPy; Exploratory Data Analysis & Data Visualization; Statistics & Probability; Machine Learning Algorithms; Supervised and Unsupervised Learning; Model Evaluation Techniques; Introduction to Deep Learning (optional). Teach tools and platforms like Jupyter Notebook, Google Colab, Scikit-learn, Matplotlib, Seaborn, Tableau/Power BI, etc.

2. Curriculum Development
Develop and update training materials, coding exercises, project briefs, and assessments based on current industry standards. Design real-world projects and case studies that enable students to apply their knowledge practically.

3. Student Engagement & Mentorship
Provide individual and group mentorship on projects and concept understanding. Conduct regular doubt-clearing sessions and performance reviews. Guide students in building portfolios and preparing for technical interviews.

4. Assessment & Progress Tracking
Evaluate student assignments and capstone projects, and provide actionable feedback. Track attendance, participation, and progress reports. Share student performance data with academic coordinators or institute leadership.

5. Continuous Improvement & Collaboration
Stay current with advancements in data science, AI/ML, and edtech delivery practices. Collaborate with other trainers, content developers, and placement coordinators. Participate in internal training sessions, hackathons, and academic planning meetings.

Job Type: Full-time
Pay: ₹25,000.00 - ₹35,000.00 per month
Schedule: Day shift
Supplemental Pay: Performance bonus
Work Location: In person
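Syllabus items such as "Model Evaluation Techniques" above are often best demonstrated to students in plain Python before introducing scikit-learn. A small illustrative sketch of a reproducible train/test split and an accuracy metric (function names are invented for the example):

```python
import random

def train_test_split(rows, test_ratio=0.25, seed=42):
    """Shuffle deterministically and split rows into train/test partitions."""
    rng = random.Random(seed)  # fixed seed keeps the split reproducible
    shuffled = rows[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - test_ratio))
    return shuffled[:cut], shuffled[cut:]

def accuracy(y_true, y_pred):
    """Fraction of predictions that match the true labels."""
    assert len(y_true) == len(y_pred)
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
```

Once students understand these primitives, the jump to `sklearn.model_selection.train_test_split` and `sklearn.metrics.accuracy_score` is mostly a matter of API.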
Posted 5 days ago
3.0 - 5.0 years
2 - 5 Lacs
Gurgaon
On-site
Expedia Group brands power global travel for everyone, everywhere. We design cutting-edge tech to make travel smoother and more memorable, and we create groundbreaking solutions for our partners. Our diverse, vibrant, and welcoming community is essential in driving our success.

Why Join Us?
To shape the future of travel, people must come first. Guided by our Values and Leadership Agreements, we foster an open culture where everyone belongs, differences are celebrated and know that when one of us wins, we all win. We provide a full benefits package, including exciting travel perks, generous time-off, parental leave, a flexible work model (with some pretty cool offices), and career development resources, all to fuel our employees' passion for travel and ensure a rewarding career journey. We’re building a more open world. Join us.

As an Infrastructure Engineer, you will be responsible for the technical design, planning, implementation, and optimization of performance tuning and recovery procedures for critical enterprise systems and applications. You will serve as the technical authority in system administration for complex SaaS, local, and cloud-based environments. Your role is critical in ensuring the high availability, reliability, and scalability of our infrastructure components. You will also be involved in designing philosophies, tools, and processes to enable the rapid delivery of evolving products.

In this role you will:
- Design, configure, and document cloud-based infrastructures using AWS Virtual Private Cloud (VPC) and EC2 instances in AWS.
- Secure and monitor hosted production SaaS environments provided by third-party partners.
- Define, document, and manage network configurations within AWS VPCs and between VPCs and data center networks, including firewall, DNS, and ACL configurations.
- Lead the design and review of developer work on DevOps tools and practices.
- Ensure high availability and reliability of infrastructure components through monitoring and performance tuning.
- Implement and maintain security measures to protect infrastructure from threats.
- Collaborate with cross-functional teams to design and deploy scalable solutions.
- Automate repetitive tasks and improve processes using scripting languages such as Python, PowerShell, or BASH.
- Support Airflow DAGs in the Data Lake, utilizing the Spark framework and Big Data technologies.
- Provide support for infrastructure-related issues and conduct root cause analysis.
- Develop and maintain documentation for infrastructure configurations and procedures.
- Administer databases, handle data backups, monitor databases, and manage data rotation.
- Work with RDBMS and NoSQL systems, leading stateful data migration between different data systems.

Experience & Qualifications:
- Bachelor’s or Master’s degree in Information Science, Computer Science, Business, or equivalent work experience.
- 3-5 years of experience with Amazon Web Services, particularly VPC, S3, EC2, and EMR.
- Experience in setting up new VPCs and integrating them with existing networks is highly desirable.
- Experience in maintaining infrastructure for Data Lake/Big Data systems built on the Spark framework and Hadoop technologies.
- Experience with Active Directory and LDAP setup, maintenance, and policies.
- Workday certification is preferred but not required.
- Exposure to Workday Integrations and Configuration is preferred.
- Strong knowledge of networking concepts and technologies.
- Experience with infrastructure automation tools (e.g., Terraform, Ansible, Chef).
- Familiarity with containerization technologies like Docker and Kubernetes.
- Excellent problem-solving skills and attention to detail.
- Strong verbal and written communication skills.
- Understanding of Agile project methodologies, including Scrum and Kanban, is required.
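The "automate repetitive tasks" responsibility in this posting often reduces to small, defensive scripts. A hedged sketch of a generic retry-with-exponential-backoff helper in Python; `flaky_check` below is a hypothetical stand-in for a call to an infrastructure endpoint:

```python
import time

def retry(fn, attempts=3, base_delay=0.01):
    """Call fn, retrying with exponential backoff; re-raise after the last attempt."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** i))  # 1x, 2x, 4x ... the base delay

# Hypothetical flaky dependency: fails twice, then succeeds
calls = {"n": 0}
def flaky_check():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "healthy"
```

Production versions of this pattern usually add jitter and cap the total delay, and catch only the specific transient exception types rather than bare `Exception`.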
Accommodation requests If you need assistance with any part of the application or recruiting process due to a disability, or other physical or mental health conditions, please reach out to our Recruiting Accommodations Team through the Accommodation Request. We are proud to be named as a Best Place to Work on Glassdoor in 2024 and be recognized for award-winning culture by organizations like Forbes, TIME, Disability:IN, and others. Expedia Group's family of brands includes: Brand Expedia®, Hotels.com®, Expedia® Partner Solutions, Vrbo®, trivago®, Orbitz®, Travelocity®, Hotwire®, Wotif®, ebookers®, CheapTickets®, Expedia Group™ Media Solutions, Expedia Local Expert®, CarRentals.com™, and Expedia Cruises™. © 2024 Expedia, Inc. All rights reserved. Trademarks and logos are the property of their respective owners. CST: 2029030-50 Employment opportunities and job offers at Expedia Group will always come from Expedia Group’s Talent Acquisition and hiring teams. Never provide sensitive, personal information to someone unless you’re confident who the recipient is. Expedia Group does not extend job offers via email or any other messaging tools to individuals with whom we have not made prior contact. Our email domain is @expediagroup.com. The official website to find and apply for job openings at Expedia Group is careers.expediagroup.com/jobs. Expedia is committed to creating an inclusive work environment with a diverse workforce. All qualified applicants will receive consideration for employment without regard to race, religion, gender, sexual orientation, national origin, disability or age.
Posted 5 days ago
5.0 years
1 - 9 Lacs
Gurgaon
On-site
Job Description: Senior Data Developer I Location: Gurugram, India Employment Type: Full-Time Experience Level: Mid to Senior-Level Department: Data & Analytics / IT Job Summary: We are seeking an experienced Data Developer with expertise in Microsoft Fabric, Azure Synapse Analytics, Databricks, and strong SQL development skills. The ideal candidate will work on end-to-end data solutions supporting analytics initiatives across clinical, regulatory, and commercial domains in the Life Sciences industry. Familiarity with Azure DevOps, and relevant certifications such as DP-700 and Databricks Data Engineer Associate/Professional are preferred. Power BI knowledge is highly preferable to support integrated analytics and reporting. Key Responsibilities: Design, develop, and maintain scalable and secure data pipelines using Microsoft Fabric, Azure Synapse Analytics, and Azure Databricks to support critical business processes. Develop curated datasets for clinical, regulatory, and commercial analytics using SQL and PySpark. Create and support dashboards and reports using Power BI (highly preferred). Collaborate with cross-functional stakeholders to understand data needs and translate them into technical solutions. Work closely with ERP teams such as Salesforce.com and SAP S/4HANA to integrate and transform business-critical data into analytic-ready formats. Partner with Data Scientists to enable advanced analytics and machine learning initiatives by providing clean, reliable, and well-structured data. Ensure data quality, lineage, and documentation in accordance with GxP, 21 CFR Part 11, and industry best practices. Use Azure DevOps to manage code repositories, track tasks, and support agile delivery processes. Monitor, troubleshoot, and optimize data workflows for reliability and performance. Contribute to the design of scalable, compliant data models and architecture. Required Qualifications: Bachelor’s or Master’s degree in Computer Science. 
5+ years of experience in data development or data engineering roles.
Hands-on experience with:
- Microsoft Fabric (Lakehouse, Pipelines, Dataflows)
- Azure Synapse Analytics (Dedicated/Serverless SQL Pools, Pipelines)
- Azure Data Factory and Apache Spark
- Azure Databricks (Notebooks, Delta Lake, Unity Catalog)
- SQL (complex queries, optimization, transformation logic)
Familiarity with Azure DevOps (Repos, Pipelines, Boards).
Understanding of data governance, security, and compliance in the Life Sciences domain.

Certifications (Preferred):
- Microsoft Certified: DP-700 – Fabric Analytics Engineer Associate
- Databricks Certified Data Engineer Associate or Professional

Preferred Skills:
- Strong knowledge of Power BI (highly preferred)
- Familiarity with HIPAA, GxP, and 21 CFR Part 11 compliance
- Experience working with ERP data from Salesforce.com and SAP S/4HANA
- Exposure to clinical trial, regulatory submission, or quality management data
- Good understanding of AI and ML concepts
- Experience working with APIs
- Excellent communication skills and the ability to collaborate across global teams

Location: Gurugram
Mode: Hybrid
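The "curated datasets ... using SQL" responsibility in this posting can be illustrated end-to-end with an in-memory SQLite database. The table and column names below are invented for the example; the same pivot-and-filter pattern would be expressed in Synapse or Databricks SQL in practice:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_results (subject_id TEXT, visit TEXT, value REAL);
    INSERT INTO raw_results VALUES
        ('S1', 'baseline', 10.0), ('S1', 'week4', 12.0),
        ('S2', 'baseline', 8.0),  ('S2', 'week4', NULL);
""")

# Curated shape: one row per subject, NULL measurements excluded,
# visits pivoted into columns for downstream analytics
curated = conn.execute("""
    SELECT subject_id,
           MAX(CASE WHEN visit = 'baseline' THEN value END) AS baseline,
           MAX(CASE WHEN visit = 'week4'    THEN value END) AS week4
    FROM raw_results
    WHERE value IS NOT NULL
    GROUP BY subject_id
    ORDER BY subject_id
""").fetchall()
```

The conditional-aggregation pivot (`MAX(CASE WHEN ...)`) is portable across engines, which matters when the same transformation must run in both Fabric and Databricks.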
Posted 5 days ago
2.0 years
4 - 10 Lacs
Gurgaon
On-site
Expedia Group brands power global travel for everyone, everywhere. We design cutting-edge tech to make travel smoother and more memorable, and we create groundbreaking solutions for our partners. Our diverse, vibrant, and welcoming community is essential in driving our success.

Why Join Us?
To shape the future of travel, people must come first. Guided by our Values and Leadership Agreements, we foster an open culture where everyone belongs, differences are celebrated and know that when one of us wins, we all win. We provide a full benefits package, including exciting travel perks, generous time-off, parental leave, a flexible work model (with some pretty cool offices), and career development resources, all to fuel our employees' passion for travel and ensure a rewarding career journey. We’re building a more open world. Join us.

Are you a technologist who is passionate about building robust, scalable, and performant applications and data products? This is exactly what we do. Join the Data Engineering & Tooling Team! The Data Engineering & Tooling Team (part of Enterprise Data Products at Expedia) is responsible for making traveler, partner & supply data accessible, unlocking insights and value! Our mission is to build and manage the travel industry's premier Data Products and SDKs.

Software Development Engineer II

Introduction to team
Our team is looking for a Software Engineer who applies engineering principles to build and improve existing systems. We follow Agile principles, and we're proud to offer a dynamic, diverse and collaborative environment where you can play an impactful role and build your career. Would you like to be part of a Global Tech company that does Travel? Don't wait, apply now!

In this role, you will:
- Implement products and solutions that are highly scalable with high-quality, clean, maintainable, optimized, modular and well-documented code across the technology stack.
- Craft APIs, and develop and test applications and services to ensure they meet design requirements.
- Work collaboratively with all members of the technical staff and other partners to build and ship outstanding software in a fast-paced environment.
- Apply knowledge of software design principles and Agile methodologies & tools.
- Resolve problems and roadblocks as they occur with help from peers or managers. Follow through on details and drive issues to closure.
- Assist with supporting production systems (investigate issues and work towards resolution).

Experience and qualifications:
- Bachelor's degree or Master's in Computer Science & Engineering, or a related technical field; or equivalent related professional experience.
- 2+ years of software development or data engineering experience in an enterprise-level engineering environment.
- Proficient with Object-Oriented Programming concepts, with a strong understanding of Data Structures, Algorithms, Data Engineering (at scale), and Computer Science fundamentals.
- Experience with Java, Scala, the Spring framework, micro-service architecture, and orchestration of containerized applications, along with a good grasp of OO design and strong design-patterns knowledge.
- Solid understanding of different API types (e.g. REST, GraphQL, gRPC), access patterns and integration.
- Prior knowledge & experience of NoSQL databases (e.g. Elasticsearch, ScyllaDB, MongoDB).
- Prior knowledge & experience of big data platforms, batch processing (e.g. Spark, Hive), stream processing (e.g. Kafka, Flink) and cloud-computing platforms such as Amazon Web Services.
- Knowledge & understanding of monitoring tools, testing (performance, functional), and application debugging & tuning.
- Good communication skills in written and verbal form, with the ability to present information in a clear and concise manner.
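The "strong design patterns knowledge" expected here can be illustrated with the classic strategy pattern: behaviour is passed in as a value, so the caller never changes when a new variant is added. The sketch below uses Python for brevity (the posting's stack is Java/Scala) and all names, including the hypothetical long-stay discount, are invented:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Booking:
    nights: int
    rate: float

def standard(b: Booking) -> float:
    # baseline pricing: nights x nightly rate
    return b.nights * b.rate

def long_stay(b: Booking) -> float:
    # hypothetical 10% discount for stays beyond 7 nights
    total = b.nights * b.rate
    return total * 0.9 if b.nights > 7 else total

def quote(b: Booking, pricing: Callable[[Booking], float]) -> float:
    # the caller is closed to modification: new strategies plug in unchanged
    return round(pricing(b), 2)
```

In Java or Scala the same shape appears as an interface (or a function value) injected into the quoting service.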
Accommodation requests If you need assistance with any part of the application or recruiting process due to a disability, or other physical or mental health conditions, please reach out to our Recruiting Accommodations Team through the Accommodation Request. We are proud to be named as a Best Place to Work on Glassdoor in 2024 and be recognized for award-winning culture by organizations like Forbes, TIME, Disability:IN, and others. Expedia Group's family of brands includes: Brand Expedia®, Hotels.com®, Expedia® Partner Solutions, Vrbo®, trivago®, Orbitz®, Travelocity®, Hotwire®, Wotif®, ebookers®, CheapTickets®, Expedia Group™ Media Solutions, Expedia Local Expert®, CarRentals.com™, and Expedia Cruises™. © 2024 Expedia, Inc. All rights reserved. Trademarks and logos are the property of their respective owners. CST: 2029030-50 Employment opportunities and job offers at Expedia Group will always come from Expedia Group’s Talent Acquisition and hiring teams. Never provide sensitive, personal information to someone unless you’re confident who the recipient is. Expedia Group does not extend job offers via email or any other messaging tools to individuals with whom we have not made prior contact. Our email domain is @expediagroup.com. The official website to find and apply for job openings at Expedia Group is careers.expediagroup.com/jobs. Expedia is committed to creating an inclusive work environment with a diverse workforce. All qualified applicants will receive consideration for employment without regard to race, religion, gender, sexual orientation, national origin, disability or age.
Posted 5 days ago
3.0 - 5.0 years
0 Lacs
Gurgaon
On-site
JLL supports the Whole You, personally and professionally.

Our people at JLL are shaping the future of real estate for a better world by combining world-class services, advisory and technology for our clients. We are committed to hiring the best, most talented people in our industry; and we support them through professional growth, flexibility, and personalized benefits to manage life in and outside of work. Whether you’ve got deep experience in commercial real estate, skilled trades, and technology, or you’re looking to apply your relevant experience to a new industry, we empower you to shape a brighter way forward so you can thrive professionally and personally.

Senior Finance Analyst - Accounts Receivable

What this job involves:

Responsibilities:
Financial Analysis and Reconciliation:
- Analyze and reconcile cash/amounts received in Bank Accounts and Lockboxes.
- Investigate clients' aging history against over/short payments.
- Perform AR to GL reconciliations.
Query Management and Resolution:
- Handle and resolve incoming queries promptly.
- Follow up on pending queries and escalate unresolved issues.
Payment Processing and Error Management:
- Research and analyze duplicate and erroneous payments.
- Collaborate with Bank and Treasury teams to reconcile errors.
- Analyze reports for System Auto Applications to ensure proper applications.
Reporting and Documentation:
- Prepare various financial reports including Monthly Balance Sheets, KPIs, and Quarterly Reports.
- Maintain and update process-related documents in real time.
Process Improvement and Quality Assurance:
- Identify tactical and strategic opportunities, gaps, and financial risks.
- Perform root cause analysis to drive process improvements.
- Conduct quality checks to ensure accurate application of deposits.
Team Support and Leadership:
- Assist in training new employees.
- Assign work and manage workload distribution within the team.
- Ensure service delivery meets agreed norms and SLAs.
Stakeholder Management:
- Liaise with the Onshore Finance team and other stakeholders.
- Provide assistance during internal/external audits.
- Support cross-functional processes as required.
Month-End Activities:
- Undertake month-end closing activities and reporting.

Performance Objectives:
- Work within established procedures with minimal supervision.
- Demonstrate sound decision-making skills in various situations.
- Meet deadlines through effective task prioritization.
- Exhibit flexibility in job responsibilities as priorities change.
- Contribute to a diverse, collaborative, and driven professional environment.

Requirements:
Education and Experience:
- Graduate degree in Accounting or a relevant professional accountancy qualification.
- 3-5 years of accounting experience in a corporate environment (for external candidates).
- Minimum 18 months in current role (for internal candidates).
Skills and Abilities:
- Strong analytical and problem-solving skills.
- Excellent oral and written communication skills.
- Proficiency in financial software and the MS Office suite.
- Ability to work independently and as part of a team.
- Strong attention to detail and accuracy.
- Ability to multi-task and work in a fast-paced environment.
Knowledge:
- Understanding of real estate fundamentals.
- Familiarity with accounting principles and practices.
- Knowledge of accounts receivable processes and best practices.
Personal Attributes:
- Proactive and creative approach to work.
- Energetic and enthusiastic attitude.
- Flexibility to adapt to changing priorities.
- Commitment to client service.
Additional Requirements:
- Ability to work overtime when required.
- Open to working in any shift.

What we can do for you:
At JLL, we make sure that you become the best version of yourself by helping you realize your full potential in an entrepreneurial and inclusive work environment. We will empower your ambitions through our dedicated Total Rewards Program, competitive pay and benefits package. Apply today!
If this job description resonates with you, we encourage you to apply even if you don’t meet all of the requirements. We’re interested in getting to know you and what you bring to the table!

Personalized benefits that support personal well-being and growth: JLL recognizes the impact that the workplace can have on your wellness, so we offer a supportive culture and comprehensive benefits package that prioritizes mental, physical and emotional health.

About JLL – We’re JLL—a leading professional services and investment management firm specializing in real estate. We have operations in over 80 countries and a workforce of over 102,000 individuals around the world who help real estate owners, occupiers and investors achieve their business ambitions. As a global Fortune 500 company, we also have an inherent responsibility to drive sustainability and corporate social responsibility. That’s why we’re committed to our purpose to shape the future of real estate for a better world. We’re using the most advanced technology to create rewarding opportunities, amazing spaces and sustainable real estate solutions for our clients, our people, and our communities. Our core values of teamwork, ethics and excellence are also fundamental to everything we do and we’re honored to be recognized with awards for our success by organizations both globally and locally. Creating a diverse and inclusive culture where we all feel welcomed, valued and empowered to achieve our full potential is important to who we are today and where we’re headed in the future. And we know that unique backgrounds, experiences and perspectives help us think bigger, spark innovation and succeed together.
Posted 5 days ago
7.0 years
0 Lacs
Gurgaon
On-site
JLL supports the Whole You, personally and professionally. Our people at JLL are shaping the future of real estate for a better world by combining world class services, advisory and technology to our clients. We are committed to hiring the best, most talented people in our industry; and we support them through professional growth, flexibility, and personalized benefits to manage life in and outside of work. Whether you’ve got deep experience in commercial real estate, skilled trades, and technology, or you’re looking to apply your relevant experience to a new industry, we empower you to shape a brighter way forward so you can thrive professionally and personally. Job Description Country Facilities Management Lead This position is responsible and accountable for providing outstanding Service Delivery across the Country Portfolio and across all Work Dynamics functions. Ensuring activities are as safe, operationally sound, human-centric, and engagement-focused as possible, with team management, site operations, service contracts, sourcing, procurement, and finance underpinning high levels of stakeholder satisfaction. The key focus of this role is the ability to develop and enhance local services whilst supporting the strategic intent of regional goals and initiatives, engaging not only with the client’s real estate teams but also other service partners, along with the lines of business and occupants as the ultimate service recipients. Roles and Responsibilities The key responsibilities of this role include: Operations Management Develop operational procedures and performance measures to ensure simplification and accuracy of work methods, reliability of systems, and consistency across the portfolio Actively engage with the workstream leaders to ensure that all site financial operations are meeting or exceeding targets and financial processes as well as controls are adhered to at all times. 
Ensure compliance with JLL and client Health, Safety, Environment, and Risk Management policies and procedures
Ensure data integrity of all systems across the portfolio and perform periodic audits
Client/Stakeholder Management
Proactively manage and develop client relationships, acting as a Property Management "trusted partner", establishing shared goals and ensuring that expected service levels are achieved
Contribute to the Annual Account Plan, aligning knowledge of the client's business and driving factors with service requirements
Comply with all requirements of the client contract and meet or exceed Key Performance Indicators
Deliver an exceptional quality of service to the client, as reflected by client feedback
Leadership / Staff Management
Actively encourage an environment that supports teamwork, co-operation, performance excellence and personal success
Develop the team through performance assessments and training, managing staff workload through correct resourcing and developing a succession plan for key team members and on-site vendors
Develop existing talent and bring new talent and capabilities into the team
Develop an active and visible team that is highly proactive, responsive, dynamic and agile
Build a seamless team across the various functions to support the client's strategic goals
Competencies
The ideal candidate will have demonstrated the following competencies:
Excellent stakeholder management, with the ability to engage and discuss strategic matters and high-level operations without delving into the weeds
Strong leadership skills, with the ability to harness the "hearts and minds" of teams to deliver on a vision through to execution
Able to adapt and respond in a fast-paced working environment, and versatile in meeting the client's changing needs and requirements
Experience within relevant facilities management operational environments, with an understanding of critical infrastructure and risk management
Experience and Qualifications
A minimum of 7 years' experience across Property Management, including Facilities Management, Project Management and Hospitality Services
A Bachelor's degree in facilities management, building, business or another related field would be an added benefit, but is not a must
Strong communicator: good presentation skills and strong verbal and written communication skills (English and local language), as well as an active listener
Passion for quality, with an eye for detail to ensure the best delivery of services
Self-motivated, confident and energetic
Ability to deal effectively with stressful situations
Flexible, able to adapt to rapidly changing situations
Strongly goal-oriented, able to focus on meeting all performance targets
A team player, able to cooperate and work well with others to meet targets
Proven ability to initiate and follow through with improvement initiatives
Exhibits honesty and trustworthiness
Open to new ideas and willing to challenge the status quo
Posted 5 days ago
2.0 - 4.0 years
0 Lacs
Gurgaon
On-site
Role Purpose
The successful candidates will be responsible for supporting managers in achieving service excellence and positive outcomes for our clients; showing high levels of technical capability, sound commercial knowledge and a good understanding of the key drivers of cost and value; and capturing and sharing knowledge and driving innovation in service. Successful candidates will be presented with a great opportunity for career progression while also gaining cross-sector experience.
What this job involves
Provide support to Team Leaders and Service Leaders in the delivery of real estate-led developments; carry day-to-day delivery responsibility and demonstrate the ability to take on tasks with minimal supervision
Deliver all work outputs in an accurate and timely manner
Utilize and embed JLL best-practice tools and processes, including the use of technology to support delivery
Be able to interpret a brief from a client or senior manager and convert it into an action plan
Demonstrate the ability to work as a team player to deliver Cost Management assignments
Understand and comply with business risk and project delivery parameters, including compliance with the scope of service agreed by others
Capture and share knowledge and be involved in the development of service improvement and innovation as part of the JLL way
Be a strong team player, but demonstrate the ability to take a more advanced role as part of personal development planning
Always represent the company in a professional and diligent manner
Desired skills and experience for this role
Approx. 2-4 years of experience
Prior experience working in the cost management field
Working knowledge of CostX would be an advantage
Some fit-out experience would be desirable
Degree in a related subject (BE / B.Tech / Diploma - Mechanical / Electrical)
Posted 5 days ago