Get alerts for new jobs matching your selected skills, preferred locations, and experience range. Manage Job Alerts
0 years
0 - 1 Lacs
Bengaluru, Karnataka, India
On-site
Wirality is a modern-day advertising agency, infused with the DNA of a digital-first environment and an entrepreneurial spirit. Our fundamental belief is that there are brands and consumers, and then there is the internet that connects the two like a bridge. We operate on this bridge, closing the gap between the two by creating relevant cultural conversations. We achieve this through our philosophy of ART X MATH, an integrated approach between digital creative & media that helps us deliver a higher ROI. Being an independent agency affords us the freedom to be bold and stray from convention.

This Is Where You Come In
We are looking for someone who:
- Can use their creative craft, understand audience sentiments and grasp today's culture to solve creative problems for a variety of brands on social and digital platforms.
- Understands the difference between brand building and tactical execution.
- Enjoys having the most challenging role in the creative department, as duties under this role are split between doing the work and managing the team.
- Will spend roughly 50% of the time solving briefs and 50% of the time managing projects.
- Understands the latest digital platforms and how their algorithms work.
- As a creative lead, understands the value of a strong content strategy & visual guideline, and makes sure the work produced is grounded in insights and creativity and is effectively communicated.
- Is comfortable with video production and can work with the video team.
- Knows how to develop ideas that are responsible in terms of timeline and budget.
- Collaborates effectively with members of the team to get the best product possible (though we ensure work never piles up).
- As a leader of a creative unit, understands the need to maintain the standard and be an example to the team. A positive attitude is more important than your creative skill or the work you produce.
Has experience of working with Paid Social.

Core Qualifications Include
- Exceptional writing, video or design skills
- Ability to conceptualize
- Comfortable with client interaction
- Natural leadership tendencies
- Experience working with all formats of social media and digital content
- Proven social media understanding
- A curious researcher
- Exceptional ability to plan work and manage teams

Other Qualifications Include
- Loves TV and cinema
- Good collaborator
- Great with feedback and revisions
- Consistency in work
- Time management
- A good sense of humour and wit
- Be proactive, be a leader

Other Requirements
- Ability to commute to work
- Get us all tea (Just kidding, we drink coffee)

We pride ourselves on being a human-first company and want to make this a home for everyone who works with us.

The Platinum Rules For Working Here Are
- Solution over problem
- Be collaborative
- Honey > Vinegar

Our Hiring Process
Think of this as a mini reality show—minus the drama, plus a lot more Wi-Fi.
1. Resume Shortlisting: We scan your resume like a hawk (with glasses), looking for experience, spark, and signs of caffeine addiction.
2. Screening Call: Our HR team will give you a ring to chat about your experience, vibe-check your energy, and confirm you're not a cat using a keyboard.
3. Initial Interview: You’ll meet our hiring panel. Expect deep dives into your work, a few “what would you do if…” questions, and some laughs.
4. Assignment Round: Time to show us you walk the talk. You'll get a small task to prove your chops. Think of it as your creative entrance exam.
5. Final Interview: A chat with our founder—no pressure. Just bring your A-game and be real. (Bonus points if you make them smile.)
6. Offer & Negotiation: If we’re all feeling the love, we’ll talk numbers, benefits, and everything else that makes this official.

Note: Due to overwhelming responses in the past, only shortlisted candidates will be contacted.
Note: This is an unpaid internship.

Skills: conceptualization, client interaction, design, creative craft, audience sentiment analysis, writing, digital content, tactical execution, visual guidelines, research, content strategy, team management, social media, brand building, time management, writing briefs, video production
Posted 1 day ago
12.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Senior Software Engineer – Backend (Python)
📍 Location: Hyderabad (Hybrid)
🕒 Experience: 5 – 12 years

About the Role: We are looking for a Senior Software Engineer – Backend with strong expertise in Python and modern big data technologies. This role involves building scalable backend solutions for a leading healthcare product-based company.

Key Skills:
- Programming: Python, Spark-Scala, PySpark (PySpark API)
- Big Data: Hadoop, Databricks
- Data Engineering: SQL, Kafka
- Strong problem-solving skills and experience in backend architecture

Why Join?
- Hybrid work model in Hyderabad
- Opportunity to work on innovative healthcare products
- Collaborative environment with modern tech stack

Keywords for Search: Python, PySpark, Spark, Spark-Scala, Hadoop, Databricks, Kafka, SQL, Backend Development, Big Data Engineering, Healthcare Technology
Posted 1 day ago
4.0 years
0 Lacs
Bengaluru East, Karnataka, India
On-site
About Us
We are a global leader in food & beverage ingredients. Pioneers at heart, we operate at the forefront of consumer trends to provide food & beverage manufacturers with products and ingredients that will delight their consumers. Making a positive impact on people and planet is all part of the delight. With a deep-rooted presence in the countries where our ingredients are grown, we are closer to farmers, enabling better quality, and more reliable, traceable and transparent supply. Supplying products and ingredients at scale is just the start. We add value through our unique, complementary portfolio of natural, delicious and nutritious products. With our fresh thinking, we help our customers unleash the sensory and functional attributes of cocoa, coffee, dairy, nuts and spices so they can create naturally good food & beverage products that meet consumer expectations. And whoever we’re with, whatever we’re doing, we always make it real.

Introduction
At ofi, we are at the forefront of harnessing cutting-edge technology to revolutionize our operations. We aim to leverage machine learning and artificial intelligence to drive transformative business outcomes and create value for our clients. We are committed to a culture of innovation, diversity, and continuous improvement, where every team member can contribute and thrive. As an ML Engineer, you will be crucial in developing advanced algorithms and models to tackle complex problems. Your expertise will drive the deployment and upkeep of intelligent systems that enhance our products and services. You will work within a collaborative environment, leveraging data and machine learning to influence business strategies and improve operational efficiency.

Key Deliverables
- Deliver end-to-end ML solutions: Architect and implement state-of-the-art models—classification, regression, clustering, reinforcement learning—precisely tuned to solve high-value business problems.
- Engineer data & experimentation pipelines at scale: Build reliable, self-service pipelines for ingesting, cleaning, transforming, and aggregating data, and orchestrate rigorous offline/online experiments (cross-validation, A/B tests) to benchmark accuracy, latency, and resource cost.
- Embed ML seamlessly into products: Partner with data scientists, backend/frontend engineers, and designers to wire models into production services and user experiences, ensuring low-friction integration and measurable product impact.
- Operate, monitor, and evolve models in production: Own the DevOps stack—automated CI/CD, containerization, and cloud deployment—and run real-time monitoring to detect drift, performance degradation, and anomalies, triggering retraining or rollback as needed.
- Uphold engineering excellence & knowledge sharing: Enforce rigorous code quality, version control, testing, and documentation; lead code reviews and mentoring sessions that raise the team’s ML craftsmanship.
- Safeguard data privacy, security, and compliance: Design models and pipelines that meet regulatory requirements, apply robust access controls and encryption, and audit usage to ensure ethical and secure handling of sensitive data.

Qualifications & Skills
- Formal grounding in computing & AI: Bachelor’s / Master’s in Computer Science, Data Science, or a related quantitative field.
- Proven production experience: 4+ years shipping, deploying, and maintaining machine-learning models at scale, with a track record of solving complex, real-world problems.
- End-to-end technical toolkit: Python (Pandas, NumPy), ML frameworks (TensorFlow, PyTorch, scikit-learn), databases (SQL & NoSQL), and big-data stacks (Spark, Hadoop).
- MLOps & cloud deployment mastery: Containerization (Docker, Kubernetes), CI/CD pipelines, and monitoring workflows that keep models reliable and reproducible in production.
- Deep applied-ML expertise: Supervised and unsupervised learning, NLP, computer vision, and time-series analysis, plus strong model-evaluation and feature-engineering skills.
- Collaboration & communication strength: Clear communicator and effective team player who can translate business goals into technical solutions and articulate results to diverse stakeholders.

ofi is an equal opportunity/affirmative action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, nationality, disability, protected veteran status, sexual orientation, gender identity, gender expression, genetic information, or any other characteristic protected by law. Applicants are requested to complete all required steps in the application process, including providing a resume/CV, in order to be considered for open roles.
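The offline experimentation this role calls for (cross-validation to benchmark model accuracy) can be sketched in plain Python. This is an illustrative toy only — the labels are invented, and a trivial majority-class predictor stands in for a real TensorFlow/PyTorch/scikit-learn estimator:

```python
def k_fold_indices(n, k):
    """Yield (train_idx, test_idx) for k roughly equal contiguous folds."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        test_idx = list(range(start, start + size))
        train_idx = list(range(0, start)) + list(range(start + size, n))
        yield train_idx, test_idx
        start += size

labels = [1, 1, 0, 1, 1, 1]  # invented labels, purely for the sketch

scores = []
for train_idx, test_idx in k_fold_indices(len(labels), k=3):
    train_labels = [labels[i] for i in train_idx]
    # "Fit": the stand-in model just memorises the training fold's majority class
    majority = max(set(train_labels), key=train_labels.count)
    # "Score": accuracy on the held-out fold
    acc = sum(labels[i] == majority for i in test_idx) / len(test_idx)
    scores.append(acc)

print(sum(scores) / len(scores))  # mean cross-validated accuracy
```

In practice this loop is what scikit-learn's `cross_val_score` automates, with stratified folds and a real estimator in place of the stand-in.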
Posted 1 day ago
7.0 years
0 Lacs
India
Remote
Description
Demand Generation Manager
India, Remote

EGNYTE YOUR CAREER. SPARK YOUR PASSION.

Role
Egnyte is a place where we spark opportunities for amazing people. We believe that every role has a great impact, and every Egnyter should be respected. When joining Egnyte, you’re not just landing a new career, you become part of a team of Egnyters that are doers, thinkers, and collaborators who embrace and live by our values:
- Invested Relationships
- Fiscal Prudence
- Candid Conversations

About Egnyte
Egnyte is the secure multi-cloud platform for content security and governance that enables organizations to better protect and collaborate on their most valuable content. Established in 2008, Egnyte has democratized cloud content security for more than 22,000 organizations, helping customers improve data security, maintain compliance, prevent and detect ransomware threats, and boost employee productivity on any app, any cloud, anywhere. For more information, visit www.egnyte.com.

Our GTM Strategy Team is the driving force behind the seamless functioning of go-to-market initiatives within the organization. Tasked with optimizing processes and leveraging technology, this team ensures the efficient delivery of GTM programs. By analyzing data, implementing effective tools, and collaborating across departments, the GTM Strategy team contributes to the enhancement of sales experiences and the overall success of the organization. Their strategic planning and cross-functional coordination play a critical role in not only retaining customers but also fostering growth and ensuring the continual delivery of value to customers through products or services.
What You’ll Do
- Create materials to communicate strategic plans
- Analyze and manage data-driven initiatives to drive revenue growth
- Monitor and report on key performance metrics
- Identify and recommend new revenue strategies
- Research market trends and the competitive landscape to create recommendations for strategic pivots
- Partner with finance, marketing, and sales leaders to help create annual revenue plans

Your Qualifications
WHO YOU ARE: Knowledgeable, Analytical, and Intellectual
- 7 years’ experience at a top-tier consulting firm (e.g., McKinsey, Bain, BCG, Deloitte)
- You are a problem-solver who can take the initiative to develop and implement innovative solutions
- You’ve got strong quantitative skills and are comfortable analyzing data sets, spotting trends and synthesizing relevant observations
- You like thinking outside the box to come up with innovative points of view
- Basic knowledge of Tableau, Salesforce, and SQL a plus

Benefits
- Competitive salaries
- Medical insurance and healthcare benefits for you and your family
- Fully paid premiums for life insurance
- Flexible hours and PTO
- Mental wellness platform subscription
- Gym reimbursement
- Childcare reimbursement
- Group term life insurance

Commitment To Diversity, Equity, And Inclusion
At Egnyte, we celebrate our differences and thrive on our diversity for our employees, our products, our customers, our investors, and our communities. Egnyters are encouraged to bring their whole selves to work and to appreciate the many differences that collectively make Egnyte a higher-performing company and a great place to be.
Posted 1 day ago
0 years
0 Lacs
India
Remote
Established in 2004, OLIVER is the world’s first and only specialist in designing, building, and running bespoke in-house agencies and marketing ecosystems for brands. We partner with over 300 clients in 40+ countries and counting. Our unique model drives creativity and efficiency, allowing us to deliver tailored solutions that resonate deeply with audiences. As a part of The Brandtech Group, we're at the forefront of leveraging cutting-edge AI technology to revolutionise how we create and deliver work. Our AI solutions enhance efficiency, spark creativity, and drive insightful decision-making, empowering our teams to produce innovative and impactful results.

Job Title: Digital Designer (Motion)
Role: Freelancer
Duration: 5 months
Location: India (Remote)

About the role:
OLIVER is looking to recruit a Digital Designer to work on-site with one of our key clients. The ideal candidate will have a strong integrated design background, with a deep knowledge of digital-first advertising and creative. Proficiency in After Effects or basic animation and video editing is a must. Reporting to the Design Team Lead for creative work, the candidate will partner with the Lead in producing digital concepts and design to the client’s brief and exacting standards, whilst positively influencing clients with their creative input, in addition to undertaking and pitching new creative concepts. The candidate will work on an account covering all things digital design, including social media, e-commerce, creative ideation, artworking and offline design collateral.

What you will be doing:
- Responsible for brand consistency across all outputs. Experience in CRM, digital and offline is desired.
- Producing short-form, mobile-first, innovative digital content for the client’s websites, digital applications and social media channels
- Working independently from creative concept to execution
- Being accountable for the work of the creative team, ensuring all work is as per brand guidelines and the platform’s best practices
- Together with the Design Team Lead, working actively with all internal and external stakeholders to ensure the delivery of the highest level of client service – from brief, creative, design and production
- Working closely with the Design Team Lead to create strong concepts from the initial briefing
- Assisting the Design Team Lead in pitching creative solutions in response to marketing strategies
- Managing the preparation of all finished artwork files so that they comply with the correct output specifications
- Ensuring all design work adheres to the best practices of digital and social trends and requirements
- Resourcing and scheduling of your own work
- Managing project deliverables and key deadlines
- Supporting with BAU design work
- Quality control
- Client relations: acting as an alternative point of contact on-site, supporting the Design Team Lead and Account Manager in the day-to-day relationships with key stakeholders
- Creative and quality oversight for work produced locally

Work with key clients to deliver the following types of projects:
- Social media and e-commerce platforms like Facebook, Lazada and YouTube
- On-site design updates (mostly posters, icons, logos and presentation slides)
- Support on visual identity and tone of voice for campaign materials, including POS and OOH
- Merchandise design and production
- Support on one-off projects (e.g. brand day, anniversaries, social activities)
- Constant and proactive branded-asset optimisation throughout the company

What you need to be great in this role:
- To be self-motivated, working with little supervision, communicating clearly with a line manager about your own development needs.
- Multimedia arts graduate or related field
- Good client engagement skills, with the ability to proactively organize and lead discussions with clients and build strong and effective working relationships with brand managers
- The ability to manage and filter workflow and prioritise workloads to maximise productivity within a given timeline
- Ability to take and challenge a client’s brief for clarity
- Some exposure to and knowledge of working directly with clients without account management support
- Experience of providing clear and accurate management information
- Creative ability with strong Adobe CS skills (InDesign, Illustrator and Photoshop)
- Good in After Effects or basic animation and video editing
- A good multitasker
- A guardian of the client’s brand guidelines, constantly challenging and developing them
- Working knowledge of digital design and its requirements is a benefit
- Passion for and curiosity about AI and new technologies
- Understanding and knowledge of AI tools is beneficial, but the ability to learn and digest the benefits and features of AI tools is critical

Req ID: 13896

Our values shape everything we do:
- Be Ambitious to succeed
- Be Imaginative to push the boundaries of what’s possible
- Be Inspirational to do groundbreaking work
- Be always learning and listening to understand
- Be Results-focused to exceed expectations
- Be actively pro-inclusive and anti-racist across our community, clients and creations

OLIVER, a part of the Brandtech Group, is an equal opportunity employer committed to creating an inclusive working environment where all employees are encouraged to reach their full potential, and individual differences are valued and respected. All applicants shall be considered for employment without regard to race, ethnicity, religion, gender, sexual orientation, gender identity, age, neurodivergence, disability status, or any other characteristic protected by local laws. OLIVER has set ambitious environmental goals around sustainability, with science-based emissions reduction targets.
Collectively, we work towards our mission, embedding sustainability into every department and through every stage of the project lifecycle.
Posted 1 day ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Title: Project Manager - GenAI Team
Department: Artificial Intelligence / Generative AI
Location: Bangalore / Noida

About the Company & Role:
We are one of India’s premier integrated political consulting firms specializing in building data-driven 360-degree election campaigns. We help our clients with strategic advice and implementation, bringing together data-backed insights and in-depth ground intelligence into a holistic electoral campaign. We are passionate about our democracy and the politics that shape the world around us. We draw on some of the sharpest minds from distinguished institutions and diverse professional backgrounds to help us achieve our goal. The team brings 7 years of experience in building electoral strategies that spark conversations, effect change, and help shape electoral and legislative ecosystems in our country. As a part of the Analytics & Engineering Team, you will have a great chance to build tools and pipelines from the early days, reinventing the concept of using data for the greater good.

About the Role:
We are seeking a Project Manager to lead our Generative AI (GenAI) initiatives in political consulting, managing AI-driven projects across political research, survey analytics, digital communications, and campaign intelligence. This role will oversee the end-to-end lifecycle of AI tools designed for predictive polling, voter sentiment analysis, and digital campaign optimization — ensuring delivery aligned with electoral timelines and campaign objectives.

Key Responsibilities
Political AI Project Management
● Define and execute project plans for GenAI-driven political research and communication tools.
● Lead AI initiatives in survey automation, trend projection, sentiment analysis, and booth-level voter modeling.
● Integrate AI insights with research data (quantitative & qualitative) and digital outreach strategies.
Digital Communications Integration
● Oversee AI applications in social media monitoring, digital content generation, and audience targeting.
● Ensure insights from GenAI tools inform digital narratives, influencer strategies, and online reputation management.
● Collaborate with the digital team to optimize campaign outreach via predictive content recommendations.

Cross-Functional Coordination
● Coordinate across Data Science, Engineering, Political Intelligence (PI), Quant, and Digital teams for seamless execution.
● Facilitate weekly sprint reviews with department POCs to track milestones and dependencies.
● Ensure GenAI outputs align with both strategic research objectives and digital campaign timelines.

Quality & Risk Management
● Validate AI insights for accuracy, ethical compliance, and electoral sensitivity.
● Ensure data privacy and compliance with election commission regulations.

Innovation & Strategic Impact
● Explore emerging GenAI frameworks (OpenAI, Gemini, Anthropic) for political communication and research applications.
● Automate manual workflows in surveys and digital outreach for improved operational efficiency.

Required Skills & Qualifications
Education
● Bachelor’s degree in Data Science, Computer Science, or related disciplines.
● MBA (preferred), especially with specialization in Strategy, Analytics, or Communications.
● PMP/Agile/Scrum certification is preferred.

Experience
● 3–5 years in project management; 1–2 years in AI/ML projects preferred.
● Experience managing projects involving electoral research, voter data, and digital communications.

Technical Skills
● Knowledge of LLMs and GenAI workflows for both research and communications.
● Proficiency in PM tools (Jira, Asana, Trello) and data visualization platforms (Tableau, Power BI).
Posted 1 day ago
1.0 years
5 - 14 Lacs
Mumbai, Maharashtra, India
On-site
Company: Mactores
Website: Visit Website
Business Type: Small/Medium Business
Company Type: Service
Business Model: B2B
Funding Stage: Bootstrapped
Industry: Data Analytics
Salary Range: ₹ 5-14 Lacs PA

Job Description
Mactores is a trusted leader among businesses in providing modern data platform solutions. Since 2008, Mactores has been enabling businesses to accelerate their value through automation by providing end-to-end data solutions that are automated, agile, and secure. We collaborate with customers to strategize, navigate, and accelerate an ideal path forward with a digital transformation via assessments, migration, or modernization.

As an AWS Data Engineer, you are a full-stack data engineer who loves solving business problems. You work with business leads, analysts, and data scientists to understand the business domain and engage with fellow engineers to build data products that empower better decision-making. You are passionate about the data quality of our business metrics and the flexibility of your solutions, which scale to respond to broader business questions. If you love to solve problems using your skills, then come join Team Mactores. We have a casual and fun office environment that actively steers clear of rigid "corporate" culture, focuses on productivity and creativity, and allows you to be part of a world-class team while still being yourself.

What you will do:
- Write efficient code in PySpark and AWS Glue
- Write SQL queries in Amazon Athena and Amazon Redshift
- Explore new technologies and learn new techniques to solve business problems creatively
- Collaborate with many teams - engineering and business - to build better data products and services
- Deliver projects collaboratively with the team and manage updates to customers on time

What are we looking for?
- 1 to 3 years of experience in Apache Spark, PySpark, and AWS Glue
- 2+ years of experience in writing ETL jobs using PySpark and Spark SQL
- 2+ years of experience in SQL queries and stored procedures
- A deep understanding of the DataFrame API and the transformation functions supported by Spark 2.7+

You Will Be Preferred If You Have
- Prior experience working with AWS EMR and Apache Airflow
- Certifications: AWS Certified Big Data – Specialty OR Cloudera Certified Big Data Engineer OR Hortonworks Certified Big Data Engineer
- Understanding of DataOps Engineering
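The DataFrame-style transformations this role revolves around can be illustrated in plain Python. This is a toy sketch only — the schema and values are invented, not Mactores' actual data, and the equivalent PySpark calls appear in a comment rather than being executed:

```python
from collections import defaultdict

# Toy order records standing in for rows a PySpark job might read from S3.
rows = [
    {"region": "south", "amount": 120.0},
    {"region": "south", "amount": 80.0},
    {"region": "north", "amount": 200.0},
]

# Same filter -> groupBy -> agg shape as the PySpark DataFrame API:
#   df.filter(df.amount > 50).groupBy("region").agg(F.sum("amount"))
totals = defaultdict(float)
for row in rows:
    if row["amount"] > 50:                      # filter
        totals[row["region"]] += row["amount"]  # groupBy + sum

print(dict(totals))  # {'south': 200.0, 'north': 200.0}
```

In a real Glue/EMR job, the same logic runs distributed across executors; the point here is only the shape of the transformation chain.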
Posted 1 day ago
20.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Company Overview
With 80,000 customers across 150 countries, UKG is the largest U.S.-based private software company in the world. And we’re only getting started. Ready to bring your bold ideas and collaborative mindset to an organization that still has so much more to build and achieve? Read on.

At UKG, you get more than just a job. You get to work with purpose. Our team of U Krewers are on a mission to inspire every organization to become a great place to work through our award-winning HR technology built for all. Here, we know that you’re more than your work. That’s why our benefits help you thrive personally and professionally, from wellness programs and tuition reimbursement to U Choose — a customizable expense reimbursement program that can be used for more than 200 needs that best suit you and your family, from student loan repayment, to childcare, to pet insurance. Our inclusive culture, active and engaged employee resource groups, and caring leaders value every voice and support you in doing the best work of your career. If you’re passionate about our purpose — people — then we can’t wait to support whatever gives you purpose. We’re united by purpose, inspired by you.

Job Description
We are seeking a seasoned Senior Director of Software Engineering with deep expertise in Data Platforms to lead and scale our data engineering organization. With deep industry experience, you will bring strategic vision, technical leadership, and operational excellence to drive innovation and deliver robust, scalable, and high-performing data solutions. You will partner closely with cross-functional teams to enable data-driven decision-making across the enterprise.

Key Responsibilities
- Define and execute the engineering strategy for modern, scalable data platforms.
- Lead, mentor, and grow a high-performing engineering organization.
- Partner with product, architecture, and infrastructure teams to deliver resilient data solutions.
- Drive technical excellence through best practices in software development, data modeling, security, and automation.
- Oversee the design, development, and deployment of data pipelines, lakehouses, and real-time analytics platforms.
- Ensure platform reliability, availability, and performance through proactive monitoring and continuous improvement.
- Foster a culture of innovation, ownership, and continuous learning.

Qualifications
- 20+ years of experience in software engineering with a strong focus on data platforms and infrastructure.
- Proven leadership of large-scale, distributed engineering teams.
- Deep understanding of modern data architectures (e.g., data lakes, lakehouses, streaming, warehousing).
- Proficiency in cloud-native data platforms (e.g., AWS, Azure, GCP), big data ecosystems (e.g., Spark, Kafka, Hive), and data orchestration tools.
- Strong software development background with expertise in one or more languages such as Python, Java, or Scala.
- Demonstrated success in driving strategic technical initiatives and cross-functional collaboration.
- Strong communication and stakeholder management skills at the executive level.
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field (Ph.D. a plus).

Where we’re going
UKG is on the cusp of something truly special. Worldwide, we already hold the #1 market share position for workforce management and the #2 position for human capital management. Tens of millions of frontline workers start and end their days with our software, with billions of shifts managed annually through UKG solutions today. Yet it’s our AI-powered product portfolio designed to support customers of all sizes, industries, and geographies that will propel us into an even brighter tomorrow!

UKG is proud to be an equal opportunity employer and is committed to promoting diversity and inclusion in the workplace, including the recruitment process.
Disability Accommodation in the Application and Interview Process For individuals with disabilities that need additional assistance at any point in the application and interview process, please email UKGCareers@ukg.com
Posted 1 day ago
10.0 - 18.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Building on our past. Ready for the future.
Worley is a global professional services company of energy, chemicals and resources experts. We partner with customers to deliver projects and create value over the life of their assets. We’re bridging two worlds, moving towards more sustainable energy sources, while helping to provide the energy, chemicals and resources needed now.

The Role
As a Data Science Lead with Worley, you will work closely with our existing team to deliver projects for our clients while continuing to develop your skills and experience.
- Conceptualise, build and manage an AI/ML platform (with more focus on unstructured data) by evaluating and selecting best-in-industry AI/ML tools and frameworks
- Drive and take ownership of developing cognitive solutions for internal stakeholders & external customers.
- Conduct research in areas such as Explainable AI, Image Segmentation, 3D Object Detection and Statistical Methods
- Evaluate not only algorithms & models but also the tools & technologies available in the market to maximize organizational spend.
- Utilize the existing frameworks, standards and patterns to create the architectural foundation and services necessary for AI/ML applications that scale from multi-user to enterprise class.
- Analyse marketplace trends - economic, social, cultural and technological - to identify opportunities and create value propositions.
- Offer a global perspective in stakeholder discussions and when shaping solutions/recommendations.

IT Skills & Experience
- Thorough understanding of the complete AI/ML project life cycle to establish processes and provide guidance and expert support to the team.
- Expert knowledge of emerging technologies in Deep Learning and Reinforcement Learning
- Knowledge of the MLOps process for efficient management of AI/ML projects.
- Must have led project execution with other data scientists/engineers for large and complex data sets
- Understanding of machine learning algorithms, such as k-NN, GBM, Neural Networks, Naive Bayes, SVM, and Decision Forests.
- Experience with AI/ML components like JupyterHub, Zeppelin Notebook, Azure ML Studio, Spark MLlib, TensorFlow, Keras, PyTorch, scikit-learn, etc.
- Strong knowledge of deep learning with special focus on CNN/R-CNN/LSTM/Encoder/Transformer architectures
- Hands-on experience with large networks like Inception-ResNet and ResNeXt-50.
- Demonstrated capability using RNNs for text and speech data, and generative models
- Working knowledge of NoSQL (GraphX/Neo4j), Document, Columnar and In-Memory database models
- Working knowledge of ETL tools and techniques, such as Talend, SAP BI Platform/SSIS or MapReduce
- Experience in building KPI/storytelling dashboards on visualization tools like Tableau/Zoomdata

People Skills
- Professional and open communication to all internal and external interfaces.
- Ability to communicate clearly and concisely, and a flexible mindset to handle a quickly changing culture
- Strong analytical skills.

Industry Specific Experience
- 10–18 years of experience in AI/ML project execution and AI/ML research

Education – Qualifications, Accreditation, Training
- Master’s or Doctorate degree in Computer Science Engineering / Information Technology / Artificial Intelligence

Moving forward together
We’re committed to building a diverse, inclusive and respectful workplace where everyone feels they belong, can bring themselves, and are heard.
We provide equal employment opportunities to all qualified applicants and employees without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by law. We want our people to be energized and empowered to drive sustainable impact. So, our focus is on a values-inspired culture that unlocks brilliance through belonging, connection and innovation. And we're not just talking about it; we're doing it. We're reskilling our people, leveraging transferable skills, and supporting the transition of our workforce to become experts in today's low carbon energy infrastructure and technology. Whatever your ambition, there’s a path for you here. And there’s no barrier to your potential career success. Join us to broaden your horizons, explore diverse opportunities, and be part of delivering sustainable change. Company Worley Primary Location IND-MM-Navi Mumbai Other Locations IND-KR-Bangalore, IND-MM-Mumbai, IND-MM-Pune, IND-TN-Chennai, IND-GJ-Vadodara, IND-AP-Hyderabad, IND-WB-Kolkata Job Digital Platforms & Data Science Schedule Full-time Employment Type Employee Job Level Experienced Job Posting Jul 4, 2025 Unposting Date Aug 3, 2025 Reporting Manager Title Head of Data Intelligence
Posted 1 day ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Your IT Future, Delivered. Senior Software Engineer (AI/ML Engineer) With a global team of 5600+ IT professionals, DHL IT Services connects people and keeps the global economy running by continuously innovating and creating sustainable digital solutions. We work beyond global borders and push boundaries across all dimensions of logistics. You can leave your mark shaping the technology backbone of the world's biggest logistics company. All our offices have earned #GreatPlaceToWork certification, reflecting our commitment to exceptional employee experiences. Digitalization. Simply delivered. At DHL IT Services, we are designing, building and running IT solutions for the whole of DPDHL globally. Grow together. The AI & Analytics team builds and runs solutions to get much more value out of our data. We help our business colleagues all over the world with machine learning algorithms, predictive models and visualizations. We manage more than 46 AI & Big Data applications, 3,000 active users, 87 countries and up to 100,000,000 daily transactions. Integrating AI & Big Data into business processes to compete in a data-driven world needs state-of-the-art technology. Our infrastructure, hosted on-prem and in the cloud (Azure and GCP), includes MapR, Airflow, Spark, Kafka, Jupyter, Kubeflow, Jenkins, GitHub, Tableau, Power BI, Synapse (Analytics), Databricks and further interesting tools. We like to do everything in an Agile/DevOps way. No more throwing the “problem code” to support, no silos. Our teams are completely product oriented, having end-to-end responsibility for the success of our product. Ready to embark on the journey? Here’s what we are looking for: Currently, we are looking for an AI/Machine Learning Engineer. In this role, you will have the opportunity to design and develop solutions, contribute to roadmaps of Big Data architectures and provide mentorship and feedback to more junior team members. 
We are looking for someone to help us manage the petabytes of data we have and turn them into value. Does that sound a bit like you? Let’s talk! Even if you don’t tick all the boxes below, we’d love to hear from you; our new department is rapidly growing and we’re looking for many people with the can-do mindset to join us on our digitalization journey. Thank you for considering DHL as the next step in your career – we do believe we can make a difference together! What will you need? University degree in Computer Science, Information Systems, Business Administration, or a related field. 2+ years of experience in a Data Scientist / Machine Learning Engineer role. Strong analytical skills related to working with structured, semi-structured and unstructured datasets. Advanced machine learning techniques: Decision Trees, Random Forest, Boosting Algorithms, Neural Networks, Deep Learning, Support Vector Machines, Clustering, Bayesian Networks, Reinforcement Learning, Feature Reduction/Engineering, Anomaly Detection, Natural Language Processing (incl. sentiment analysis, Topic Modeling), Natural Language Generation. Statistics/Mathematics: Data Quality Analysis, Data Identification, Hypothesis Testing, Univariate/Multivariate Analysis, Cluster Analysis, Classification/PCA, Factor Analysis, Linear Modeling, Time Series, distribution/probability theory. Strong experience in specialized analytics tools and technologies (including, but not limited to): leading the integration of large language models into AI applications; very good Python programming; Power BI, Tableau; developing applications and deploying models in production; Kubeflow, MLflow, Airflow, Jenkins, CI/CD pipelines. As an AI/ML Engineer, you will be responsible for developing applications and systems that leverage AI tools, Cloud AI services, and Generative AI models. 
Your role includes designing cloud-based or on-premises application pipelines that meet production-ready standards, utilizing deep learning, neural networks, chatbots, and image processing technologies. Professional & Technical Skills: Essential Skills: Expertise in Large Language Models. Strong knowledge of statistical analysis and machine learning algorithms. Experience with data visualization tools such as Tableau or Power BI. Practical experience with various machine learning algorithms, including linear regression, logistic regression, decision trees, and clustering techniques. Proficient in data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity. Awareness of Apache Spark and Hadoop. Awareness of Agile/Scrum ways of working. Identify the right modeling approach(es) for a given scenario and articulate why the approach fits. Assess data availability and modeling feasibility. Review the interpretation of model results. Experience in the logistics industry domain would be an added advantage. Roles & Responsibilities: Act as a Subject Matter Expert (SME). Collaborate with and manage team performance. Make decisions that impact the team. Work with various teams and contribute to significant decision-making processes. Provide solutions to challenges that affect multiple teams. Lead the integration of large language models into AI applications. Research and implement advanced AI techniques to improve system performance. Assist in the development and deployment of AI solutions across different domains. You should have: Certifications in some of the core technologies. Ability to collaborate across different teams/geographies/stakeholders/levels of seniority. Customer focus with an eye on continuous improvement. Energetic, enthusiastic and results-oriented personality. Ability to coach other team members; you must be a team player! 
Strong will to overcome the complexities involved in developing and supporting data pipelines. Language requirements: English – fluent spoken and written (C1 level). An array of benefits for you: Hybrid work arrangements to balance in-office collaboration and home flexibility. Annual Leave: 42 days off apart from Public/National Holidays. Medical Insurance: self + spouse + 2 children. An option to opt for Voluntary Parental Insurance (parents/parents-in-law) at a nominal premium, covering pre-existing diseases. In-house training programs: professional and technical training certifications.
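As a toy illustration of one technique from the skills list above (Random Forest), the following sketch trains and evaluates a classifier with scikit-learn. The dataset is synthetic and all names are invented here; this is not DHL code, just a minimal example of the workflow the posting describes.

```python
# Hypothetical illustration: a Random Forest classifier, one of the advanced
# ML techniques listed in the posting, trained and scored with scikit-learn.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a structured dataset.
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
accuracy = accuracy_score(y_test, model.predict(X_test))
print(f"test accuracy: {accuracy:.2f}")
```

The same fit/predict/score shape carries over to the other listed algorithms (SVMs, boosting, clustering) via scikit-learn's uniform estimator API.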
Posted 1 day ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Before you apply to a job, select your language preference from the options available at the top right of this page. Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow—people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level. Job Description JOB SUMMARY This position participates in the design, build, test, and delivery of Machine Learning (ML) models and software components that solve challenging business problems for the organization, working in collaboration with the Business, Product, Architecture, Engineering, and Data Science teams. This position engages in assessment and analysis of data sources of structured and unstructured data (internal and external) to uncover opportunities for ML and Artificial Intelligence (AI) automation, predictive methods, and quantitative modeling across the organization. This position establishes and configures scalable and cost-effective end-to-end solution design pattern components to support prediction model transactions. This position designs trials and tests to measure the success of software and systems, and works with teams, or individually, to implement ML/AI models for production scale. Responsibilities The MLOps developer works on maintaining existing models that support applications such as the digital insurance application and the claims recommendation engine. They will be responsible for setting up cloud monitoring jobs and performing quality assurance and testing for edge cases to ensure the ML product works within the application. They will also need to be on call on weekends to bring the application back online in case of failure. 
Studies and transforms data science prototypes into ML systems using appropriate datasets and data representation models. Researches and implements appropriate ML algorithms and tools that create new systems and processes powered with ML and AI tools and techniques according to business requirements. Collaborates with others to deliver ML products and systems for the organization. Designs workflows and analysis tools to streamline the development of new ML models at scale. Creates and evolves ML models and software that enable state-of-the-art intelligent systems using best practices in all aspects of engineering and modelling lifecycles. Extends existing ML libraries and frameworks with the developments in the Data Science and Machine Learning field. Establishes, configures, and supports scalable Cloud components that serve prediction model transactions. Integrates data from authoritative internal and external sources to form the foundation of a new Data Product that would deliver insights that support business outcomes necessary for ML systems. Qualifications Requirements: Ability to code in Python/Spark, with enough knowledge of Apache Beam to build data transfer jobs in Dataproc. Experience designing and building data-intensive solutions using distributed computing within a multi-line business environment. Familiarity with Machine Learning and Artificial Intelligence frameworks (e.g., Keras, PyTorch), libraries (e.g., scikit-learn), and tools and Cloud-AI technologies that aid in streamlining the development of Machine Learning or AI systems. Experience in establishing and configuring scalable and cost-effective end-to-end solution design pattern components to support the serving of batch and live streaming prediction model transactions. Possesses creative and critical thinking skills. 
Experience in developing Machine Learning models such as: classification/regression models, NLP models, and deep learning models; with a focus on productionizing those models into product features. Experience with scalable data processing, feature development, and model optimization. Solid understanding of statistics such as forecasting, time series, hypothesis testing, classification, clustering or regression analysis, and how to apply that knowledge in understanding and evaluating Machine Learning models. Knowledgeable in the software development lifecycle (SDLC), Agile development practices, and cloud technology infrastructures and patterns related to product development. Advanced math skills in Linear Algebra, Bayesian Statistics, Group Theory. Works collaboratively, both in a technical and cross-functional context. Strong written and verbal communication. Bachelor's (BS/BA) degree in a quantitative field of mathematics, computer science, physics, economics, engineering, statistics (operations research, quantitative social science, etc.), international equivalent, or equivalent job experience. Employee Type Permanent UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.
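The statistics knowledge the posting asks for when "evaluating Machine Learning models" can be sketched in a few lines; the data and function name below are invented for illustration, not part of any UPS system.

```python
# Hypothetical sketch: precision, recall and F1 for a binary classifier,
# computed from scratch so the definitions are explicit.
def precision_recall_f1(y_true, y_pred):
    """Return (precision, recall, f1) for the positive class (label 1)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Toy labels: one false negative (index 3) and one false positive (index 5).
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]
p, r, f = precision_recall_f1(y_true, y_pred)
print(f"precision={p:.2f} recall={r:.2f} f1={f:.2f}")
```

Choosing which of these metrics to optimize is itself a modeling decision, e.g. recall matters more when missed positives are costly.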
Posted 1 day ago
10.0 - 18.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Building on our past. Ready for the future. Worley is a global professional services company of energy, chemicals and resources experts. We partner with customers to deliver projects and create value over the life of their assets. We’re bridging two worlds, moving towards more sustainable energy sources, while helping to provide the energy, chemicals and resources needed now. The Role As a Data Science Lead with Worley, you will work closely with our existing team to deliver projects for our clients while continuing to develop your skills and experience. Conceptualise, build and manage an AI/ML platform (with a focus on unstructured data) by evaluating and selecting best-in-industry AI/ML tools and frameworks. Drive and take ownership of developing cognitive solutions for internal stakeholders and external customers. Conduct research in areas such as explainable AI, image segmentation, 3D object detection and statistical methods. Evaluate not only algorithms and models but also the tools and technologies available in the market to maximise organisational spend. Utilise existing frameworks, standards and patterns to create the architectural foundation and services necessary for AI/ML applications that scale from multi-user to enterprise class. Analyse marketplace trends - economic, social, cultural and technological - to identify opportunities and create value propositions. Offer a global perspective in stakeholder discussions and when shaping solutions/recommendations. IT Skills & Experience Thorough understanding of the complete AI/ML project life cycle, in order to establish processes and provide guidance and expert support to the team. 
Expert knowledge of emerging technologies in deep learning and reinforcement learning. Knowledge of MLOps processes for efficient management of AI/ML projects. Must have led project execution with other data scientists/engineers on large and complex data sets. Understanding of machine learning algorithms such as k-NN, GBM, neural networks, Naive Bayes, SVM and decision forests. Experience with AI/ML components such as JupyterHub, Zeppelin Notebook, Azure ML Studio, Spark MLlib, TensorFlow, Keras, PyTorch and scikit-learn. Strong knowledge of deep learning, with special focus on CNN/R-CNN/LSTM/encoder/transformer architectures. Hands-on experience with large networks such as Inception-ResNet and ResNeXt-50. Demonstrated capability using RNNs for text and speech data, and with generative models. Working knowledge of NoSQL and graph (GraphX/Neo4j), document, columnar and in-memory database models. Working knowledge of ETL tools and techniques, such as Talend, SAP BI Platform/SSIS or MapReduce. Experience in building KPI/storytelling dashboards on visualization tools like Tableau/Zoomdata. People Skills Professional and open communication to all internal and external interfaces. Ability to communicate clearly and concisely, and a flexible mindset to handle a quickly changing culture. Strong analytical skills. Industry Specific Experience 10-18 years of experience in AI/ML project execution and AI/ML research. Education – Qualifications, Accreditation, Training Master's or Doctorate degree in Computer Science Engineering/Information Technology/Artificial Intelligence. Moving forward together We’re committed to building a diverse, inclusive and respectful workplace where everyone feels they belong, can bring themselves, and are heard. 
We provide equal employment opportunities to all qualified applicants and employees without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by law. We want our people to be energized and empowered to drive sustainable impact. So, our focus is on a values-inspired culture that unlocks brilliance through belonging, connection and innovation. And we're not just talking about it; we're doing it. We're reskilling our people, leveraging transferable skills, and supporting the transition of our workforce to become experts in today's low carbon energy infrastructure and technology. Whatever your ambition, there’s a path for you here. And there’s no barrier to your potential career success. Join us to broaden your horizons, explore diverse opportunities, and be part of delivering sustainable change. Company Worley Primary Location IND-MM-Navi Mumbai Other Locations IND-KR-Bangalore, IND-MM-Mumbai, IND-MM-Pune, IND-TN-Chennai, IND-GJ-Vadodara, IND-AP-Hyderabad, IND-WB-Kolkata Job Digital Platforms & Data Science Schedule Full-time Employment Type Employee Job Level Experienced Job Posting Jul 4, 2025 Unposting Date Aug 3, 2025 Reporting Manager Title Head of Data Intelligence
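As a small illustration of one algorithm from the skills list in the posting above (k-NN), here is a self-contained sketch in plain Python; the training points and labels are invented for the example.

```python
# Toy k-nearest-neighbours classifier: majority vote among the k closest
# training points. Illustrative only; real work would use a library.
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` from a list of (features, label) training pairs."""
    dists = sorted(
        (math.dist(features, query), label) for features, label in train
    )
    top_labels = [label for _, label in dists[:k]]
    return Counter(top_labels).most_common(1)[0][0]

train = [
    ((1.0, 1.0), "a"), ((1.2, 0.8), "a"), ((0.9, 1.1), "a"),
    ((5.0, 5.0), "b"), ((5.2, 4.9), "b"),
]
print(knn_predict(train, (1.1, 1.0)))  # all three nearest neighbours are "a"
```

k-NN makes no training-time assumptions about the data, which is why it often appears alongside GBM and SVM in lists like the one above as a simple baseline.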
Posted 1 day ago
3.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Notice period: 30 days to immediate. Role description: GCP, Python, Apache Beam. 3 to 8 years of overall IT experience, which includes hands-on experience in Big Data technologies. Mandatory hands-on experience in Python and PySpark; Python as a language is practically usable for anything, and we are looking for application development, Extract-Transform-Load and data lake curation experience using Python. Build PySpark applications using Spark DataFrames in Python, using Jupyter notebooks and the PyCharm IDE. Worked on optimizing Spark jobs that process huge volumes of data. Hands-on experience in version control tools like Git. Worked on Amazon's analytics services like Amazon EMR, Amazon Athena and AWS Glue. Worked on Amazon's compute services like AWS Lambda and Amazon EC2, Amazon's storage service S3, and a few other services like SNS. Experience/knowledge of bash shell scripting will be a plus. Has built ETL processes to take data, copy it, structurally transform it, etc., involving a wide variety of formats like CSV, TSV, XML and JSON. Experience in working with fixed-width, delimited and multi-record file formats. Good to have knowledge of data warehousing concepts (dimensions, facts, schemas - snowflake, star, etc.). Have worked with columnar storage formats (Parquet, Avro, ORC, etc.). Well versed with compression techniques (Snappy, Gzip). Good to have knowledge of AWS databases, at least one of: Aurora, RDS, Redshift, ElastiCache, DynamoDB. Skills Mandatory Skills: GCP, Apache Spark, Python, SparkSQL, Big Data Hadoop Ecosystem
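The ETL work described above (ingest CSV, transform structurally, emit another format like JSON) can be sketched with only the standard library; the file contents and field names below are invented for illustration, not from any real pipeline.

```python
# Minimal ETL sketch: parse CSV text, cast types (the "transform" step),
# and serialize to JSON. Real pipelines would do this at scale in PySpark.
import csv
import io
import json

raw_csv = """id,name,amount
1,alpha,10.5
2,beta,3.25
3,gamma,7.0
"""

def csv_to_json_records(text):
    """Parse CSV text into typed dict records (id -> int, amount -> float)."""
    reader = csv.DictReader(io.StringIO(text))
    return [
        {"id": int(row["id"]), "name": row["name"], "amount": float(row["amount"])}
        for row in reader
    ]

records = csv_to_json_records(raw_csv)
print(json.dumps(records))
```

In PySpark the same step would be a `spark.read.csv(..., header=True)` followed by column casts, but the shape of the transformation is identical.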
Posted 1 day ago
2.0 - 4.0 years
0 Lacs
Tamil Nadu, India
On-site
Join us as we work to create a thriving ecosystem that delivers accessible, high-quality, and sustainable healthcare for all. As a Member of Technical Staff (MTS), you would play a vital role in supporting the development and implementation of applications. You will contribute to various stages of the software development lifecycle, working closely with senior developers and other team members. Responsibilities Participate in the design, development, testing, and deployment of applications. Gain experience with various AWS services including EC2, ECS, S3, SQS, Lambda, RDS PostgreSQL, etc. Write clean, well-documented, and maintainable code adhering to best practices and coding standards. Collaborate with senior developers to understand legacy code bases and contribute to modernization efforts where applicable. Conduct unit and integration testing to ensure code quality and functionality. Participate in code reviews and actively learn from senior developers' feedback. Qualifications 2-4 years of experience in software development, with a strong foundation in front-end and back-end technologies. Solid understanding of cloud computing concepts and knowledge of working with AWS and IaC (Terraform). Exposure to object-oriented programming & data structures, relational database technologies, Spring, RESTful APIs, and modern JS frameworks. Proficiency in development tools, frameworks, and methodologies (e.g., React, Git, Agile development). Excellent communication, collaboration, and problem-solving skills. Strong work ethic with a willingness to learn and grow within the team. Behaviours & Abilities Required Ability to learn and adapt in a fast-paced environment, while producing quality code. Ability to work collaboratively on a cross-functional team with a wide range of experience levels. Ability to write code that is technically sound, performant, scalable, and readable. Ability to collaborate with business owners to understand and refine business requirements. 
Ability and willingness to demonstrate ownership of an area of Athena's technology About Athenahealth Our vision: In an industry that becomes more complex by the day, we stand for simplicity. We offer IT solutions and expert services that eliminate the daily hurdles preventing healthcare providers from focusing entirely on their patients — powered by our vision to create a thriving ecosystem that delivers accessible, high-quality, and sustainable healthcare for all. Our company culture: Our talented employees — or athenistas, as we call ourselves — spark the innovation and passion needed to accomplish our vision. We are a diverse group of dreamers and do-ers with unique knowledge, expertise, backgrounds, and perspectives. We unite as mission-driven problem-solvers with a deep desire to achieve our vision and make our time here count. Our award-winning culture is built around shared values of inclusiveness, accountability, and support. Our DEI commitment: Our vision of accessible, high-quality, and sustainable healthcare for all requires addressing the inequities that stand in the way. That's one reason we prioritize diversity, equity, and inclusion in every aspect of our business, from attracting and sustaining a diverse workforce to maintaining an inclusive environment for athenistas, our partners, customers and the communities where we work and serve. What We Can Do For You Along with health and financial benefits, athenistas enjoy perks specific to each location, including commuter support, employee assistance programs, tuition assistance, employee resource groups, and collaborative workspaces — some offices even welcome dogs. We also encourage a better work-life balance for athenistas with our flexibility. While we know in-office collaboration is critical to our vision, we recognize that not all work needs to be done within an office environment, full-time. 
With consistent communication and digital collaboration tools, athenahealth enables employees to find a balance that feels fulfilling and productive for each individual situation. In addition to our traditional benefits and perks, we sponsor events throughout the year, including book clubs, external speakers, and hackathons. We provide athenistas with a company culture based on learning, the support of an engaged team, and an inclusive environment where all employees are valued. Learn more about our culture and benefits here: athenahealth.com/careers https://www.athenahealth.com/careers/equal-opportunity
Posted 1 day ago
2.0 years
3 - 4 Lacs
Goa
On-site
What You’ll Do · Collaborate with cross-functional teams to craft and refine financial models, forecasts, and budgets. · Dive into the depths of financial data, uncovering insights, and unraveling mysteries behind performance variances. · Craft compelling financial reports and presentations that enlighten and empower decision-makers. · Partner with stakeholders to streamline processes, enhance efficiency, and spark innovation. · Lead the charge in driving continuous improvement through data analysis, reporting enhancements, and automation initiatives. · Embrace the opportunity to tackle special projects and deliver impactful solutions that propel our organization forward. Qualifications: · Bachelor's degree (BS/BA) in Finance, Accounting, Economics, or a related field. · Do you have 2-5 years of hands-on experience in Financial Planning & Analysis (FP&A)? · Are you a wizard with Microsoft Excel, capable of weaving magic with large datasets, intricate models, and dynamic reports? · Do you possess a burning desire to crack complex problems and build solutions from scratch? · Can you dive deep into data oceans, surfacing trends that influence sales, operations, and finance? · Are you a tech-savvy guru who thrives on automating processes and leveraging cutting-edge tools? Job Types: Full-time, Permanent Pay: ₹30,000.00 - ₹35,000.00 per month Benefits: Health insurance Leave encashment Paid time off Provident Fund Work Location: In person Speak with the employer +91 9423955679
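The "performance variances" the posting mentions are budget-vs-actual deltas; the posting centres on Excel, but the same arithmetic, automated as the role suggests, can be sketched in Python (all figures and line items below are invented).

```python
# Illustrative budget-vs-actual variance report per line item.
# A positive variance means actuals came in above budget.
budget = {"revenue": 120_000, "cogs": 45_000, "opex": 30_000}
actual = {"revenue": 131_500, "cogs": 47_200, "opex": 28_900}

def variance_report(budget, actual):
    """Return absolute and percentage variance for each budget line."""
    return {
        item: {
            "variance": actual[item] - budget[item],
            "variance_pct": round(
                (actual[item] - budget[item]) / budget[item] * 100, 1
            ),
        }
        for item in budget
    }

report = variance_report(budget, actual)
print(report["revenue"])
```

Whether a positive variance is favourable depends on the line: above-budget revenue is good, above-budget cost is not, so reports usually flag the two differently.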
Posted 1 day ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Reference # 321741BR Job Type Full Time Your role Develop and maintain solutions focused on data processing and analysis using Java 11 and any distributed computing engine. Collaborate with stakeholders to integrate complex analytics into the platform. Optimize and enhance existing platform components to improve performance, scalability, and reliability. Work closely with cross-functional teams to implement new features that support portfolio reporting and data modeling. Contribute to system architecture discussions and promote best practices in coding, testing and deployment. Your team You'll be working in the Wealth Management IT team in Pune. You will be part of a global team that supports applications in our Core Operations arena, working closely with development and business teams that are global as well. Your expertise 10+ years of experience in software development, preferably in the financial or data analytics industry. Proficiency in Java 17 with a strong understanding of modern Java practices and performance optimizations. Experience with Apache Storm or other distributed processing frameworks (e.g., Spark, Flink, Kafka Streams). Understanding of cloud platforms, particularly Microsoft Azure, including services like Azure Data Lake, Azure Blob Storage and Azure Data Factory. Spark (or any other distributed big data processing engine – Flink, Trino, etc.). AKS (or any other Kubernetes provider). Familiarity with CI/CD pipelines and cloud environments. Strong problem-solving skills and ability to work in a fast-paced, dynamic environment. About Us UBS is the world’s largest and the only truly global wealth manager. We operate through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management and the Investment Bank. Our global reach and the breadth of our expertise set us apart from our competitors. We have a presence in all major financial centers in more than 50 countries. 
How We Hire We may request you to complete one or more assessments during the application process. Learn more Join us At UBS, we know that it's our people, with their diverse skills, experiences and backgrounds, who drive our ongoing success. We’re dedicated to our craft and passionate about putting our people first, with new challenges, a supportive team, opportunities to grow and flexible working options when possible. Our inclusive culture brings out the best in our employees, wherever they are on their career journey. We also recognize that great work is never done alone. That’s why collaboration is at the heart of everything we do. Because together, we’re more than ourselves. We’re committed to disability inclusion and if you need reasonable accommodation/adjustments throughout our recruitment process, you can always contact us. Disclaimer / Policy Statements UBS is an Equal Opportunity Employer. We respect and seek to empower each individual and support the diverse cultures, perspectives, skills and experiences within our workforce.
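The role above centres on Java and distributed engines like Storm, Spark or Flink; as a language-neutral sketch (in Python, with invented position data), the aggregation pattern those engines distribute for portfolio reporting, group values by key, then reduce each group, looks like this:

```python
# Sketch of the group-by-key / reduce pattern behind distributed portfolio
# aggregation. Engines like Spark or Flink run this shuffle+reduce across
# many workers; the single-process version makes the semantics visible.
from collections import defaultdict

positions = [
    ("portfolio_a", 100.0),
    ("portfolio_b", 250.0),
    ("portfolio_a", 40.0),
    ("portfolio_b", -50.0),
]

def aggregate_by_portfolio(rows):
    """Route each value to its key ("shuffle"), then sum per group ("reduce")."""
    groups = defaultdict(float)
    for key, value in rows:
        groups[key] += value
    return dict(groups)

totals = aggregate_by_portfolio(positions)
print(totals)
```

Because summation is associative, the reduce step can run partially on each worker before a final merge, which is exactly what lets these engines scale.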
Posted 1 day ago
4.0 - 7.0 years
5 - 8 Lacs
Hyderābād
On-site
About NationsBenefits: At NationsBenefits, we are leading the transformation of the insurance industry by developing innovative benefits management solutions. We focus on modernizing complex back-office systems to create scalable, secure, and high-performing platforms that streamline operations for our clients. As part of our strategic growth, we are focused on platform modernization — transitioning legacy systems to modern, cloud-native architectures that support the scalability, reliability, and high performance of core back-office functions in the insurance domain. Position Overview: We are seeking a self-driven Data Engineer with 4-7 years of experience to build and optimize scalable ETL/ELT pipelines using Azure Databricks, PySpark, and Delta Lake. The role involves working across scrum teams to develop data solutions, ensure data governance with Unity Catalog, and support real-time and batch processing. Strong problem-solving skills, T-SQL expertise, and hands-on experience with Azure cloud tools are essential. Healthcare domain knowledge is a plus. Job Description: Work with different scrum teams to develop all the quality database programming requirements of the sprint. Experience with the Azure cloud platform: advanced Python programming, Databricks, Azure SQL, Data Factory (ADF), Data Lake, data storage, SSIS. Create and deploy scalable ETL/ELT pipelines with Azure Databricks by utilizing PySpark and SQL. Create Delta Lake tables with ACID transactions and schema evolution to support real-time and batch processing. Experience in Unity Catalog for centralized data governance, access control, and data lineage tracking. Independently analyse, solve, and correct issues in real time, providing end-to-end problem resolution. Develop unit tests so they can be run automatically. Use SOLID development principles to maintain data integrity and cohesiveness. Interact with the product owner and business representatives to determine and satisfy needs. 
Sense of ownership and pride in your performance and its impact on the company's success. Critical thinking and problem-solving skills. Team player. Good time-management skills. Great interpersonal and communication skills. Mandatory Qualifications: 4-7 years of experience as a Data Engineer. Self-driven with minimal supervision. Proven experience with T-SQL programming, Azure Databricks, Spark (PySpark/Scala), Delta Lake, Unity Catalog, ADLS Gen2. Microsoft TFS, Visual Studio, DevOps exposure. Experience with cloud platforms such as Azure. Analytical, problem-solving mindset. Preferred Qualifications: Healthcare domain knowledge.
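The Delta Lake work the posting describes hinges on MERGE-style upserts (update matching keys, insert new ones, atomically replace the table). A plain-Python sketch of those semantics on dicts, with invented records, is shown below; a real pipeline would express this as `MERGE INTO` SQL or the `DeltaTable` API on Databricks.

```python
# Plain-Python sketch of Delta Lake MERGE (upsert) semantics:
# rows in `updates` overwrite matching keys in `target`, unmatched
# rows are inserted, and the result replaces the old table as a whole.
def merge_upsert(target, updates, key="id"):
    """Return a new 'table' with `updates` upserted into `target` by `key`."""
    merged = {row[key]: row for row in target}
    for row in updates:
        merged[row[key]] = {**merged.get(row[key], {}), **row}
    return list(merged.values())

target = [{"id": 1, "status": "open"}, {"id": 2, "status": "open"}]
updates = [{"id": 2, "status": "closed"}, {"id": 3, "status": "open"}]
result = merge_upsert(target, updates)
print(result)
```

Delta's ACID guarantee is what makes the "replace as a whole" step safe: readers see either the old table or the fully merged one, never a half-applied mix.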
Posted 1 day ago
12.0 years
5 - 10 Lacs
Hyderābād
On-site
Join Amgen’s Mission of Serving Patients At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. Senior Manager, Software Development Engineering What you will do Let’s do this. Let’s change the world. In this vital role you will be responsible for designing, developing, and maintaining software applications and solutions that meet business needs and ensuring the availability and performance of critical systems and applications. This role involves working closely with product managers, designers, and other engineers to create high-quality, scalable software solutions and automating operations, monitoring system health, and responding to incidents to minimize downtime. Roles & Responsibilities: Provide technical leadership to enhance the culture of innovation, automation, and solving difficult scientific and business challenges. Technical leadership includes providing vision and direction to develop scalable, reliable solutions. 
Provide leadership to select right-sized and appropriate tools and architectures based on requirements, data source format, and current technologies. Develop, refactor, research, and improve Weave cloud platform capabilities. Understand business drivers and technical needs so our cloud services seamlessly, automatically, and securely provide the best service. Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development. Build strong partnerships with stakeholders. Build data products and service processes which perform data transformation, metadata extraction, workload management, and error processing management to ensure high-quality data. Provide clear, integrated documentation for delivered solutions and processes. Collaborate with business partners to understand user stories and ensure the technical solution/build can deliver on those needs. Work with multi-functional teams to design and document effective and efficient solutions. Develop change management strategies and assist in their implementation. Mentor junior data engineers on standard methodologies in the industry and in the Amgen data landscape. What we expect of you We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications and Experience: Doctorate degree / Master's degree / Bachelor's degree and 12 to 17 years of Computer Science, IT, or related field experience. Preferred Skills: Must-Have Skills: Superb communication and interpersonal skills, with the ability to work cross-functionally with multi-functional GTM, product, and engineering teams.
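The error-processing step described above is often easiest to reason about as a pure function that routes each record to a valid stream or a quarantine stream. A minimal, stdlib-only sketch; the field names and validation rules here are illustrative assumptions, not part of the posting:

```python
# Hypothetical data-quality gate: route records to "valid" or "quarantine"
# streams, the kind of error-processing step an ETL pipeline might run
# before loading data. All field names and rules are invented examples.

def validate_record(record):
    """Return a list of rule violations for one record (empty = valid)."""
    errors = []
    if not record.get("patient_id"):
        errors.append("missing patient_id")
    if record.get("age") is not None and not (0 <= record["age"] <= 120):
        errors.append("age out of range")
    return errors

def split_records(records):
    """Partition records into (valid, quarantined-with-reasons)."""
    valid, quarantined = [], []
    for rec in records:
        errors = validate_record(rec)
        if errors:
            quarantined.append({"record": rec, "errors": errors})
        else:
            valid.append(rec)
    return valid, quarantined

batch = [
    {"patient_id": "P1", "age": 34},
    {"patient_id": "", "age": 40},     # missing id -> quarantined
    {"patient_id": "P3", "age": 400},  # implausible age -> quarantined
]
valid, quarantined = split_records(batch)
```

In a production pipeline the quarantined rows would typically land in a dead-letter table with their error reasons, so data quality issues are visible rather than silently dropped.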
Minimum of 10 years of overall Software Engineer or Cloud Architect experience. Minimum of 3 years in an architecture role using public cloud solutions such as AWS. Experience with the AWS technology stack. Good-to-Have Skills: Familiarity with big data technologies, AI platforms, and cloud-based data solutions. Ability to work effectively across matrixed organizations and lead collaboration between data and AI teams. Passion for technology and customer success, particularly in driving innovative AI and data solutions. Experience working with teams of data scientists, software engineers, and business experts to drive insights. Experience with AWS services such as EC2, S3, Redshift/Spectrum, Glue, Athena, RDS, Lambda, and API Gateway. Experience with big data technologies (Hadoop, Hive, HBase, Pig, Spark, etc.). Good understanding of relevant data standards and industry trends. Ability to understand new business requirements and prioritize them for delivery. Experience working in the biopharma/life sciences industry. Proficiency in one of the coding languages (Python, Java, Scala). Hands-on experience writing SQL using any RDBMS (Redshift, Postgres, MySQL, Teradata, Oracle, etc.). Experience with schema design and dimensional data modeling. Experience with software DevOps CI/CD tools such as Git, Jenkins, Linux, and shell scripting. Hands-on experience using Databricks/Jupyter or a similar notebook environment. Experience working with GxP systems. Experience working in an agile environment (i.e., user stories, iterative development, etc.). Experience with test-driven development and software test automation. Experience working in a product environment. Good overall understanding of business, manufacturing, and laboratory systems common in the pharmaceutical industry, as well as the integration of these systems through applicable standards. Soft Skills: Excellent analytical and troubleshooting skills.
Ability to work effectively with global, virtual teams High degree of initiative and self-motivation. Ability to handle multiple priorities successfully. Team-oriented, with a focus on achieving team goals What you can expect of us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 1 day ago
7.0 - 9.0 years
6 - 8 Lacs
Hyderābād
On-site
General information Country India State Telangana City Hyderabad Job ID 45479 Department Development Description & Requirements The Senior Java Developer is responsible for architecting and developing advanced Java solutions. This role involves leading the design and implementation of microservice architectures with Spring Boot, optimizing services for performance and scalability, and ensuring code quality. The Senior Developer will also mentor junior developers and collaborate closely with cross-functional teams to deliver comprehensive technical solutions. Essential Duties: Lead the development of scalable, robust, and secure Java components and services. Architect and optimize microservice solutions using Spring Boot. Translate customer requirements into comprehensive technical solutions. Conduct code reviews and maintain high code quality standards. Optimize and scale microservices for performance and reliability. Collaborate effectively with cross-functional teams to innovate and develop solutions. Lead projects and mentor engineers in best practices and innovative solutions. Coordinate with customer and client-facing teams for effective solution delivery. Basic Qualifications: Bachelor’s degree in Computer Science or a related field. 7-9 years of experience in Java development. Expertise in designing and implementing microservices with Spring Boot. Extensive experience applying design patterns and system design principles, with expertise in event-driven and domain-driven design methodologies. Extensive experience with multithreading and asynchronous and defensive programming. Proficiency in MongoDB, SQL databases, and S3 data storage. Experience with Kafka, Kubernetes, AWS services, and the AWS SDK. Hands-on experience with Apache Spark. Strong knowledge of Linux, Git, and Docker. Familiarity with Agile methodologies and tools like Jira and Confluence. Excellent communication and leadership skills.
Preferred Qualifications Experience with Spark using Spring Boot. Familiarity with the C4 Software Architecture Model. Experience using tools like Lucidchart for architecture and flow diagrams. About Infor Infor is a global leader in business cloud software products for companies in industry specific markets. Infor builds complete industry suites in the cloud and efficiently deploys technology that puts the user experience first, leverages data science, and integrates easily into existing systems. Over 60,000 organizations worldwide rely on Infor to help overcome market disruptions and achieve business-wide digital transformation. For more information visit www.infor.com Our Values At Infor, we strive for an environment that is founded on a business philosophy called Principle Based Management™ (PBM™) and eight Guiding Principles: integrity, stewardship & compliance, transformation, principled entrepreneurship, knowledge, humility, respect, self-actualization. Increasing diversity is important to reflect our markets, customers, partners, and communities we serve in now and in the future. We have a relentless commitment to a culture based on PBM. Informed by the principles that allow a free and open society to flourish, PBM™ prepares individuals to innovate, improve, and transform while fostering a healthy, growing organization that creates long-term value for its clients and supporters and fulfillment for its employees. Infor is an Equal Opportunity Employer. We are committed to creating a diverse and inclusive work environment. Infor does not discriminate against candidates or employees because of their sex, race, gender identity, disability, age, sexual orientation, religion, national origin, veteran status, or any other protected status under the law. 
If you require accommodation or assistance at any time during the application or selection processes, please submit a request by following the directions located in the FAQ section at the bottom of the infor.com/about/careers webpage.
Posted 1 day ago
8.0 years
4 - 9 Lacs
Hyderābād
Remote
Tribute Technology is an established best-in-class Software as a Service technology company and solutions provider. Our customers include some of the largest and most prominent media brands in the world, spanning 4 continents and reaching millions of users every day. Our mission is to make meaningful connections between our customers and their users through innovation and a commitment to excellent user experience. ABOUT TRIBUTE TECHNOLOGY: At Tribute Technology, we make end-of-life celebrations memorable, meaningful, and effortless through thoughtful and innovative technology solutions. Our mission is to help communities around the world celebrate life and pay tribute to those we love. Our comprehensive platform brings together software and technology to provide a fully integrated experience for all users, whether that is a family, a funeral home, or an online publisher. We are the market leader in the US and Canada, with global expansion plans and a growing international team of more than 400 individuals in the US, Canada, the Philippines, Ukraine, and India. ABOUT YOU: Tribute Technology is actively seeking a motivated and experienced Director, Data Engineering with expertise in AWS and GCP to join our Data & Analytics team. Our Data & Analytics team drives innovation and excellence by harnessing data to deliver actionable insights across the organization. As Director, you will play a crucial role in leading and enhancing our data pipelines, processes, and visualizations, as well as selecting fit-for-purpose tools to support our growing business needs. This is an exciting opportunity for a data enthusiast with a strong record of success in managing complex data initiatives. If you have a strong understanding of AWS and GCP and a passion for leading and developing a team, we want to hear from you. Join us and be a part of our innovative and collaborative environment where your contributions will have a significant impact.
KEY RESPONSIBILITIES: Lead and manage a team of data engineers to ensure timely delivery of data solutions. Collaborate with cross-functional teams to understand business requirements and translate them into data engineering solutions. Design and implement data pipelines and processes to integrate data from various sources into our data ecosystem. Improve data observability and ensure the team follows best practices. Ensure timely resolution of data pipeline issues by implementing a monitoring and incident response strategy. Implement best practices for data governance, ensuring high data quality, consistency, and security across the data pipelines. Foster a collaborative environment, ensuring the team works cross-functionally with business stakeholders to deliver projects effectively. Collaborate with other technology leaders to ensure the data environment is reliable and resilient, adapting over time to serve as a consistent backbone for data-driven solutions that support business objectives. Create and maintain documentation of the pipelines, data assets, and data products to ensure visibility and discoverability of our data products. Mentor and coach team members to enhance their technical skills and promote a culture of continuous learning and development. Review and weigh in on functional and technical requirements. Participate in strategic planning and budgeting for data engineering activities. EDUCATION AND/OR EXPERIENCE: Bachelor’s degree in Computer Science, Engineering, or a relevant field of study, with a minimum of 8 years of progressively responsible technology experience, including 3+ years as a data engineering lead. Prior experience in an eCommerce business with high-volume data (~1B sessions annually) and integrating data across multiple business units. Strong knowledge of data engineering technologies such as Python, SQL, Spark, real-time streaming pipelines, and data warehouse solutions.
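The monitoring-and-incident-response idea above can be sketched as a small retry wrapper that records every failure before re-raising, so an on-call engineer has a trail to act on instead of a silently swallowed error. A hedged, stdlib-only illustration; the step function and retry policy are assumptions for the sketch, not any team's actual stack:

```python
import time

def run_with_retries(step, max_attempts=3, delay_seconds=0, failures=None):
    """Run a pipeline step, recording each failure; re-raise after the
    final attempt so the incident surfaces for the on-call process."""
    if failures is None:
        failures = []
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except Exception as exc:  # real code would catch narrower errors
            failures.append({"attempt": attempt, "error": str(exc)})
            if attempt == max_attempts:
                raise
            time.sleep(delay_seconds)

# Example: a step that fails twice (transient outage), then succeeds.
calls = {"n": 0}
def flaky_step():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient source outage")
    return "loaded 42 rows"

log = []
result = run_with_retries(flaky_step, max_attempts=3, failures=log)
```

In practice the `failures` list would feed an alerting channel or incident tracker, and the delay would use exponential backoff rather than a fixed interval.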
Proven data engineering experience in cloud environments. Proven experience leading teams, fostering a high-performance culture, and managing distributed teams is preferred. Strong people management skills, including recruitment, performance management, and fostering growth and innovation within the team. Experience in creating data products and/or data product environment. BENEFITS: Competitive salary Fully remote across India An outstanding collaborative work environment Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions of the position #LI-remote
Posted 1 day ago
2.0 years
0 Lacs
Hyderābād
On-site
Job Description Overview The Data Science Team develops Machine Learning (ML) and Artificial Intelligence (AI) projects. The specific scope of this role is to develop ML solutions in support of ML/AI projects using big analytics toolsets in a CI/CD environment. Analytics toolsets may include DS tools, Spark, Databricks, and other technologies offered by Microsoft Azure or open-source toolsets. This role will also help automate the end-to-end cycle with Azure Pipelines. You will be part of a collaborative interdisciplinary team around data, where you will be responsible for our continuous delivery of statistical/ML models. You will work closely with process owners, product owners, and final business users. This will give you the right visibility into, and understanding of, the criticality of your developments. Responsibilities Delivery of key Advanced Analytics/Data Science projects within time and budget, particularly around DevOps/MLOps and the Machine Learning models in scope. Active contributor to code and development in projects and services. Partner with data engineers to ensure data access for discovery and that proper data is prepared for model consumption. Partner with ML engineers working on industrialization. Communicate with business stakeholders in the process of service design, training, and knowledge transfer. Support large-scale experimentation and build data-driven models. Refine requirements into modelling problems. Influence product teams through data-based recommendations. Research state-of-the-art methodologies. Create documentation for learnings and knowledge transfer. Create reusable packages or libraries.
Ensure on-time and on-budget delivery that satisfies project requirements, while adhering to enterprise architecture standards. Leverage big data technologies to help process data and build scaled data pipelines (batch to real time). Implement the end-to-end ML lifecycle with Azure Databricks and Azure Pipelines. Automate ML model deployments. Qualifications BE/B.Tech in Computer Science, Maths, or related technical fields. Overall 2-4 years of experience working as a Data Scientist. 2+ years’ experience building solutions in the commercial or supply chain space. 2+ years working in a team to deliver production-level analytic solutions. Fluent in git (version control). Understanding of Jenkins and Docker is a plus. Fluent in SQL syntax. 2+ years’ experience in statistical/ML techniques to solve supervised (regression, classification) and unsupervised problems. 2+ years’ experience in developing business-problem-related statistical/ML modeling with industry tools, with a primary focus on Python or PySpark development. Data Science – Hands-on experience and strong knowledge of building machine learning models – supervised and unsupervised. Knowledge of time series/demand forecast models is a plus. Programming Skills – Hands-on experience in statistical programming languages like Python and PySpark, and database query languages like SQL. Statistics – Good applied statistical skills, including knowledge of statistical tests, distributions, regression, and maximum likelihood estimators. Cloud (Azure) – Experience in Databricks and ADF is desirable. Familiarity with Spark, Hive, and Pig is an added advantage. Business storytelling and communicating data insights in a business-consumable format. Fluent in one visualization tool. Strong communication and organizational skills, with the ability to deal with ambiguity while juggling multiple priorities. Experience with Agile methodology for teamwork and analytics ‘product’ creation. Experience in Reinforcement Learning is a plus.
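Automating model deployment, as described above, usually hinges on a promotion gate: a newly trained model only replaces the current production model if its validation metric clears the bar. A deliberately simplified, framework-free sketch of that gate; the registry shape and thresholds are hypothetical, and a real pipeline would use something like MLflow's model registry instead of a dict:

```python
def promote_if_better(registry, candidate_name, candidate_metric,
                      min_improvement=0.0):
    """Promote a candidate model to 'production' only if its validation
    metric beats the incumbent's (higher is better in this sketch)."""
    current = registry.get("production")
    if current is None or candidate_metric > current["metric"] + min_improvement:
        registry["production"] = {"name": candidate_name,
                                  "metric": candidate_metric}
        return True
    return False

registry = {}
promote_if_better(registry, "model_v1", 0.81)             # no incumbent -> promoted
promoted = promote_if_better(registry, "model_v2", 0.79)  # worse metric -> rejected
```

The same check is what a CI/CD stage (e.g. an Azure Pipelines job) would run between the "train" and "deploy" steps, failing the stage when the gate returns False.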
Experience in simulation and optimization problems in any space is a plus. Experience with Bayesian methods is a plus. Experience with causal inference is a plus. Experience with NLP is a plus. Experience with Responsible AI is a plus. Experience with distributed machine learning is a plus. Experience in DevOps, with hands-on experience with one or more cloud service providers (AWS, GCP, Azure preferred). Model deployment experience is a plus. Experience with version control systems like GitHub and CI/CD tools. Experience in exploratory data analysis. Knowledge of MLOps/DevOps and deploying ML models is preferred. Experience using MLflow, Kubeflow, etc. is preferred. Experience executing and contributing to MLOps automation infrastructure is good to have. Exceptional analytical and problem-solving skills. Stakeholder engagement with BUs and vendors. Experience building statistical models in the retail or supply chain space is a plus.
Posted 1 day ago
5.0 - 10.0 years
0 Lacs
Hyderābād
On-site
Job Description Overview DataOps L3 The role will leverage and enhance existing technologies in the area of data and analytics solutions, such as Power BI, Azure data engineering technologies, ADLS, ADB, Synapse, and other Azure services. The role will be responsible for developing and supporting IT products and solutions using these technologies and deploying them for business users. Responsibilities 5 to 10 years of IT and Azure data engineering technologies experience. Prior experience in ETL, data pipelines, and data flow techniques using Azure Data Services. Working experience in Python, PySpark, Azure Data Factory, Azure Data Lake Gen2, Databricks, Azure Synapse, and file formats like JSON and Parquet. Experience in creating ADF pipelines to source and process data sets. Experience in creating Databricks notebooks to cleanse, transform, and enrich data sets. Development experience in orchestration of pipelines. Good understanding of SQL, databases, and data warehouse systems, preferably Teradata. Experience in deployment and monitoring techniques. Working experience with Azure DevOps CI/CD pipelines to deploy Azure resources. Experience in handling operations/integration with a source repository. Must have good knowledge of data warehouse concepts and data warehouse modelling. Working knowledge of ServiceNow (SNOW), including resolving incidents, handling change requests/service requests, and reporting on metrics to provide insights. Collaborate with the project team to understand tasks, model tables using data warehouse best practices, and develop data pipelines to ensure the efficient delivery of data. Strong expertise in performance tuning and optimization of data processing systems. Proficient in Azure Data Factory, Azure Databricks, Azure SQL Database, and other Azure data services. Develop and enforce best practices for data management, including data governance and security. Work closely with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
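Orchestration of pipelines, mentioned above, amounts to running steps in an order that respects their dependencies. A stdlib-only sketch of a tiny topological-sort executor; the step names and dependency graph are invented for illustration, and in practice ADF or Databricks Workflows would own this orchestration:

```python
from graphlib import TopologicalSorter

# Hypothetical dependency graph: each step lists the steps it waits on.
pipeline = {
    "ingest": set(),
    "cleanse": {"ingest"},
    "enrich": {"cleanse"},
    "load_warehouse": {"enrich"},
}

def run_pipeline(graph, steps):
    """Execute step callables in an order that respects the graph."""
    order = list(TopologicalSorter(graph).static_order())
    results = {}
    for name in order:
        results[name] = steps[name](results)  # each step sees prior results
    return order, results

# Stub steps that just report success; real steps would run ADF/Databricks jobs.
steps = {name: (lambda n: lambda done: f"{n} ok")(name) for name in pipeline}
order, results = run_pipeline(pipeline, steps)
```

Because this example graph is a simple chain, its topological order is unique; for a wider DAG, `TopologicalSorter` also exposes which steps are ready to run in parallel.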
Proficient in implementing DataOps framework. Qualifications Azure data factory Azure Databricks Azure Synapse PySpark/SQL ADLS Azure DevOps with CI/CD implementation. Nice-to-Have Skill Sets: Business Intelligence tools (preferred—Power BI) DP-203 Certified.
Posted 1 day ago
12.0 years
5 - 9 Lacs
Hyderābād
On-site
Job Description Overview PepsiCo Data BI & Integration Platforms is seeking an experienced Cloud Platform Databricks SME, responsible for platform administration, security, integration of new NPI tools, migrations, platform maintenance, and other platform administration activities on Azure/AWS. The ideal candidate will have hands-on experience with Azure/AWS services – Infrastructure as Code (IaC), platform provisioning and administration, cloud network design, cloud security principles, and automation. Responsibilities The Databricks Subject Matter Expert (SME) plays a pivotal role in administration, security best practices, platform sustain support, new tool adoption, cost optimization, and supporting new patterns/design solutions using the Databricks platform. Here’s a breakdown of typical responsibilities: Core Technical Responsibilities Architect and optimize big data pipelines using Apache Spark, Delta Lake, and Databricks-native tools. Design scalable data ingestion and transformation workflows, including batch and streaming (e.g., Kafka, Spark Structured Streaming). Create integration guidelines to configure and integrate Databricks with other existing security tools relevant to data access control. Implement data security and governance using Unity Catalog, access controls, and data classification techniques. Support migration of legacy systems to Databricks on cloud platforms like Azure, AWS, or GCP. Manage cloud platform operations with a focus on FinOps support, optimizing resource utilization, cost visibility, and governance across multi-cloud environments. Collaboration & Advisory Act as a technical advisor to data engineering and analytics teams, guiding best practices and performance tuning. Partner with architects and business stakeholders to align Databricks solutions with enterprise goals. Lead proof-of-concept (PoC) initiatives to demonstrate Databricks capabilities for specific use cases.
Strategic & Leadership Contributions Mentor junior engineers and promote knowledge sharing across teams. Contribute to platform adoption strategies, including training, documentation, and internal evangelism. Stay current with Databricks innovations and recommend enhancements to existing architectures. Specialized Expertise (Optional but Valuable) Machine learning and AI integration using MLflow, AutoML, or custom models. Cost optimization and workload sizing for large-scale data processing. Compliance and audit readiness for regulated industries. Qualifications Bachelor’s degree in Computer Science. At least 12 years of experience in IT cloud infrastructure, architecture, and operations, including security, with at least 5 years in a platform admin role. Strong understanding of data security principles and best practices. Expertise in the Databricks platform, its security features, Unity Catalog, and data access control mechanisms. Experience with data classification and masking techniques. Strong understanding of cloud cost management, with hands-on experience in usage analytics, budgeting, and cost optimization strategies across multi-cloud platforms. Strong knowledge of cloud architecture, design, and deployment principles and practices, including microservices, serverless, containers, and DevOps. Deep expertise in Azure/AWS big data and analytics technologies, including Databricks, real-time data ingestion, data warehouses, serverless ETL, NoSQL databases, DevOps, Kubernetes, virtual machines, web/function apps, and monitoring and security tools. Deep expertise in Azure/AWS networking and security fundamentals, including network endpoints and network security groups, firewalls, external/internal DNS, load balancers, virtual networks, and subnets. Proficient in scripting and automation tools such as PowerShell, Python, Terraform, and Ansible.
Excellent problem-solving, analytical, and communication skills, with the ability to explain complex technical concepts to non-technical audiences. Certifications in Azure/AWS/Databricks platform administration, networking, and security are preferred. Strong self-organization, time management, and prioritization skills. A high level of attention to detail, excellent follow-through, and reliability. Strong collaboration, teamwork, and relationship-building skills across multiple levels and functions in the organization. Ability to listen, establish rapport, and build credibility as a strategic partner vertically within the business unit or function, as well as with leadership and functional teams. Strategic thinker focused on business-value results that utilize technical solutions. Strong communication skills in writing, speaking, and presenting. Capable of working effectively in a multi-tasking environment. Fluent in English.
Posted 1 day ago
10.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Acuity Knowledge Partners (Acuity) is a leading provider of bespoke research, analytics and technology solutions to the financial services sector, including asset managers, corporate and investment banks, private equity and venture capital firms, hedge funds and consulting firms. Its global network of over 6,000 analysts and industry experts, combined with proprietary technology, supports more than 600 financial institutions and consulting companies to operate more efficiently and unlock their human capital, driving revenue higher and transforming operations. Acuity is headquartered in London and operates from 10 locations worldwide. The company fosters a diverse, equitable and inclusive work environment, nurturing talent, regardless of race, gender, ethnicity or sexual orientation. Acuity was established as a separate business from Moody’s Corporation in 2019, following its acquisition by Equistone Partners Europe (Equistone). In January 2023, funds advised by global private equity firm Permira acquired a majority stake in the business from Equistone, which remains invested as a minority shareholder. For more information, visit www.acuitykp.com Position Title- Associate Director (Senior Architect – Data) Department-IT Location- Gurgaon/ Bangalore Job Summary The Enterprise Data Architect will enhance the company's strategic use of data by designing, developing, and implementing data models for enterprise applications and systems at conceptual, logical, business area, and application layers. This role advocates data modeling methodologies and best practices. We seek a skilled Data Architect with deep knowledge of data architecture principles, extensive data modeling experience, and the ability to create scalable data solutions. Responsibilities include developing and maintaining enterprise data architecture, ensuring data integrity, interoperability, security, and availability, with a focus on ongoing digital transformation projects. 
Key Responsibilities
Strategy & Planning
Develop and deliver long-term strategic goals for data architecture vision and standards in conjunction with data users, department managers, clients, and other key stakeholders.
Create short-term tactical solutions to achieve long-term objectives and an overall data management roadmap.
Establish processes for governing the identification, collection, and use of corporate metadata; take steps to assure metadata accuracy and validity.
Establish methods and procedures for tracking data quality, completeness, redundancy, and improvement.
Conduct data capacity planning, life cycle, duration, usage requirements, feasibility studies, and other tasks.
Create strategies and plans for data security, backup, disaster recovery, business continuity, and archiving.
Ensure that data strategies and architectures are aligned with regulatory compliance.
Develop a comprehensive data strategy in collaboration with different stakeholders that aligns with the transformational projects’ goals.
Ensure effective data management throughout the project lifecycle.
Acquisition & Deployment
Ensure the success of enterprise-level application rollouts (e.g. ERP, CRM, HCM, FP&A, etc.).
Liaise with vendors and service providers to select the products or services that best meet company goals.
Operational Management
Assess and determine governance, stewardship, and frameworks for managing data across the organization.
Develop and promote data management methodologies and standards.
Document information products from business processes and create data entities.
Create entity relationship diagrams to show the digital thread across the value streams and enterprise.
Create data normalization across all systems and databases to ensure there is a common definition of data entities across the enterprise.
Document enterprise reporting needs and develop the data strategy to enable a single source of truth for all reporting data.
Address the regulatory compliance requirements of each country and ensure our data is secure and compliant.
Select and implement the appropriate tools, software, applications, and systems to support data technology goals.
Oversee the mapping of data sources, data movement, interfaces, and analytics, with the goal of ensuring data quality.
Collaborate with project managers and business unit leaders for all projects involving enterprise data.
Address data-related problems regarding systems integration, compatibility, and multiple-platform integration.
Act as a leader and advocate of data management, including coaching, training, and career development for staff.
Develop and implement key components as needed to create testing criteria to guarantee the fidelity and performance of data architecture.
Document the data architecture and environment to maintain a current and accurate view of the larger data picture.
Identify and develop opportunities for data reuse, migration, or retirement.
Data Architecture Design: Develop and maintain the enterprise data architecture, including data models, databases, data warehouses, and data lakes. Design and implement scalable, high-performance data solutions that meet business requirements.
Data Governance: Establish and enforce data governance policies and procedures as agreed with stakeholders. Maintain data integrity, quality, and security within Finance, HR, and other such enterprise systems.
Data Migration: Oversee the data migration process from legacy systems to the new systems being put in place. Define and manage data mappings, cleansing, transformation, and validation to ensure accuracy and completeness. Master Data Management: Devise processes to manage master data (e.g., customer, vendor, product information) to ensure consistency and accuracy across enterprise systems and business processes. Provide data management (create, update, and delimit) methods to ensure master data is governed. Stakeholder Collaboration: Collaborate with various stakeholders, including business users and other system vendors, to understand data requirements. Ensure the enterprise system meets the organization's data needs. Training and Support: Provide training and support to end-users on data entry, retrieval, and reporting within the candidate enterprise systems. Promote user adoption and proper use of data. Data Quality Assurance: Implement data quality assurance measures to identify and correct data issues. Ensure Oracle Fusion and other enterprise systems contain reliable and up-to-date information. Reporting and Analytics: Facilitate the development of reporting and analytics capabilities within Oracle Fusion and other systems. Enable data-driven decision-making through robust data analysis. Continuous Improvement: Continuously monitor and improve data processes and the data capabilities of Oracle Fusion and other systems. Leverage new technologies for enhanced data management to support evolving business needs. Technology and Tools: Oracle Fusion Cloud. Data modeling tools (e.g., ER/Studio, ERwin). ETL tools (e.g., Informatica, Talend, Azure Data Factory). Data Pipelines: Understanding of data pipeline tools like Apache Airflow and AWS Glue.
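Master data management, as described above, often boils down to collapsing duplicate records from different source systems into one governed "golden record". A minimal, illustrative sketch that matches records on a shared key and lets the most recently updated source win each field; the field names and survivorship rule are assumptions for the example:

```python
def build_golden_records(records, key_field="customer_id"):
    """Merge duplicate records per key, keeping each field's value from
    the most recently updated source record (a simple survivorship rule)."""
    golden = {}
    # ISO-format dates sort correctly as strings, so oldest records apply first
    for rec in sorted(records, key=lambda r: r["updated_at"]):
        merged = golden.setdefault(rec[key_field], {})
        for field, value in rec.items():
            if value not in (None, ""):
                merged[field] = value  # later (newer) records overwrite
    return golden

# Hypothetical duplicates of one customer from two source systems.
crm = {"customer_id": "C1", "name": "Acme Ltd", "phone": "",
       "updated_at": "2024-01-10"}
erp = {"customer_id": "C1", "name": "Acme Limited", "phone": "555-0101",
       "updated_at": "2024-03-02"}
golden = build_golden_records([crm, erp])
```

Real MDM tools (Oracle EDM/MDM, Informatica) layer stewardship workflows and per-field trust scores on top of exactly this kind of match-and-survive merge.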
Database management systems: Oracle Database, MySQL, SQL Server, PostgreSQL, MongoDB, Cassandra, Couchbase, Redis, Hadoop, Apache Spark, Amazon RDS, Google BigQuery, Microsoft Azure SQL Database, Neo4j, OrientDB, Memcached
Data governance tools (e.g., Collibra, Informatica Axon, Oracle EDM, Oracle MDM)
Reporting and analytics tools (e.g., Oracle Analytics Cloud, Power BI, Tableau, Oracle BIP)
Hyperscalers / cloud platforms (e.g., AWS, Azure)
Big data technologies such as Hadoop, HDFS, MapReduce, and Spark
Cloud platforms such as Amazon Web Services (RDS, Redshift, S3) and Microsoft Azure (Azure SQL Database, Cosmos DB), and experience with Google Cloud Platform services such as BigQuery and Cloud Storage.
Programming languages (e.g., Java, J2EE, EJB, .NET, WebSphere)
SQL: Strong SQL skills for querying and managing databases.
Python: Proficiency in Python for data manipulation and analysis.
Java: Knowledge of Java for building data-driven applications.
Data security and protocols: Understanding of data security protocols and compliance standards.
Key Competencies / Qualifications:
Education: Bachelor's degree in Computer Science, Information Technology, or a related field. Master's degree preferred.
Experience: 10+ years overall, with at least 7 years of experience in data architecture, data modeling, and database design. Proven experience with data warehousing, data lakes, and big data technologies. Expertise in SQL and experience with NoSQL databases. Experience with cloud platforms (e.g., AWS, Azure) and related data services. Experience with Oracle Fusion or similar ERP systems is highly desirable.
Skills: Strong understanding of data governance and data security best practices. Excellent problem-solving and analytical skills. Strong communication and interpersonal skills. Ability to work effectively in a collaborative team environment. Leadership experience with a track record of mentoring and developing team members.
Excellent documentation and presentation skills. Good knowledge of applicable data privacy practices and laws.
Certifications: Relevant certifications (e.g., Certified Data Management Professional, AWS Certified Big Data – Specialty) are a plus.
Behavioral:
A self-starter, an excellent planner and executor and, above all, a good team player
Excellent communication and interpersonal skills are a must
Must possess organizational skills, including multi-tasking capability, priority setting and meeting deadlines
Ability to build collaborative relationships and effectively leverage networks to mobilize resources
Initiative to learn the business domain is highly desirable
Enjoys a dynamic and constantly evolving environment and requirements
Posted 1 day ago
10.0 - 18.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Building on our past. Ready for the future. Worley is a global professional services company of energy, chemicals and resources experts. We partner with customers to deliver projects and create value over the life of their assets. We're bridging two worlds, moving towards more sustainable energy sources, while helping to provide the energy, chemicals and resources needed now.
The Role
As a Data Science Lead with Worley, you will work closely with our existing team to deliver projects for our clients while continuing to develop your skills and experience.
Conceptualise, build and manage an AI/ML platform (with more focus on unstructured data) by evaluating and selecting best-in-industry AI/ML tools and frameworks
Drive and take ownership of developing cognitive solutions for internal stakeholders and external customers.
Conduct research in areas such as Explainable AI, Image Segmentation, 3D Object Detection and Statistical Methods
Evaluate not only algorithms and models but also available tools and technologies in the market to maximize organizational spend.
Utilize existing frameworks, standards and patterns to create the architectural foundation and services necessary for AI/ML applications that scale from multi-user to enterprise class.
Analyse marketplace trends - economic, social, cultural and technological - to identify opportunities and create value propositions.
Offer a global perspective in stakeholder discussions and when shaping solutions/recommendations
IT Skills & Experience
Thorough understanding of the complete AI/ML project life cycle to establish processes and provide guidance and expert support to the team.
Expert knowledge of emerging technologies in Deep Learning and Reinforcement Learning
Knowledge of MLOps processes for efficient management of AI/ML projects.
Must have led project execution with other data scientists/engineers on large and complex data sets
Understanding of machine learning algorithms such as k-NN, GBM, Neural Networks, Naive Bayes, SVM, and Decision Forests.
Experience with AI/ML components like JupyterHub, Zeppelin Notebook, Azure ML Studio, Spark MLlib, TensorFlow, Keras, PyTorch and scikit-learn
Strong knowledge of deep learning with special focus on CNN/R-CNN/LSTM/Encoder/Transformer architectures
Hands-on experience with large networks like Inception-ResNet and ResNeXt-50.
Demonstrated capability using RNNs for text and speech data, and generative models
Working knowledge of NoSQL (GraphX/Neo4j), document, columnar and in-memory database models
Working knowledge of ETL tools and techniques, such as Talend, SAP BI Platform/SSIS or MapReduce.
Experience in building KPI/storytelling dashboards on visualization tools like Tableau/Zoomdata
People Skills
Professional and open communication to all internal and external interfaces.
Ability to communicate clearly and concisely, and a flexible mindset to handle a quickly changing culture
Strong analytical skills.
Industry Specific Experience
10-18 years of experience in AI/ML project execution and AI/ML research
Education – Qualifications, Accreditation, Training
Master's or Doctorate degree in Computer Science Engineering / Information Technology / Artificial Intelligence
Moving forward together
We're committed to building a diverse, inclusive and respectful workplace where everyone feels they belong, can bring themselves, and are heard.
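Of the algorithms the posting names, k-NN is the simplest to show end to end: classify a point by majority vote among its k nearest labelled neighbours. A minimal pure-Python sketch on toy 2-D data (in practice one would reach for scikit-learn's KNeighborsClassifier, which the posting also lists):

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training points.

    `train` is a list of (feature_vector, label) pairs; distance is Euclidean.
    """
    neighbours = sorted(train, key=lambda pair: math.dist(pair[0], query))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

# Toy data: two well-separated 2-D clusters with labels "a" and "b".
train = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"), ((0.2, 0.1), "a"),
         ((5.0, 5.0), "b"), ((5.1, 4.9), "b"), ((4.9, 5.2), "b")]
print(knn_predict(train, (0.3, 0.3)))  # "a"
print(knn_predict(train, (5.0, 4.8)))  # "b"
```

The other listed methods (GBM, SVM, Naive Bayes, Decision Forests) follow the same fit/predict shape in scikit-learn, which is why the library pairs naturally with the rest of the stack named here.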
We provide equal employment opportunities to all qualified applicants and employees without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by law.
We want our people to be energized and empowered to drive sustainable impact. So, our focus is on a values-inspired culture that unlocks brilliance through belonging, connection and innovation. And we're not just talking about it; we're doing it. We're reskilling our people, leveraging transferable skills, and supporting the transition of our workforce to become experts in today's low carbon energy infrastructure and technology.
Whatever your ambition, there's a path for you here. And there's no barrier to your potential career success. Join us to broaden your horizons, explore diverse opportunities, and be part of delivering sustainable change.
Company: Worley
Primary Location: IND-MM-Navi Mumbai
Other Locations: IND-KR-Bangalore, IND-MM-Mumbai, IND-MM-Pune, IND-TN-Chennai, IND-GJ-Vadodara, IND-AP-Hyderabad, IND-WB-Kolkata
Job: Digital Platforms & Data Science
Schedule: Full-time
Employment Type: Employee
Job Level: Experienced
Job Posting: Jul 4, 2025
Unposting Date: Aug 3, 2025
Reporting Manager Title: Head of Data Intelligence
Posted 1 day ago