5.0 - 7.0 years
20 - 25 Lacs
Gurugram
Work from Office
Senior Analyst, Big Data Analytics & Engineering Overview: Job Title: Sr. Analyst, Data Engineering, Value Quantification Team (Based in Pune, India) About Mastercard: Mastercard is a global technology leader in the payments industry, committed to powering an inclusive, digital economy that benefits everyone, everywhere. By leveraging secure data, cutting-edge technology, and innovative solutions, we empower individuals, financial institutions, governments, and businesses to achieve their potential. Our culture is driven by our Decency Quotient (DQ), ensuring inclusivity, respect, and integrity guide everything we do. Operating across 210+ countries and territories, Mastercard is dedicated to building a sustainable world with priceless opportunities for all. Position Overview: This is a techno-functional position that combines strong technical skills with a deep understanding of business needs and requirements, and calls for 5-7 years of experience. The role focuses on developing and maintaining advanced data engineering solutions for pre-sales value quantification within the Services business unit. As a Sr. Analyst, you will be responsible for creating and optimizing data pipelines, managing large datasets, and ensuring the integrity and accessibility of data to support Mastercard's internal teams in quantifying the value of services, enhancing customer engagement, and driving business outcomes. The role requires close collaboration across teams to ensure data solutions meet business needs and deliver measurable impact. Role Responsibilities: Data Engineering & Pipeline Development: Develop and maintain robust data pipelines to support the value quantification process. Utilize tools such as Apache NiFi, Azure Data Factory, Pentaho, Talend, SSIS, and Alteryx to ensure efficient data integration and transformation (a brief PySpark sketch appears at the end of this listing). Data Management and Analysis: Manage and analyze large datasets using SQL, Hadoop, and other database management systems. Perform data extraction, transformation, and loading (ETL) to support value quantification efforts. Advanced Analytics Integration: Use advanced analytics techniques, including machine learning algorithms, to enhance data processing and generate actionable insights. Leverage programming languages such as Python (Pandas, NumPy, PySpark) and Impala for data analysis and model development. Business Intelligence and Reporting: Utilize business intelligence platforms such as Tableau and Power BI to create insightful dashboards and reports that communicate the value of services. Generate actionable insights from data to inform strategic decisions and provide clear, data-backed recommendations. Cross-Functional Collaboration & Stakeholder Engagement: Collaborate with Sales, Marketing, Consulting, Product, and other internal teams to understand business needs and ensure successful data solution development and deployment. Communicate insights and data value through compelling presentations and dashboards to senior leadership and internal teams, ensuring tool adoption and usage. All About You: Data Engineering Expertise: Proficiency in data engineering tools and techniques to develop and maintain data pipelines. Experience with data integration tools such as Apache NiFi, Azure Data Factory, Pentaho, Talend, SSIS, and Alteryx. Advanced SQL Skills: Strong skills in SQL for querying and managing large datasets. Experience with database management systems and data warehousing solutions.
Programming Proficiency: Knowledge of programming languages such as Python (Pandas, NumPy, PySpark) and Impala for data analysis and model development. Business Intelligence and Reporting: Experience in creating insightful dashboards and reports using business intelligence platforms such as Tableau and Power BI. Statistical Analysis: Ability to perform statistical analysis to identify trends, correlations, and insights that support strategic decision-making. Cross-Functional Collaboration: Strong collaboration skills to work effectively with Sales, Marketing, Consulting, Product, and other internal teams to understand business needs and ensure successful data solution development and deployment. Communication and Presentation: Excellent communication skills to convey insights and data value through compelling presentations and dashboards to senior leadership and internal teams. Execution Focus: A results-driven mindset with the ability to balance strategic vision with tactical execution, ensuring that data solutions are delivered on time and create measurable business value. Education: Bachelor's degree in Data Science, Computer Science, Business Analytics, Economics, Finance, or a related field. Advanced degrees or certifications in analytics, data science, AI/ML, or an MBA are preferred. Why Us? At Mastercard, you'll have the opportunity to shape the future of internal operations by leading the development of data engineering solutions that empower teams across the organization. Join us to make a meaningful impact, drive business outcomes, and help Mastercard's internal teams create better customer engagement strategies through innovative value-based ROI narratives. Location: Gurgaon/Pune, India Employment Type: Full-Time
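The pipeline and analytics stack named above (PySpark, Pandas, SQL) suggests a fairly standard batch ETL pattern. A minimal PySpark sketch of that pattern, with hypothetical bucket paths, column names, and aggregation logic (not Mastercard's actual pipeline):

```python
# Minimal ETL sketch for a value-quantification style pipeline.
# Assumes a running Spark environment; paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("value_quant_etl").getOrCreate()

# Extract: read raw service-usage records (hypothetical source).
raw = spark.read.parquet("s3://example-bucket/raw/service_usage/")

# Transform: aggregate service value per customer per month.
monthly_value = (
    raw.filter(F.col("status") == "completed")
       .withColumn("month", F.date_trunc("month", F.col("event_ts")))
       .groupBy("customer_id", "month")
       .agg(F.sum("service_value").alias("total_value"),
            F.countDistinct("service_id").alias("services_used"))
)

# Load: write a curated table for BI tools such as Tableau or Power BI.
monthly_value.write.mode("overwrite").partitionBy("month").parquet(
    "s3://example-bucket/curated/monthly_value/"
)
```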
Posted 3 weeks ago
8.0 - 15.0 years
15 - 20 Lacs
Chennai
Work from Office
Job Title: Data Architecture Location: Chennai Experience: 8-15 Years Key Responsibilities: Design and implement data-centric solutions on Google Cloud Platform (GCP) using various GCP tools like BigQuery, Google Cloud Storage, Cloud SQL, Memorystore, Dataflow, Dataproc, Artifact Registry, Cloud Build, Cloud Run, Vertex AI, Pub/Sub, GCP APIs. Build ETL pipelines to ingest data from heterogeneous sources into our system Develop data processing pipelines using programming languages like Java and Python to extract, transform, and load (ETL) data Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets Deploy and manage databases, both SQL and NoSQL, such as Bigtable, Firestore, or Cloud SQL, based on project requirements Optimize data workflows for performance, reliability, and cost-effectiveness on the GCP infrastructure. Implement version control and CI/CD practices for data engineering workflows to ensure reliable and efficient deployments. Utilize GCP monitoring and logging tools to proactively identify and address performance bottlenecks and system failures Troubleshoot and resolve issues related to data processing, storage, and retrieval. Promptly address code quality issues using SonarQube, Checkmarx, Fossa, and Cycode throughout the development lifecycle Implement security measures and data governance policies to ensure the integrity and confidentiality of data Collaborate with stakeholders to gather and define data requirements, ensuring alignment with business objectives. Develop and maintain documentation for data engineering processes, ensuring knowledge transfer and ease of system maintenance. Participate in on-call rotations to address critical issues and ensure the reliability of data engineering systems. Provide mentorship and guidance to junior team members, fostering a collaborative and knowledge-sharing environment
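For the GCS-to-BigQuery ingestion this role describes, a minimal Python sketch using the google-cloud-bigquery client; the project, dataset, bucket, and schema settings are hypothetical:

```python
# Minimal GCP ingestion sketch: load CSV files from Cloud Storage into BigQuery.
# Assumes default application credentials; names below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

table_id = "example-project.analytics.orders"  # hypothetical target table
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,                      # infer schema from the file
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/landing/orders/*.csv", table_id, job_config=job_config
)
load_job.result()  # block until the load job completes

print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")
```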
Posted 3 weeks ago
1.0 - 3.0 years
4 - 8 Lacs
Thane
Work from Office
Contract Type: Regular If the chemistry is right, we can make a difference at LANXESS: speed up sports, make beverages last longer, add more color to leisure time and much more. As a leading specialty chemicals group, we develop and produce chemical intermediates, additives, specialty chemicals and high-tech plastics, with more than 13,000 employees. Be part of it! Job Highlights Development & Application Maintenance *Design, develop, and maintain scalable applications using Java and JavaScript frameworks. *Participate in both frontend and backend development, ensuring seamless integration and user experience. Python: *Hands-on experience in developing automation scripts, backend services, and data processing pipelines using Python. Familiar with libraries like Pandas, NumPy, and Flask, and adept at writing clean, modular, and testable code for a variety of use cases including web development and data-driven applications. SAP ABAP Development & Customization *Develop and maintain custom SAP solutions using ABAP. *Work closely with functional teams to design technical solutions based on business requirements in SAP modules. Requirements Bachelor of Engineering / Bachelor of Science 1 to 3 years Java, JavaScript, C, C++ Python ABAP What we offer you Compensation: We offer competitive compensation packages, inclusive of a global bonus program and an individual performance bonus program. Comprehensive Benefits: We provide a mixture of various benefits to support your financial security, health and wellbeing including retirement plans, health programs, life insurance and medical care. Work-Life & Flexibility: We support you in maintaining a balance between working hours and personal life. With our global Xwork program, we offer flexible working arrangements in all countries in which we operate. Training & Development: We are committed to your professional and personal development and encourage you in the ongoing pursuit of education, training and knowledge through both formal and informal learning. Diversity: For us, talent matters, we welcome everyone who commits to our values. We strongly believe that including diverse perspectives makes us more innovative and enhances our competitiveness. Therefore, we embrace the uniqueness of every single individual and are truly committed to supporting our people in developing their individual potential. Join the LANXESS team!
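As a rough illustration of the Python work this posting describes (automation scripts and backend services with Pandas and Flask), a minimal sketch; the CSV path, columns, and endpoint are hypothetical:

```python
# Minimal sketch of a Python backend service: a Flask endpoint serving an
# aggregate computed with pandas. Input file and column names are hypothetical.
import pandas as pd
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/api/summary")
def summary():
    df = pd.read_csv("data/measurements.csv")        # hypothetical input
    stats = df.groupby("product")["value"].agg(["mean", "count"])
    return jsonify(stats.reset_index().to_dict(orient="records"))

if __name__ == "__main__":
    app.run(port=5000, debug=True)
```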
Posted 3 weeks ago
10.0 - 12.0 years
20 - 25 Lacs
Pune
Work from Office
This team is responsible for Business Analytics at Seagate. About the role - you will: Be responsible for SAP BODS Support and Development projects. Main tasks include requirements analysis, conception, and implementation/development of solutions as per requirements. Work closely with different cross-functional teams to develop solutions related to BODS. Architect, develop & maintain BODS jobs. Design and develop complex dataflows and workflows. Be responsible for delivering from offshore on time, on schedule, within scope, adopting industry best practice and quality. About you: Excellent verbal and written communication and analytical skills. Well versed in working with the offshore/onsite model. Ability to articulate and clearly communicate complex problems and solutions in a simple, logical and impactful manner in a virtual collaboration environment. A good problem solver and team player. Your experience includes: Good development experience in the SAP BODS tool. Experience in design and development of ETL dataflows and jobs. Experience in data integration from SAP and non-SAP sources to SAP BW4HANA and Enterprise HANA using SAP Business Objects Data Services (BODS). Good experience with delta data processing concepts. Experience with all Data Services transformations such as Map Operation, Table Comparison, Row-Generation, History Preserving, Query and SQL transformation. Experience integrating non-SAP/cloud systems with SAP BW4HANA using Data Services. Experience in SQL/PLSQL. Good to have: BODS administration experience. Good to have: SAP BW knowledge and experience. Knowledge of SDI/SDA/Informatica will be a plus. Location: Our site in Pune is dynamic, both in our cutting-edge, innovative work, as well as our vibrant on-site food, and athletic and personal development opportunities for our employees. You can enjoy breakfast, lunch, or dinner from one of four cafeterias in the park. Take a break from your workday and participate in one of our many walkathons or compete against your colleagues in carrom, chess and table tennis. Learn about a technical topic outside your area of expertise at one of our monthly Technical Speaker Series, or attend one of the frequent on-site cultural festivals, celebrations, and community volunteer opportunities. Location: Pune, India Travel: None
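BODS transforms such as Table Comparison and History Preserving implement delta detection and history tracking inside the BODS designer itself. As a tool-neutral illustration of the underlying delta data processing concept only (a pandas sketch with made-up frames, not BODS):

```python
# Tool-neutral illustration of delta processing: compare a source extract
# against the current target to classify inserts and updates, similar in
# spirit to the BODS Table Comparison transform. Data is hypothetical.
import pandas as pd

source = pd.DataFrame({"id": [1, 2, 3], "amount": [100, 250, 300]})
target = pd.DataFrame({"id": [1, 2], "amount": [100, 200]})

merged = source.merge(target, on="id", how="left",
                      suffixes=("_src", "_tgt"), indicator=True)

inserts = merged[merged["_merge"] == "left_only"]                 # new keys
updates = merged[(merged["_merge"] == "both") &
                 (merged["amount_src"] != merged["amount_tgt"])]  # changed rows

print(f"{len(inserts)} insert(s), {len(updates)} update(s)")
```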
Posted 3 weeks ago
2.0 - 6.0 years
5 - 9 Lacs
Pune, Gurugram
Work from Office
We are seeking a highly skilled Development Lead with expertise in Generative AI and Large Language Models to join our dynamic team. As a Development Lead, you will play a key role in developing cutting-edge AI models and systems for our clients. Your primary focus will be on driving innovation and leveraging generative AI techniques to create impactful solutions. The ideal candidate will have a strong technical background and a passion for pushing the boundaries of AI technology. Job Description: Responsibilities: Develop and implement creative experiences, campaigns, apps, and digital products, leveraging generative AI technologies at their core. Successfully lead and deliver projects involving Cloud Gen-AI platforms and Cloud AI services, data pre-processing, Cloud AI PaaS solutions, LLMs, base foundation models, and fine-tuned models, working with a variety of different LLMs and LLM APIs. Conceptualize, design, build and develop experiences and solutions which demonstrate the minimum required functionality within tight timelines. Collaborate with creative technology leaders and cross-functional teams to test feasibility of new ideas, help refine and validate client requirements and translate them into working prototypes, and from there on to scalable Gen-AI solutions. Research and explore emerging trends and techniques in the field of generative AI and LLMs to stay at the forefront of innovation. Research and explore new products, platforms, and frameworks in the field of generative AI on an ongoing basis and stay on top of this very dynamic, evolving field. Design and optimize Gen-AI apps for efficient data processing and model leverage. Implement LLMOps processes, and the ability to manage Gen-AI apps and models across the lifecycle from prompt management to results evaluation. Evaluate and fine-tune models to ensure high performance and accuracy. Collaborate with engineers to develop and integrate AI solutions into existing systems. Stay up-to-date with the latest advancements in the field of Gen-AI and contribute to the company's technical knowledge base. Must-Have: Strong expertise in Python development and the Python dev ecosystem, including various frameworks/libraries for front-end and back-end Python dev, data processing, API integration, and AI/ML solution development. Minimum 2 years hands-on experience in working with Generative AI applications and solutions. Minimum 2 years hands-on experience in working with Large Language Models. Solid understanding of Transformer models and how they work. Reasonable understanding of Diffusion models and how they work. Hands-on experience building production solutions using a variety of different LLMs and models, including GPT-4o, Gemini, Claude, Llama, etc. Deep experience and expertise in Cloud Gen-AI platforms, services, and APIs, including Azure OpenAI, AWS Bedrock, and/or GCP Vertex AI. Solid hands-on experience working with Enterprise RAG technologies and solutions/frameworks, including LangChain, LlamaIndex, etc. Solid hands-on experience with developing end-to-end RAG pipelines (a minimal retrieval sketch appears at the end of this listing). Solid hands-on experience with agent-driven Gen-AI architectures and solutions, and working with AI Agents. Hands-on experience with Single-Agent and Multi-Agent Orchestration solutions. Solid hands-on experience with AI and LLM workflows. Experience with LLM model registries (Hugging Face), LLM APIs, embedding models, etc. Experience with vector databases (Azure AI Search, AWS Kendra, FAISS, Milvus, etc.).
Experience in data preprocessing and in post-processing / results evaluation of models. Hands-on experience with Diffusion models and AI Art models, including SDXL, DALL-E 3, Adobe Firefly, and Midjourney, is highly desirable. Hands-on experience with Image Processing and Creative Automation at scale, using AI models. Hands-on experience with image and media transformation and adaptation at scale, using AI Art and Diffusion models. Hands-on experience with dynamic creative use cases, using AI Art and Diffusion models. Hands-on experience with fine-tuning LLM models at scale. Hands-on experience with fine-tuning Diffusion models, and fine-tuning techniques such as LoRA for AI Art models as well. Hands-on experience with AI Speech models and services, including Text-to-Speech and Speech-to-Text. Ability to lead design and development teams for Full-Stack Gen-AI apps and products/solutions built on LLMs and Diffusion models. Ability to lead design and development for Creative Experiences and Campaigns built on LLMs and Diffusion models. Nice-to-Have: Good background and foundation in Machine Learning solutions and algorithms. Experience with designing, developing, and deploying production-grade machine learning solutions. Experience with Generative Adversarial Networks (GANs) and Variational Autoencoders (VAEs). Experience with custom ML model development and deployment. Proficiency in deep learning frameworks such as TensorFlow, PyTorch, or Keras. Strong knowledge of machine learning algorithms and their practical applications. Experience with Cloud ML platforms such as Azure ML Service, AWS SageMaker, and NVIDIA AI Foundry. Hands-on experience with Video Generation models. Hands-on experience with 3D Generation models. Location: DGS India - Pune - Kharadi EON Free Zone Brand: Dentsu Creative Time Type: Full time Contract Type: Permanent
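The RAG pipeline experience listed above follows a standard pattern: embed documents, index them in a vector store, retrieve the nearest chunks for a query, and assemble a grounded prompt for an LLM. A minimal sketch using FAISS directly; embed() is a placeholder stand-in, not a real embedding model:

```python
# Minimal RAG retrieval sketch with FAISS. embed() is a hypothetical stand-in
# for a real embedding model (e.g. a sentence-transformers or cloud API call).
import numpy as np
import faiss

def embed(texts):
    # Placeholder: returns deterministic (n, 384) float32 vectors.
    rng = np.random.default_rng(0)
    return rng.random((len(texts), 384), dtype=np.float32)

docs = ["Refunds are processed in 5 days.", "Support is open 9am-6pm IST."]
index = faiss.IndexFlatL2(384)   # exact L2 search over 384-dim vectors
index.add(embed(docs))

query = "How long do refunds take?"
_, ids = index.search(embed([query]), 1)   # retrieve the single nearest chunk
context = docs[ids[0][0]]

prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
print(prompt)  # this grounded prompt would then be sent to an LLM API
```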
Posted 3 weeks ago
2.0 - 4.0 years
4 - 8 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
An experienced Life Science graduate with 2-4 years of expertise in medical record reviewing and Quality Control. As a Senior Medical Summary Reviewer, you will be responsible for reviewing, summarizing, and ensuring the quality and accuracy of medical records. You will play a key role in quality control, providing oversight to ensure compliance with established standards. This role offers the opportunity to support quality assurance efforts while leveraging your experience in medical records review within a collaborative team environment. On-site work opportunity in our Chennai office. Responsibilities Review and summarize complex medical records with a high level of accuracy. Perform quality control checks to ensure thorough and accurate case evaluations. Provide feedback and guidance to junior team members to maintain high-quality output. Collaborate with cross-functional teams to ensure adherence to timelines. Ensure compliance with confidentiality and data protection standards. Qualifications Bachelor's degree in Life Sciences or a related field. 2-4 years of experience in medical records review and quality control. Strong attention to detail and familiarity with medical terminology. Proven ability to handle complex cases and provide constructive feedback. Our Cultural Values Entrepreneurs at heart, we are a customer-first team sharing one goal and one vision. We seek team members who are: Humble - No one is above another; we all work together to meet our clients' needs and we acknowledge our own weaknesses Hungry - We all are driven internally to be successful and to continually expand our contribution and impact Smart - We use emotional intelligence when working with one another and with clients Our culture shapes our actions, our products, and the relationships we forge with our customers. Who We Are KLDiscovery provides technology-enabled services and software to help law firms, corporations, government agencies and consumers solve complex data challenges. The company, with offices in 26 locations across 17 countries, is a global leader in delivering best-in-class eDiscovery, information governance and data recovery solutions to support the litigation, regulatory compliance, internal investigation and data recovery and management needs of our clients. Serving clients for over 30 years, KLDiscovery offers data collection and forensic investigation, early case assessment, electronic discovery and data processing, application software and data hosting for web-based document reviews, and managed document review services. In addition, through its global Ontrack Data Recovery business, KLDiscovery delivers world-class data recovery, email extraction and restoration, data destruction and tape management. KLDiscovery has been recognized as one of the fastest growing companies in North America by both Inc. Magazine (Inc. 5000) and Deloitte (Deloitte's Technology Fast 500). Additionally, KLDiscovery is an Orange-level Relativity Best in Service Partner, a Relativity Premium Hosting Partner and maintains ISO/IEC 27001 Certified data centers. KLDiscovery is an Equal Opportunity Employer. #LI-KV1 #LI-Onsite
Posted 3 weeks ago
2.0 - 6.0 years
14 - 15 Lacs
Bengaluru
Work from Office
It's fun to work in a company where people truly believe in what they're doing! At BlackLine, we're committed to bringing passion and customer focus to the business of enterprise applications. Since being founded in 2001, BlackLine has become a leading provider of cloud software that automates and controls the entire financial close process. Our vision is to modernize the finance and accounting function to enable greater operational effectiveness and agility, and we are committed to delivering innovative solutions and services to empower accounting and finance leaders around the world to achieve Modern Finance. Being a best-in-class SaaS Company, we understand that bringing in new ideas and innovative technology is mission critical. At BlackLine we are always working with new, cutting-edge technology that encourages our teams to learn something new and expand their creativity and technical skillset that will accelerate their careers. Work, Play and Grow at BlackLine! Make Your Mark: We are looking for a motivated and enthusiastic DataOps Engineer to join our growing data team. In this role, you will be instrumental in bridging the gap between data engineering, operations, and development, ensuring our data pipelines & data infrastructure are robust, reliable, and scalable. If you have a passion for automating data processes, streamlining deployments, and maintaining healthy data ecosystems, we encourage you to apply. You'll Get To: Develop and Maintain Data Pipelines: Assist in the design, development, and maintenance of scalable and efficient ETL (Extract, Transform, Load) processes to ingest, transform, and load data from various sources into our data warehouse. Orchestrate Workflows: Implement and manage data workflows using Apache Airflow, ensuring timely execution and monitoring of data jobs (a minimal DAG sketch appears at the end of this listing). Containerization and Orchestration: Utilize Docker and Kubernetes to containerize data applications and services, and manage their deployment and scaling in production environments. Cloud Infrastructure Management & Data Warehousing: Work with Google Cloud Platform (GCP) & Snowflake services to deploy, manage, and optimize data infrastructure components, including performance tuning and data governance. Scripting and Automation: Develop and maintain Python scripts for data processing, automation, and operational tasks. CI/CD Implementation: Contribute to the development and improvement of our CI/CD pipelines for data applications, ensuring efficient and reliable deployments. System Administration: Provide basic Unix/Linux administration support for data infrastructure, including scripting, monitoring, and troubleshooting. Monitoring and Alerting: Help implement and maintain monitoring solutions for data pipelines and infrastructure, responding to and resolving incidents. Collaboration: Work closely with Data Engineers, Data Scientists, and other stakeholders to understand data requirements and deliver reliable data solutions. Documentation: Contribute to the documentation of data pipelines, processes, and operational procedures. What You'll Bring: 2-6 years of professional experience in a DataOps, Data Engineering, or a similar role. Proficiency in Python & SQL for data scripting and automation. Familiarity with ETL concepts & tools like Airflow, dbt, and experience in building data pipelines. Hands-on experience with Docker & Kubernetes for containerization. Experience with Apache Airflow for workflow orchestration.
Working knowledge of at least one major cloud platform, preferably Google Cloud Platform (GCP). Basic Unix/Linux administration skills. Familiarity with CI/CD principles and tools. Strong problem-solving skills and a proactive approach to identifying and resolving issues. Excellent communication and collaboration skills. Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent practical experience). We're Even More Excited If You Have: Experience with other data orchestration tools. Knowledge of data governance and data quality best practices. Contributions to open-source projects. Experience in an agile development environment. Thrive at BlackLine Because You Are Joining: A technology-based company with a sense of adventure and a vision for the future. Every door at BlackLine is open. Just bring your brains, your problem-solving skills, and be part of a winning team at the world's most trusted name in Finance Automation! A culture that is kind, open, and accepting. It's a place where people can embrace what makes them unique, and the mix of cultural backgrounds and varying interests cultivates diverse thought and perspectives. A culture where BlackLiners' continued growth and learning is empowered. BlackLine offers a wide variety of professional development seminars and inclusive affinity groups to celebrate and support our diversity. BlackLine is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity or expression, race, ethnicity, age, religious creed, national origin, physical or mental disability, ancestry, color, marital status, sexual orientation, military or veteran status, status as a victim of domestic violence, sexual assault or stalking, medical condition, genetic information, or any other protected class or category recognized by applicable equal employment opportunity or other similar laws. BlackLine recognizes that the ways we work and the workplace itself have shifted. We innovate in a workplace that optimizes a combination of virtual and in-person interactions to maximize collaboration and nurture our culture. Candidates who live within a reasonable commute to one of our offices will work in the office at least 2 days a week.
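For the Airflow orchestration this role centers on, a minimal DAG sketch; task bodies, schedule, and targets are placeholders:

```python
# Minimal Airflow DAG sketch for a daily ETL workflow.
# Task bodies are placeholders; connections and schedule are hypothetical.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from source systems")

def transform():
    print("clean and aggregate into warehouse-ready tables")

def load():
    print("load curated tables into the data warehouse")

with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3   # extract, then transform, then load
```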
Posted 3 weeks ago
4.0 - 8.0 years
7 - 11 Lacs
Pune, Gurugram
Work from Office
We are seeking a highly skilled Development Lead with expertise in Generative AI and Large Language Models to join our dynamic team. As a Development Lead, you will play a key role in developing cutting-edge AI models and systems for our clients. Your primary focus will be on driving innovation and leveraging generative AI techniques to create impactful solutions. The ideal candidate will have a strong technical background and a passion for pushing the boundaries of AI technology. Job Description: Responsibilities: Develop and implement creative experiences, campaigns, apps, and digital products, leveraging generative AI technologies at their core. Successfully lead and deliver projects involving Cloud Gen-AI platforms and Cloud AI services, data pre-processing, Cloud AI PaaS solutions, LLMs, base foundation models, and fine-tuned models, working with a variety of different LLMs and LLM APIs. Conceptualize, design, build and develop experiences and solutions which demonstrate the minimum required functionality within tight timelines. Collaborate with creative technology leaders and cross-functional teams to test feasibility of new ideas, help refine and validate client requirements and translate them into working prototypes, and from there on to scalable Gen-AI solutions. Research and explore emerging trends and techniques in the field of generative AI and LLMs to stay at the forefront of innovation. Research and explore new products, platforms, and frameworks in the field of generative AI on an ongoing basis and stay on top of this very dynamic, evolving field. Design and optimize Gen-AI apps for efficient data processing and model leverage. Implement LLMOps processes, and the ability to manage Gen-AI apps and models across the lifecycle from prompt management to results evaluation. Evaluate and fine-tune models to ensure high performance and accuracy. Collaborate with engineers to develop and integrate AI solutions into existing systems. Stay up-to-date with the latest advancements in the field of Gen-AI and contribute to the company's technical knowledge base. Must-Have: Strong expertise in Python development and the Python dev ecosystem, including various frameworks/libraries for front-end and back-end Python dev, data processing, API integration, and AI/ML solution development. Minimum 2 years hands-on experience in working with Generative AI applications and solutions. Minimum 2 years hands-on experience in working with Large Language Models. Solid understanding of Transformer models and how they work. Reasonable understanding of Diffusion models and how they work. Hands-on experience building production solutions using a variety of different LLMs and models, including GPT-4o, Gemini, Claude, Llama, etc. Deep experience and expertise in Cloud Gen-AI platforms, services, and APIs, including Azure OpenAI, AWS Bedrock, and/or GCP Vertex AI. Solid hands-on experience working with Enterprise RAG technologies and solutions/frameworks, including LangChain, LlamaIndex, etc. Solid hands-on experience with developing end-to-end RAG pipelines. Solid hands-on experience with agent-driven Gen-AI architectures and solutions, and working with AI Agents. Hands-on experience with Single-Agent and Multi-Agent Orchestration solutions. Solid hands-on experience with AI and LLM workflows. Experience with LLM model registries (Hugging Face), LLM APIs, embedding models, etc. Experience with vector databases (Azure AI Search, AWS Kendra, FAISS, Milvus, etc.).
Experience in data preprocessing and in post-processing / results evaluation of models. Hands-on experience with Diffusion models and AI Art models, including SDXL, DALL-E 3, Adobe Firefly, and Midjourney, is highly desirable. Hands-on experience with Image Processing and Creative Automation at scale, using AI models. Hands-on experience with image and media transformation and adaptation at scale, using AI Art and Diffusion models. Hands-on experience with dynamic creative use cases, using AI Art and Diffusion models. Hands-on experience with fine-tuning LLM models at scale. Hands-on experience with fine-tuning Diffusion models, and fine-tuning techniques such as LoRA for AI Art models as well. Hands-on experience with AI Speech models and services, including Text-to-Speech and Speech-to-Text. Ability to lead design and development teams for Full-Stack Gen-AI apps and products/solutions built on LLMs and Diffusion models. Ability to lead design and development for Creative Experiences and Campaigns built on LLMs and Diffusion models. Nice-to-Have: Good background and foundation in Machine Learning solutions and algorithms. Experience with designing, developing, and deploying production-grade machine learning solutions. Experience with Generative Adversarial Networks (GANs) and Variational Autoencoders (VAEs). Experience with custom ML model development and deployment. Proficiency in deep learning frameworks such as TensorFlow, PyTorch, or Keras. Strong knowledge of machine learning algorithms and their practical applications. Experience with Cloud ML platforms such as Azure ML Service, AWS SageMaker, and NVIDIA AI Foundry. Hands-on experience with Video Generation models. Hands-on experience with 3D Generation models. Location: DGS India - Pune - Kharadi EON Free Zone Brand: Dentsu Creative Time Type: Full time Contract Type: Permanent
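This posting calls out LoRA fine-tuning explicitly. A minimal setup sketch using Hugging Face PEFT; the base model (gpt2) and target modules are illustrative stand-ins, not a mandated stack:

```python
# Minimal LoRA fine-tuning setup sketch using Hugging Face PEFT.
# The base model and target modules are illustrative choices only.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("gpt2")  # small stand-in model

lora_cfg = LoraConfig(
    r=8,                         # rank of the low-rank update matrices
    lora_alpha=16,               # scaling factor for the adapter
    lora_dropout=0.05,
    target_modules=["c_attn"],   # attention projection layers in GPT-2
    task_type="CAUSAL_LM",
)

model = get_peft_model(base, lora_cfg)
model.print_trainable_parameters()  # only the small adapter weights train
```

The appeal of LoRA here is that only the adapter matrices are trained and stored, so the same base model can serve many brand- or campaign-specific adaptations cheaply.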
Posted 3 weeks ago
11.0 - 16.0 years
16 - 20 Lacs
Noida
Work from Office
Embark on an exciting journey with us and open the door to limitless opportunities for career growth and personal development Greetings from Texala!!! Texala is a forward-thinking technology company, focused on delivering cutting-edge solutions across multiple industries. We are dedicated to innovation, scalability, and excellence, helping businesses achieve their full potential through robust digital transformations. As we continue to grow, we are seeking a Python Architect to join our dynamic team. If you are passionate about software architecture, design patterns, and want to work on exciting projects, this is the perfect opportunity for you! Roles & Responsibilities Architectural Leadership: Expertise in leading the architectural design and development of complex, web-based, multi-tenant ERP systems. Strong knowledge of best practices, design patterns, and architectural principles for building scalable, fault-tolerant, and highly available systems on cloud platforms such as AWS. Ability to define and implement comprehensive architectural frameworks that align with business objectives and technology strategies. Technical Proficiency: Expert-level proficiency in Python and related libraries/frameworks such as Flask, Django, and Celery. Strong experience in designing and developing RESTful APIs using Python web frameworks. In-depth understanding of microservices architecture and hands-on experience in building and deploying microservices. Proficiency with containerization technologies such as Docker and orchestration tools like Kubernetes. Solution Development: Proven ability to design and implement end-to-end solutions that meet business requirements and ensure system reliability and scalability. Experience in developing fault-tolerant systems and disaster recovery mechanisms to minimize downtime and ensure data integrity. Strong capability to optimize system performance and ensure efficient data processing and responsiveness. Leadership and Collaboration: Demonstrated leadership skills in providing technical guidance and direction to development teams. Ability to drive innovation, foster a culture of excellence, and promote continuous improvement in software engineering practices. Strong collaboration skills to work effectively with cross-functional teams, ensuring seamless integration and delivery of system components. Performance Optimization: Expertise in identifying and implementing performance optimization strategies for software applications. Ability to analyze system performance metrics and develop solutions to enhance system efficiency and responsiveness. Knowledge of monitoring tools and techniques to ensure continuous system performance and reliability. Monitoring and Maintenance: Strong experience in developing and implementing monitoring, logging, and alerting solutions for large-scale systems. Capability to ensure continuous improvement of application products and observability services. Proficiency in setting up and maintaining robust CI/CD pipelines to streamline development and deployment processes. Communication and Problem-Solving: Excellent verbal and written communication skills to effectively convey technical concepts and solutions to stakeholders. Strong analytical and problem-solving skills to address complex technical challenges and drive efficient solutions. Ability to adapt to changing business needs and technology landscapes, maintaining a proactive approach to learning and development.
Cloud and Database Management: Proficiency with cloud platforms such as AWS, including services like Lambda, Kinesis, SQS, and SNS. Strong understanding of RDBMS and NoSQL databases and experience in integrating multiple data sources into a unified system. Knowledge of telemetry services and metrics for monitoring CPU, memory, disk space, and network latency. Technical Skills Required Architectural Design: Leading the architectural design and development of complex software systems using Python technologies. Ensuring adherence to best practices, design patterns, and architectural principles. Building scalable, fault-tolerant, and highly available systems on AWS or other cloud platforms. Solution Development: Designing and implementing end-to-end solutions that meet business requirements. Implementing fault-tolerant systems and disaster recovery mechanisms to minimize downtime. Ensuring uninterrupted data access and performance optimization of the products and observability platform. Python Proficiency: Strong experience in Python and its libraries/frameworks such as Flask, FastAPI, Django, Celery, and more. Expert-level proficiency in designing and developing RESTful APIs using Python web frameworks (a minimal sketch appears at the end of this listing). Knowledge of microservices architecture and experience in building microservices using Python. Leadership and Collaboration: Providing technical leadership and guidance to development teams. Driving innovation and fostering a culture of excellence in software engineering. Continuous improvement of application products and observability services (logging, monitoring, and alerting). Performance Optimization: Identifying and implementing performance optimization strategies for software applications developed using Python technologies. Ensuring efficient data processing and system responsiveness. What We Offer: Bonuses based on individual performance. Top-of-the-line medical benefits and health insurance. Freedom for innovation and ideas. Five-star accommodation for bachelors. A personal development plan to help you (and us) grow. A fun and light work environment with serious responsibilities. We require pre-employment background and drug screening. Selection process: Profile Shortlisting, Online Aptitude/Technical Evaluation.
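The Flask/Celery stack above typically pairs a lightweight REST layer with asynchronous workers so slow jobs never block request handling. A minimal sketch of that shape; the broker URL, route, and task body are hypothetical:

```python
# Minimal sketch of a REST-plus-async-worker architecture: a Flask API that
# offloads heavy processing to a Celery task. Broker and task are hypothetical.
from celery import Celery
from flask import Flask, jsonify

celery_app = Celery("tasks", broker="redis://localhost:6379/0",
                    backend="redis://localhost:6379/0")

@celery_app.task
def process_report(tenant_id):
    # Placeholder for a long-running, multi-tenant data processing job.
    return {"tenant": tenant_id, "status": "done"}

api = Flask(__name__)

@api.route("/reports/<tenant_id>", methods=["POST"])
def start_report(tenant_id):
    job = process_report.delay(tenant_id)   # enqueue instead of blocking
    return jsonify({"job_id": job.id}), 202

if __name__ == "__main__":
    api.run(port=8000)
```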
Posted 3 weeks ago
4.0 - 5.0 years
6 - 10 Lacs
Vadodara
Work from Office
Internal Job Title: BI Development Analyst Business: Lucy Electric Manufacturing & Technologies India Pvt Ltd Location: Halol, Vadodara, Gujarat Job Reference No: 3917 Job Purpose To support the provision of key business information and insights by engaging in development activities based on Microsoft Power Platform, with Power BI as the core application Job Context Working closely with the Data & Analytics Development Lead and cross-functional teams to ensure a coordinated approach to Business Intelligence delivery The role involves providing information across multiple businesses for comparative and predictive analysis, highlighting opportunities for business process improvement Job Dimensions The role is a hybrid role, with flexible attendance at our office in Vadodara, India, to support business engagement There is an occasional need to visit other sites and business partners at their premises to build stakeholder relationships or to attend specific industry events, globally. Key Accountabilities These will include: Capturing requirements and preparing specifications for BI dataflows, datasets, semantic models, reports and dashboards Developing prioritised Business Intelligence outputs to agreed quality and security standards Assisting the Data & Analytics Development Lead with technical integration of data sources Conducting training and coaching sessions to support business users' understanding of data Collaborating with the wider business to promote appropriate use of data & analytics tools Maintaining operational and customer-facing documentation for support processes and defined project deliverables Improving analytics capabilities for Business Intelligence services in an evergreen ecosystem Troubleshooting production issues and coordinating with others to resolve incidents and complete tasks using IT Service Management tools, as part of a cross-functional team Qualifications, Experience & Skills A bachelor's degree (or equivalent professional qualifications and experience) in a relevant stream Effective communication skills in the global business language, English 4-5 years of experience in developing interactive Power BI dashboards with advanced knowledge of DAX functions, showcasing a strong portfolio of reports and dashboards Good understanding of SQL, ERP Systems (Dynamics 365), Dataverse, Azure Blob, Data Lake, Power Platform Capability to deconstruct existing business system reports and redesign them in Power BI or similar tools, and maintain documentation including the business logic and data dictionary Working knowledge of statistical methods including Exploratory Data Analysis to validate findings, ensure data accuracy and drive data-driven decision making Experience of design reviews against existing guidelines to propose enhancements, conceptualise and design the best-fit solution against requirements Proficiency in providing Business as Usual and Ad-Hoc support General understanding of a company's value chain and basic manufacturing industry terminology Good to Have Skills: Copilot and Q&A in Power BI, ETL using Data Pipeline tools, REST APIs, CI/CD on Azure DevOps, Data Governance tools, near-real-time and real-time data processing Behavioral Competencies Good interpersonal skills to enable process improvement through positive interaction with internal and external parties Keen problem solver with desire to share knowledge and support others, demonstrating active listening and empathy towards their views and concerns Customer-oriented, able to work flexibly in a changing
business landscape, striving for stakeholder satisfaction About Us: Lucy Group Ltd is the parent company of all Lucy Group companies. Since its origins in Oxford, UK, over 200 years ago, the Group has grown and diversified. The Group's businesses help to advance the transition to a carbon-free world with infrastructure that enables renewable energy, electric vehicles, smart city management and sustainable living. Today we employ in excess of 1,600 people worldwide, with operations in the UK, Saudi Arabia, UAE, India, South Africa, Brazil, Thailand, Malaysia and East Africa. Lucy Electric is an international leader in intelligent secondary power distribution products and solutions, with remote operation and monitoring. Linking energy generation to consumption, the business specialises in high-performance medium- and low-voltage switchgear for utility, industrial and commercial applications. Key products include Ring Main Units and package substations. Does this sound interesting? We would love to hear from you. Our application process is quick and easy. Apply today!
Posted 3 weeks ago
8.0 - 12.0 years
35 - 45 Lacs
Chennai
Work from Office
STAFF ENGINEER (Accounts Payable) Toast is a technology company that specializes in providing a comprehensive all-in-one SaaS product and financial technology solutions tailored for the restaurant industry. Toast offers a suite of tools to help restaurants manage their operations, including point of sale, payment processing, supplier management, digital ordering and delivery, marketing and loyalty, employee scheduling and team management. The platform is designed to streamline operations, enhance customer experiences, and improve overall efficiency for the restaurants. Are you bready* for a change? As a Staff Engineer on the Accounts Payable team you will be responsible for developing and maintaining back-end systems that support AP operations, automating processes, enhancing user interfaces, and integrating various systems. In this role, you will work on architecting, developing, and maintaining backend systems and services that support our business and technical goals. You will collaborate closely with product managers, frontend engineers, and other stakeholders to deliver high-quality, scalable, and reliable backend solutions. Join us to improve our platform and add the next generation of products. About this roll* (Responsibilities) As a Staff Engineer on our team, you will: Be part of a team working collaboratively with UX, PM, QA and other engineers designing, building and maintaining high-performance, flexible and highly scalable SaaS applications Lead technical initiatives, mentor junior engineers, and provide guidance on best practices for backend development. Champion design reviews and help drive the technical direction of the team. Develop automated workflows for invoice processing, payment approvals, and vendor management. Optimize query performance and ensure data integrity within large datasets. Implement machine learning or Optical Character Recognition (OCR) to streamline data extraction from invoices and minimize manual intervention (a minimal OCR sketch appears at the end of this listing). Lead, mentor and coach engineers on industry-standard development best practices Collaborate with other engineering teams to ensure that developed solutions are scalable, reliable, and secure. Use cutting-edge technologies and best practices to optimize for performance and usability, ultimately enhancing the overall restaurant management experience. Advocate best coding practices to raise the bar for you, your team and the company Be dedicated to building a high-quality, reliable, and high-performing framework for reporting, analytics, and insights on the Toast platform Document solution design, write & review code, test and roll out solutions to production. Work with PM in capturing & actioning customer feedback to iteratively enhance customer experience Propose and implement improvements to enhance system efficiency, scalability, and user experience. Present findings and insights to senior leadership and stakeholders. Be passionate about making users happy and seeing people use your product in the wild. Do you have the right ingredients*? (Requirements) 8+ years of hands-on experience delivering high-quality, reliable services / software development using C#, Java, Kotlin or other object-oriented languages Build and maintain RESTful APIs, GraphQL endpoints, or other integrations with internal and external services. Design, optimize, and maintain relational (SQL) and NoSQL databases (SQL Server, Postgres, DynamoDB). Work on data modeling, query optimization, and performance tuning.
Identify bottlenecks, optimize application performance, and scale backend systems to handle high traffic and large data volumes. Strong experience with automated testing (unit, integration, end-to-end tests) and test-driven development (TDD). Proficient with data warehousing solutions such as Snowflake, Redshift, or BigQuery. Experience working in a team with Agile/Scrum methodology Must have experience supporting and debugging large distributed applications. Experience in monitoring, troubleshooting, and improving system performance through logging and metrics Familiarity with data platforms for processing large datasets at scale is a plus Strong problem-solving skills, with the ability to identify, diagnose, and resolve complex technical issues. Excellent communication skills to work with both technical and non-technical stakeholders. Self-motivated, with a passion for learning and staying current with new technologies. A minimum of a bachelor's degree is required. This role follows a hybrid work model, requiring a minimum of two days per week in the office
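For the OCR-based invoice extraction mentioned above, a minimal sketch using pytesseract (requires a local Tesseract install); the field patterns and invoice layout are hypothetical, and a production system would use far more robust parsing or ML models:

```python
# Minimal sketch of OCR-based invoice field extraction: run OCR on an invoice
# image, then pull fields with simple regexes. Patterns are hypothetical.
import re
import pytesseract
from PIL import Image

text = pytesseract.image_to_string(Image.open("invoice.png"))

# Pull out a couple of common fields with illustrative patterns.
invoice_no = re.search(r"Invoice\s*#?\s*:?\s*(\w+)", text, re.IGNORECASE)
total = re.search(r"Total\s*:?\s*\$?([\d,]+\.\d{2})", text, re.IGNORECASE)

print("invoice:", invoice_no.group(1) if invoice_no else "not found")
print("total:", total.group(1) if total else "not found")
```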
Posted 3 weeks ago
1.0 - 9.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Career Category Information Systems Job Description ABOUT AMGEN Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting-edge of innovation, using technology and human genetic data to push beyond what's known today. ABOUT THE ROLE Role Description: As part of the cybersecurity organization, in this vital role you will be responsible for designing, building, and maintaining data infrastructure to support data-driven decision-making. This role involves working with large datasets, developing reports, executing data governance initiatives, and ensuring data is accessible, reliable, and efficiently managed. The role sits at the intersection of data infrastructure and business insight delivery, requiring the Data Engineer to design and build robust data pipelines while also translating data into meaningful visualizations for stakeholders across the organization. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture, ETL processes, and cybersecurity data frameworks. Roles & Responsibilities: Design, develop, and maintain data solutions for data generation, collection, and processing. Be a key team member that assists in design and development of the data pipeline. Build data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems (a minimal quality-gate sketch appears at the end of this listing). Develop and maintain interactive dashboards and reports using tools like Tableau, ensuring data accuracy and usability. Schedule and manage workflows to ensure pipelines run on schedule and are monitored for failures. Collaborate with multi-functional teams to understand data requirements and design solutions that meet business needs. Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency. Implement data security and privacy measures to protect sensitive data. Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions. Collaborate and communicate effectively with product teams. Collaborate with data scientists to develop pipelines that meet dynamic business needs. Share and discuss findings with team members practicing the SAFe Agile delivery model. What we expect of you We are all different, yet we all use our unique contributions to serve patients. The Data Engineer professional we seek is one with these qualifications. Basic Qualifications: Master's degree and 1 to 3 years of Computer Science, IT or related field experience OR Bachelor's degree and 3 to 5 years of Computer Science, IT or related field experience OR Diploma and 7 to 9 years of Computer Science, IT or related field experience Preferred Qualifications: Hands-on experience with data practices, technologies, and platforms, such as Databricks, Python, GitLab, LucidChart, etc. Hands-on experience with data visualization and dashboarding tools - Tableau, Power BI, or similar is a plus. Proficiency in data analysis tools (e.g.,
SQL) and experience with data sourcing tools. Excellent problem-solving skills and the ability to work with large, complex datasets. Understanding of data governance frameworks, tools, and best practices. Knowledge of and experience with data standards (FAIR) and protection regulations and compliance requirements (e.g., GDPR, CCPA). Good-to-Have Skills: Experience with ETL tools and various Python packages related to data processing and machine learning model development. Strong understanding of data modeling, data warehousing, and data integration concepts. Knowledge of Python/R, Databricks, and cloud data platforms. Experience working in a product teams environment. Experience working in an Agile environment. Professional Certifications: AWS Certified Data Engineer preferred. Databricks Certificate preferred. Soft Skills: Initiative to explore alternate technology and approaches to solving problems. Skilled in breaking down problems, documenting problem statements, and estimating efforts. Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to handle multiple priorities successfully. Team-oriented, with a focus on achieving team goals. EQUAL OPPORTUNITY STATEMENT Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
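For the "ensure data quality" responsibility above, a minimal pandas sketch of a pipeline quality gate; the column names, thresholds, and sample data are hypothetical:

```python
# Minimal sketch of a data-quality gate run before loading a pipeline batch.
# Column names, thresholds, and sample data are hypothetical.
import pandas as pd

def quality_checks(df: pd.DataFrame) -> list[str]:
    failures = []
    if df["record_id"].duplicated().any():
        failures.append("duplicate record_id values")
    if df["event_ts"].isna().mean() > 0.01:          # >1% missing timestamps
        failures.append("too many missing event_ts")
    if not df["severity"].isin(["low", "medium", "high"]).all():
        failures.append("unexpected severity labels")
    return failures

batch = pd.DataFrame({
    "record_id": [1, 2, 2],
    "event_ts": pd.to_datetime(["2024-01-01", None, "2024-01-03"]),
    "severity": ["low", "high", "critical"],
})
problems = quality_checks(batch)
print("quality gate:", problems or "passed")  # gate the load on this result
```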
Posted 3 weeks ago
1.0 - 10.0 years
16 - 18 Lacs
Chennai
Work from Office
Experience in SonarQube, CI/CD, Tekton, Terraform, GCS, GCP Looker, Google Cloud Build, Cloud Run, Vertex AI, Airflow, TensorFlow, etc. Experience in training, building and deploying ML and DL models. Experience in Hugging Face, Chainlit, React. Ability to understand technical, functional, non-functional, and security aspects of business requirements and deliver them end-to-end. Ability to adapt quickly to open-source products & tools to integrate with ML platforms. Building and deploying models (Scikit-learn, DataRobot, TensorFlow, PyTorch, etc.). Developing and deploying in on-prem & cloud environments: Kubernetes, Tekton, OpenShift, Terraform, Vertex AI. Experience in LLMs like PaLM, GPT-4, Mistral (open-source models). Work through the complete lifecycle of Gen AI model development, from training and testing to deployment and performance monitoring. Developing and maintaining AI pipelines with multiple modalities like text, image, audio, etc. Have implemented real-world chat bots or conversational agents at scale handling different data sources. Experience in developing image generation/translation tools using any of the latent diffusion models like Stable Diffusion or InstructPix2Pix. Expertise in handling large-scale structured and unstructured data. Efficiently handled large-scale generative AI datasets and outputs. Familiarity with the use of Docker tools and pipenv/conda/poetry environments. Comfort in following Python project management best practices (use of setup.py, logging, pytests, relative module imports, sphinx docs, etc.). Familiarity with use of GitHub (clone, fetch, pull/push, raising issues and PRs, etc.). High familiarity with the use of DL theory/practices in NLP applications. Comfort in coding with Hugging Face, LangChain, Chainlit, TensorFlow and/or PyTorch, Scikit-learn, NumPy and Pandas. Comfort in using two or more open-source NLP modules like spaCy, TorchText, fastai.text, farm-haystack, and others. Knowledge of fundamental text data processing (such as use of regex, token/word analysis, spelling correction/noise reduction in text, segmenting noisy unfamiliar sentences/phrases at the right places, deriving insights from clustering, etc.). Have implemented real-world BERT or other transformer fine-tuned models (sequence classification, NER or QA) from data preparation, model creation and inference through deployment (a minimal inference sketch appears at the end of this listing). Use of GCP services like BigQuery, Cloud Functions, Cloud Run, Cloud Build, Vertex AI. Good working knowledge of other open-source packages to benchmark and derive summaries. Experience in using GPU/CPU on cloud and on-prem infrastructures. Skillset to leverage cloud platforms for Data Engineering, Big Data and ML needs. Use of Docker (experience with experimental Docker features, docker-compose, etc.). Familiarity with orchestration tools such as Airflow and Kubeflow. Experience in CI/CD and infrastructure-as-code tools like Terraform, etc. Kubernetes or any other containerization tool, with experience in Helm, Argo Workflows, etc. Ability to develop APIs with compliant, ethical, secure and safe AI tools. Good UI skills to visualize and build better applications using Gradio, Dash, Streamlit, React, Django, etc. Deeper understanding of JavaScript, CSS, Angular, HTML, etc. is a plus. Education: Bachelor's or Master's Degree in Computer Science, Engineering, Maths or Science. Completion of modern NLP/LLM courses or participation in open competitions is also welcomed.
Design NLP/LLM/GenAI applications and products by following robust coding practices
Explore state-of-the-art (SoTA) models and techniques so that they can be applied to automotive industry use cases
Conduct ML experiments to train and infer models; where needed, build models that abide by memory and latency restrictions
Deploy REST APIs or a minimalistic UI for NLP applications using Docker and Kubernetes tools (see the sketch after this list)
Showcase NLP/LLM/GenAI applications to users in the best way possible through web frameworks (Dash, Plotly, Streamlit, etc.)
Converge multiple bots into super apps using LLMs with multiple modalities
Develop agentic workflows using AutoGen, Agent Builder, and LangGraph
Build modular AI/ML products that can be consumed at scale
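As a hedged illustration of the REST deployment item above, a minimal sketch of an NLP inference endpoint; FastAPI and the placeholder classify() function are assumed choices, not a prescribed stack, and a real service would load a fine-tuned model at startup rather than use a keyword rule.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class PredictRequest(BaseModel):
    text: str

def classify(text: str) -> str:
    # Placeholder for a real model call (e.g., a fine-tuned transformer
    # loaded once at startup); the rule below only stands in for it.
    return "positive" if "good" in text.lower() else "negative"

@app.post("/predict")
def predict(req: PredictRequest) -> dict:
    return {"label": classify(req.text)}

# Run locally with: uvicorn app:app --port 8000
# (assuming this file is app.py), then containerize with Docker.
```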
Posted 3 weeks ago
9.0 - 12.0 years
16 - 18 Lacs
Mumbai
Work from Office
Job Description:
Essential Job Functions:
Participate in data engineering tasks, including data processing and transformation
Assist in the development and maintenance of data pipelines and infrastructure
Collaborate with team members to support data collection and integration
Contribute to data quality and security efforts
Analyze data using data engineering tools and techniques
Collaborate with data engineers and analysts on data-related projects
Pursue opportunities to enhance data engineering skills and knowledge
Stay updated on data engineering trends and best practices
Basic Qualifications:
Bachelor's degree in a relevant field, or an equivalent combination of education and experience
Typically 4+ years of relevant industry experience, with a minimum of 1+ years in a similar role
Proven experience in data engineering
Proficiency in data engineering tools and technologies
A continuous learner who stays abreast of industry knowledge and technology
Other Qualifications (a plus):
An advanced degree in a relevant field
Relevant certifications, such as Certified Data Analyst or SAS Certified Big Data Professional
Recruitment fraud is a scheme in which fictitious job opportunities are offered to job seekers, typically through online services such as false websites or unsolicited emails claiming to be from the company. These emails may ask recipients to provide personal information or to make payments as part of the illegitimate recruiting process. DXC does not make offers of employment via social media networks, and DXC never asks for money or payments from applicants at any point in the recruitment process, nor does it ask job seekers to purchase IT or other equipment on its behalf. More information on employment scams is available here.
Posted 3 weeks ago
4.0 - 6.0 years
5 - 12 Lacs
Mumbai
Work from Office
Job Description of Subject Matter Expert (SME)
Job Title: SME
Reporting to: Team Manager/Team Leader, Operations
Objectives
The SME's objective is to actively assist a team in accordance with laid-down procedures to achieve and maintain the requisite standards of quality and productivity. He/she will report to the Team Leader/Team Manager, Operations, who will be the first point of contact for any issues, questions, or concerns.
Key Result Areas (KRAs)
Operations:
The SME needs to be an expert in the US Mortgage Loss Mitigation process (end to end), especially in Loan Document Intake, Trial Payment Plan Monitoring, Mod Fulfilment, and identifying all kinds of Loss Mitigation documents
4+ years' experience working with US Mortgage Loss Mitigation in a servicing/collections environment
1+ years of experience working with Fannie Mae, Freddie Mac, and Government guidelines required
The SME is responsible for maintaining constant end-to-end Loss Mitigation process knowledge in the team
Maintaining and documenting all process and investor updates received from the business area
Will join client calls as required and will be responsible for updating the teams alongside the Supervisors
Regularly cascading and providing training and refresher sessions on the latest updates received from the business and investors to all team members
Conduct regular knowledge checks with the teams
Actively involved in suggesting and driving process improvements
Should have multi-tasking skills as a Trainer/QC/Processor and conduct refresher trainings and quality sessions as per process requirements
Monitor and coach underperformers to improve their quality and efficiency
Be part of regular production and deliver the agreed numbers
Address and ensure resolution of all process-related queries from team members
Qualification:
Diploma/Graduate in any discipline
4-5 years of experience in a BPO, US Mortgage, or data processing background
Minimum of 2 years in the Sr. Loss Mitigation Process Expert role
Skill Sets
Experience in Loss Mitigation, foreclosure, bankruptcy, and the mortgage servicing life cycle
Knowledge of banking industry rules and regulations, and government regulations regarding Loss Mitigation
The ability to multitask and follow mortgage-servicing guidelines accurately is imperative
Well-versed in US regulatory and investor guidelines
Good interpersonal skills
Good written and verbal communication skills
Analytical skills and sound judgment
Ability to grasp and learn quickly
Ability to coach
Self-motivated
MS Office knowledge
Ability to plan and prioritize daily work
Flexibility to work in different shifts
US Mortgage certification is a value add
Posted 3 weeks ago
3.0 - 7.0 years
3 - 7 Lacs
Hyderabad / Secunderabad, Telangana, Telangana, India
On-site
We are looking for a highly motivated, expert Senior Data Engineer who can own the design and development of complex data pipelines, solutions, and frameworks. The ideal candidate will be responsible for designing, developing, and optimizing data pipelines, data integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. This role requires deep expertise in big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.
Roles & Responsibilities
Design, develop, and maintain scalable ETL/ELT pipelines to support structured, semi-structured, and unstructured data processing across the Enterprise Data Fabric (a minimal pipeline sketch follows below)
Implement real-time and batch data processing solutions, integrating data from multiple sources into a unified, governed data fabric architecture
Optimize big data processing frameworks using Apache Spark, Hadoop, or similar distributed computing technologies to ensure high availability and cost efficiency
Work with metadata management and data lineage tracking tools to enable enterprise-wide data discovery and governance
Ensure data security, compliance, and role-based access control (RBAC) across data environments
Optimize query performance, indexing strategies, partitioning, and caching for large-scale datasets
Develop CI/CD pipelines for automated data pipeline deployments, version control, and monitoring
Implement data virtualization techniques to provide seamless access to data across multiple storage systems
Collaborate with cross-functional teams, including data architects, business analysts, and DevOps teams, to align data engineering strategies with enterprise goals
Stay up to date with emerging data technologies and best practices, ensuring continuous improvement of Enterprise Data Fabric architectures
Functional Skills
Must-Have Skills
Hands-on experience with data engineering technologies such as Databricks, PySpark, Spark SQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies
Proficiency in workflow orchestration and performance tuning of big data processing
Strong understanding of AWS services
Experience with Data Fabric, Data Mesh, or similar enterprise-wide data architectures
Ability to quickly learn, adapt, and apply new technologies
Strong problem-solving and analytical skills
Excellent communication and teamwork skills
Experience with the Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices
Good-to-Have Skills
Deep expertise in the biotech and pharma industries
Experience writing APIs to make data available to consumers
Experience with SQL/NoSQL databases and vector databases for large language models
Experience with data modeling and performance tuning for both OLAP and OLTP databases
Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps
Education and Professional Certifications
9 to 12 years of Computer Science, IT, or related field experience
AWS Certified Data Engineer (preferred)
Databricks certification (preferred)
Scaled Agile SAFe certification (preferred)
Soft Skills
Excellent analytical and troubleshooting skills
Strong verbal and written communication skills
Ability to work effectively with global, virtual teams
High degree of initiative and self-motivation
Ability to manage multiple priorities successfully
Team-oriented, with a focus on achieving team goals
Ability to learn quickly, be organized, and detail-oriented
Strong presentation and public speaking skills
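For illustration, a minimal sketch of the kind of batch ELT step the responsibilities above describe: read raw data, apply basic quality rules, and write partitioned Parquet for efficient downstream queries. The paths, column names, and rules are assumptions, not part of any actual pipeline.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-elt").getOrCreate()

raw = spark.read.json("s3://raw-zone/orders/")       # hypothetical source path
clean = (
    raw.dropDuplicates(["order_id"])                  # basic data-quality rule
       .filter(F.col("amount") > 0)                   # reject invalid rows
       .withColumn("order_date", F.to_date("created_at"))
)
(
    clean.repartition("order_date")                   # align files to partitions
         .write.mode("overwrite")
         .partitionBy("order_date")                   # enables partition pruning on read
         .parquet("s3://curated-zone/orders/")        # hypothetical target path
)
```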
Posted 3 weeks ago
4.0 - 9.0 years
4 - 9 Lacs
Gurgaon / Gurugram, Haryana, India
On-site
In our Ordering team in the Sales Technology entity, we are looking for a Data Analyst for an IT application in an Ab Initio ETL environment. This application interfaces with 25 other applications to send data through 7 data processing chains in the sales, billing, and delivery domains. The batch processes are scheduled and run during the night. In this application context, you will participate in the Think, Build, and Run phases for data integration flows.
You will be responsible for these tasks:
Participate in study, development, and support phases for data processing solutions with Ab Initio
Analyze business requirements and contribute to designing customized solutions
Design, develop, and optimize ETL workflows using Ab Initio
Test ETL flows and ensure development quality
Document developed solutions and technical processes
Manage job scheduling with VTOM and optimize execution
Maintain and optimize existing ETL processes
Manage incident tickets; analyze and resolve technical issues related to data processing
Provide technical and functional support for deployed solutions
Required profile:
Hard skills:
Development skills in Ab Initio (GDE, Express>It, etc.)
Experience with VTOM for job scheduling and batch process management
Knowledge of Unix environments and shell scripting
Good testing practice (unit tests, integration tests, performance tests)
Incident management skills
Soft skills:
Excellent analytical and problem-solving skills
Skill in cross-team communication
Rigor, autonomy, and initiative
Strong organizational skills and the capacity to manage multiple requests simultaneously
Capacity to work quickly on urgent issues
Posted 3 weeks ago
10.0 - 20.0 years
20 - 30 Lacs
Gurgaon / Gurugram, Haryana, India
On-site
Job description
Should have extensive experience with technical architecture, configuration options, and customization capabilities, with about 10-12 years of development experience
Proven experience successfully implementing end-to-end IoT-based solutions in industries such as manufacturing, retail, and pharma
Experience implementing end-to-end IoT-based smart city solutions using the Garnet Framework
Proficiency in cloud platforms (AWS, Azure, Google Cloud)
Sound knowledge of microservices architecture and containerization (Docker, Kubernetes)
Strong development skills in languages (Python, Java, C#, C++, Node.js; JSON, XML, and binary data formats), data processing (Kafka, Spark, and Flink; a consumer sketch follows below), and network configuration
Good to have: proficiency in Big Data, with hands-on knowledge of Scala and R for data analysis and manipulation, and knowledge of NoSQL databases (e.g., MongoDB, Cassandra) and relational databases (e.g., MySQL, PostgreSQL) for data storage and retrieval
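As a rough illustration of the data-processing stack named above, a minimal sketch of consuming IoT telemetry from Kafka using the kafka-python client; the topic name, broker address, payload shape, and alert threshold are all assumptions for the example.

```python
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "sensor-readings",                           # hypothetical topic
    bootstrap_servers="localhost:9092",          # hypothetical broker
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    reading = message.value                      # e.g., {"device": "t-17", "temp_c": 41.2}
    if reading.get("temp_c", 0) > 40:            # simple threshold alert
        print(f"ALERT {reading['device']}: {reading['temp_c']} C")
```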
Posted 3 weeks ago
2.0 - 6.0 years
2 - 6 Lacs
Mumbai, Maharashtra, India
On-site
The purpose of this role is to build, maintain, improve, clean, and manipulate data in the business's operational and analytics databases. The Data Engineer works with the business's software engineers, data analytics teams, data scientists, data warehouse engineers, business intelligence experts, and data visualization specialists to understand and aid in the implementation of database requirements, enhance performance through data-driven analysis, build reporting and BI dashboards, and troubleshoot data issues.
Posted 3 weeks ago
1.0 - 4.0 years
10 - 14 Lacs
Pune
Work from Office
Overview
Design, develop, and maintain data pipelines and ETL/ELT processes using PySpark, Databricks, BigQuery, Airflow, and Composer. Optimize performance for large datasets through techniques such as partitioning, indexing, and Spark optimization. Collaborate with cross-functional teams to resolve technical issues and gather requirements.
Responsibilities
Ensure data quality and integrity through data validation and cleansing processes (a minimal validation sketch appears at the end of this posting)
Analyze existing SQL queries, functions, and stored procedures for performance improvements
Develop database routines such as procedures, functions, and views/materialized views
Participate in data migration projects and understand technologies such as Delta Lake, data warehouses, and BigQuery
Debug and solve complex problems in data pipelines and processes
Qualifications
Bachelor's degree in Computer Science, Engineering, or a related field
Strong understanding of distributed data processing platforms like Databricks and BigQuery
Proficiency in Python, PySpark, and SQL
Experience with performance optimization for large datasets
Strong debugging and problem-solving skills
Fundamental knowledge of cloud services, preferably Azure or GCP
Excellent communication and teamwork skills
Nice to have: experience in data migration projects; understanding of technologies such as Delta Lake and data warehouses
What we offer you
Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing
Flexible working arrangements, advanced technology, and collaborative workspaces
A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results
A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients
A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development
Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles
We actively nurture an environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum
At MSCI we are passionate about what we do, and we are inspired by our purpose: to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards, and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer.
It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com
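As referenced in the responsibilities above, a minimal sketch of a data-validation step: profile per-column null counts in PySpark and fail the load when a completeness threshold is breached. The input path and the 5% threshold are assumptions for the example, not part of the role's actual process.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
df = spark.read.parquet("/mnt/staging/events")            # hypothetical input

total = df.count()
# Count nulls per column: F.when() yields null where the condition is false,
# and F.count() skips nulls, so this counts exactly the null cells.
null_counts = df.select(
    [F.count(F.when(F.col(c).isNull(), c)).alias(c) for c in df.columns]
).first().asDict()

failed = {c: n for c, n in null_counts.items() if n / total > 0.05}
if failed:
    raise ValueError(f"Completeness check failed: {failed}")
```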
Posted 3 weeks ago
2.0 - 4.0 years
2 - 4 Lacs
Noida, Uttar Pradesh, India
On-site
Responsibilities
Software Development: Write clean, maintainable, and efficient code for various software applications and systems.
Design and Architecture: Participate in design reviews with peers and stakeholders.
Code Review: Review code developed by other developers, providing feedback and adhering to industry-standard best practices such as coding guidelines.
Testing: Build testable software, define tests, participate in the testing process, and automate tests using tools (e.g., JUnit, Selenium) and design patterns, leveraging the test automation pyramid as the guide (a minimal unit-test sketch follows below).
Debugging and Troubleshooting: Triage defects or customer-reported issues, and debug and resolve them in a timely, efficient manner.
Service Health and Quality: Contribute to the health and quality of services and incidents, promptly identifying and escalating issues. Collaborate with the team in using service health indicators and telemetry to drive action. Assist in conducting root cause analysis and implementing measures to prevent future recurrences.
DevOps Model: Understand working in a DevOps model. Begin to take ownership of working with product management on requirements to design, develop, test, deploy, and maintain the software in production.
Documentation: Properly document new features, enhancements, and fixes to the product, and contribute to training materials.
Qualifications
Basic Qualifications:
Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent practical experience
2 to 4 years of professional software development experience
Proficiency in one or more programming languages such as Ruby on Rails, ReactJS, Python, Java, or JavaScript
Experience with software development practices and design patterns
Familiarity with version control systems like Git/GitHub and bug/work tracking systems like JIRA
Basic understanding of cloud technologies and DevOps principles
Strong analytical and problem-solving skills, with a proven track record of building and shipping successful software products and services
Preferred Qualifications:
More than 2 years of relevant experience
Experience with cloud platforms like AWS, Azure, or GCP
Experience with test automation frameworks and tools
Exposure to database techniques and tools such as data modeling, MySQL, and SQL
Conversant with platforms, tools, and frameworks used in application development
Exposure to Agile/Scrum methodology and TDD (Test-Driven Development)
Knowledge of agile development methodologies
Commitment to continuous learning and professional development
Good communication and interpersonal skills, with the ability to work effectively in a collaborative team environment
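For illustration, a minimal sketch of the unit-test base of the test automation pyramid mentioned above, using pytest; the function under test is invented for the example.

```python
import pytest

def parse_price(raw: str) -> float:
    """Function under test: parse a user-supplied price string."""
    value = float(raw.strip().lstrip("$"))
    if value < 0:
        raise ValueError("price cannot be negative")
    return value

def test_parses_plain_and_dollar_values():
    # Happy-path cases, including surrounding whitespace and a "$" prefix.
    assert parse_price("12.50") == 12.50
    assert parse_price(" $3 ") == 3.0

def test_rejects_negative_prices():
    # Error-path case: invalid input raises rather than returning silently.
    with pytest.raises(ValueError):
        parse_price("-1")
```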
Posted 3 weeks ago
3.0 - 5.0 years
3 - 5 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Develop and maintain data tables using GCP, SQL, and Snowflake; design and optimize complex SQL queries for ETL processes ensuring data accuracy and integrity. Create and manage interactive data visualizations and dashboards using Tableau, Power BI, or Looker Studio aligned with business needs. Analyze data trends to generate actionable insights and comprehensive reports supporting strategic business decisions. Handle ad-hoc data requests efficiently by delivering timely, accurate, and scalable data solutions. Collaborate with stakeholders to understand business challenges and translate open-ended questions into analytical tasks and solutions. Design wireframes and mockups for data visualization projects, ensuring user-friendly and effective communication of data. Communicate findings clearly to both technical and non-technical audiences using presentations and storytelling techniques. Perform data manipulation and statistical analysis using Python libraries such as Pandas, NumPy, and SciPy. Implement basic machine learning models with Python (scikit-learn, TensorFlow) to enhance data analysis and interpret results. Automate data workflows and processes using Python scripts to improve efficiency and reliability, with ongoing process improvements.
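For illustration, a minimal sketch (on synthetic data) of the analysis loop this posting describes: manipulate data with Pandas/NumPy, then fit a basic scikit-learn model to quantify a trend. The column names and data are invented for the example.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

# Synthetic weekly order counts with a built-in upward trend plus noise.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "week": np.arange(52),
    "orders": 100 + 3 * np.arange(52) + rng.normal(0, 10, 52),
})

# Fit a simple linear model and attach the fitted trend for reporting.
model = LinearRegression().fit(df[["week"]], df["orders"])
df["trend"] = model.predict(df[["week"]])
print(f"weekly growth ~ {model.coef_[0]:.1f} orders")
```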
Posted 3 weeks ago
3.0 - 5.0 years
3 - 5 Lacs
Mumbai, Maharashtra, India
On-site
Develop and maintain data tables using GCP, SQL, and Snowflake; design and optimize complex SQL queries for ETL processes ensuring data accuracy and integrity. Create and manage interactive data visualizations and dashboards using Tableau, Power BI, or Looker Studio aligned with business needs. Analyze data trends to generate actionable insights and comprehensive reports supporting strategic business decisions. Handle ad-hoc data requests efficiently by delivering timely, accurate, and scalable data solutions. Collaborate with stakeholders to understand business challenges and translate open-ended questions into analytical tasks and solutions. Design wireframes and mockups for data visualization projects, ensuring user-friendly and effective communication of data. Communicate findings clearly to both technical and non-technical audiences using presentations and storytelling techniques. Perform data manipulation and statistical analysis using Python libraries such as Pandas, NumPy, and SciPy. Implement basic machine learning models with Python (scikit-learn, TensorFlow) to enhance data analysis and interpret results. Automate data workflows and processes using Python scripts to improve efficiency and reliability, with ongoing process improvements.
Posted 3 weeks ago
6.0 - 11.0 years
6 - 11 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
The role sits within the centralised FP&A function in the UK and will be responsible for all aspects of the centralised FP&A activities of the Media Practice Area. This includes the standardisation, automation, and improvement of the models and processes that underpin the FP&A objectives to provide further efficiencies. In terms of specific accountability, the role, leading the UK Media Bangalore offshore team, will be responsible for all forecasts, budgets, and monthly analysis, providing insights and analysis covering key focus areas such as revenue, margin, costs, and client profitability. The role will need to develop strong relationships with Commercial Finance Leads, Client Leads, and Operational Finance to maximise profitability and reduce business risk. The responsibilities include ensuring forecasts and financial analysis are accurate, timely, forward-looking, aligned to business objectives, and deliver high-quality actionable insights to Market, Practice Area, and Brand/Channel teams. The role holder will be responsible for adapting all aspects of FP&A to reflect any changes in the Practice Area or the wider business environment and will be seen as the go-to expert for all Media Practice Area FP&A matters. The role will report into the Bangalore lead for the planning CoE and will support continuous improvement, automation, and transformation initiatives for the Media FP&A function within UK FP&A.
Job Description:
Planning (Budgeting & Forecasting)
Responsible for the accuracy, completeness, and timely submission of forecasts and budgets for the Practice Area, together with associated insight
Ownership of all models and processes used in the preparation, review, and analysis of Practice Area forecasts (at Practice Area/Brand/Channel level), including identification of risks and remedial actions as appropriate
Ownership of the budgeting and forecasting process for the Practice Area, including establishing timetables to meet the wider UK timetable and incorporating and coordinating relevant inputs from Commercial Finance and other stakeholders
Ownership of building and rolling forward Practice Area forecast and budget models, including improving and building integrations with source data systems such as D365, Salesforce, Workday, and other service-line-specific systems
Build strong relationships with Commercial Finance and the business to ensure the timely delivery of forecasts that accurately reflect the business outlook, including facilitating key meetings with Commercial Finance and the business to understand the strategic direction, goals, and performance of the Practice Area and ensure these are reflected in the budget and forecasts
Liaise with Commercial Finance and Client Accounting to incorporate contractual changes (where applicable) and any foreseen risks and opportunities into the forecasts
Participate in Practice Area-level target-setting with Commercial Finance, with final sign-off by Commercial Finance
Ensure timely and accurate budget and forecast submission to SAC
Partner with Commercial Finance to prepare content and analysis for presentations
Support Commercial Finance in building out and delivering multi-year strategic plans
Reporting & Analysis (inc. Month End)
Deliver best-in-class financial information and analysis to both the Director of FP&A and Commercial Finance to facilitate more informed, data-driven decisions
Work with Financial Control to identify and remedy any gaps in accruals, determine monthly provisioning, and propose re-allocation journals
Work with Client Accounting and Assurance to ensure client reporting requirements are met
Deliver timely and accurate actualisation of forecasts at month end using data from source systems
Prepare month-end reporting and analysis for review with Commercial Finance
Work collaboratively with the Commercial Finance team in preparing presentation decks
Ownership of current client revenue models and reports, providing insights and comparisons to Commercial Finance and Leadership
Manage the ongoing development and maintenance of the relevant data sources to provide accurate insights into client performance
Regularly deliver ad-hoc analysis to Commercial Finance and the wider FP&A teams to support continual improvement of profitability, working capital, and cash conversion analysis across the business
Process Efficiencies
Underpinning all activities is a desire to improve current processes, with tangible progress made across simplification, standardisation, and automation, leveraging technology/AI where appropriate
Experience and Qualifications
Qualified accountant (ACA/ACCA/CIMA or equivalent) with extensive experience in a similar finance role
Some industry experience in financial planning and analysis preferable
Experience of using business intelligence tools is helpful
Skills
A forecasting and problem-solving mindset
Advanced Excel and modelling skills, with demonstrable experience of building and improving systems and processes
Negotiation, influence, and financial acumen
Proven ability to work well in a fast-paced environment, manage and prioritise multiple conflicting deadlines under pressure, and navigate effectively amidst ambiguity and change
Proven leadership and team management skills
Excellent communication and interpersonal skills across a wide range of stakeholders, with relationship-building capabilities and the ability to influence collaborative outcomes
A drive for continuous improvement and performance excellence in their area of responsibility
Posted 3 weeks ago
2.0 - 4.0 years
2 - 4 Lacs
Mumbai, Maharashtra, India
On-site
This individual-contributor role requires 2-4 years of experience in the field of market research. The candidate needs experience in all aspects of data collection and field management, including sample and quota design, deployment plans, fieldwork monitoring, and managing field-related issues, and must meet clients' field objectives. The candidate should possess proven abilities in a market research data processing tool, e.g. Quantum or Dimensions, and should be able to debug and solve problems during execution. They should be a self-learner who thinks logically, which helps speed project execution and achieve the desired performance levels. This role would suit a motivated professional who enjoys fostering relationships, has a problem-solving attitude, and works well as part of a team, but also embraces responsibility for their work as an individual.
Job Description:
Key Responsibilities:
Will have responsibility for managing simple surveys, and medium- to high-complexity surveys with minimal help and support
Should be able to understand different data processing requirements, e.g. SPSS, coding, weighting, etc.
Will take ownership of the assigned project(s) under limited guidance from the supervisor
Keep clients and the supervisor in the loop and involve them whenever there is a change in the project specs
Escalate any outstanding issue to the supervisor as soon as it is identified
Ensure process documents are updated from time to time
Follow all data processing and client standards across all projects
Contribute to team meetings by being prepared and sharing ideas
Other Responsibilities
Attend training at regular intervals to stay up to speed on execution, adhering to the standards, processes, and procedures involved in execution
Reconcile and manage all aspects of programmatic platform updates
Should be able to multitask
Should be able to communicate well within the team on problem solving, scheduling, planning, etc.
Posted 3 weeks ago