0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About the Role:
We are seeking a highly experienced Voice AI/ML Engineer to lead the design and deployment of real-time voice intelligence systems. This role focuses on ASR, TTS, speaker diarization, wake word detection, and building production-grade modular audio processing pipelines to power next-generation contact center solutions, intelligent voice agents, and telecom-grade audio systems. You will work at the intersection of deep learning, streaming infrastructure, and speech/NLP technology, creating scalable, low-latency systems across diverse audio formats and real-world applications.

Key Responsibilities:

Voice & Audio Intelligence:
- Build, fine-tune, and deploy ASR models (e.g., Whisper, wav2vec 2.0, Conformer) for real-time transcription (an illustrative sketch follows this posting).
- Develop and fine-tune high-quality TTS systems using VITS, Tacotron, or FastSpeech for lifelike voice generation and cloning.
- Implement speaker diarization for segmenting and identifying speakers in multi-party conversations using embeddings (x-vectors/d-vectors) and clustering (AHC, VBx, spectral clustering).
- Design robust wake word detection models with ultra-low latency and high accuracy in noisy conditions.

Real-Time Audio Streaming & Voice Agent Infrastructure:
- Architect bi-directional real-time audio streaming pipelines using WebSocket, gRPC, Twilio Media Streams, or WebRTC.
- Integrate voice AI models into live voice agent solutions, IVR automation, and AI contact center platforms.
- Optimize for latency, concurrency, and continuous audio streaming with context buffering and voice activity detection (VAD).
- Build scalable microservices to process, decode, encode, and stream audio across common codecs (e.g., PCM, Opus, μ-law, AAC, MP3) and containers (e.g., WAV, MP4).

Deep Learning & NLP Architecture:
- Utilize transformers, encoder-decoder models, GANs, VAEs, and diffusion models for speech and language tasks.
- Implement end-to-end pipelines including text normalization, G2P mapping, NLP intent extraction, and emotion/prosody control.
- Fine-tune pre-trained language models for integration with voice-based user interfaces.

Modular System Development:
- Build reusable, plug-and-play modules for ASR, TTS, diarization, codecs, streaming inference, and data augmentation.
- Design APIs and interfaces for orchestrating voice tasks across multi-stage pipelines with format conversions and buffering.
- Develop performance benchmarks and optimize for CPU/GPU utilization, memory footprint, and real-time constraints.

Engineering & Deployment:
- Write robust, modular, and efficient Python code.
- Work with Docker, Kubernetes, and cloud deployment (AWS, Azure, GCP).
- Optimize models for real-time inference using ONNX, TorchScript, and CUDA, including quantization, context-aware inference, and model caching.
- Deploy voice models on-device.

Why join us?
- Impactful Work: Play a pivotal role in safeguarding Tanla's assets, data, and reputation in the industry.
- Tremendous Growth Opportunities: Be part of a rapidly growing company in the telecom and CPaaS space, with opportunities for professional development.
- Innovative Environment: Work alongside a world-class team in a challenging and fun environment, where innovation is celebrated.

Tanla is an equal opportunity employer. We champion diversity and are committed to creating an inclusive environment for all employees. www.tanla.com
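For illustration, a minimal Python sketch of the transcription pattern named above: offline Whisper inference behind a naive energy-based VAD gate. The openai-whisper package, the file name, and the energy threshold are assumptions, not a description of Tanla's stack.

```python
# Minimal sketch: transcription with openai-whisper, preceded by a crude
# energy-based voice-activity gate. Assumes `pip install openai-whisper`
# and a local 16 kHz-compatible file named call.wav (placeholder).
import numpy as np
import whisper

model = whisper.load_model("base")  # small multilingual checkpoint


def is_speech(pcm: np.ndarray, threshold: float = 0.01) -> bool:
    """Crude VAD: RMS energy of a mono float32 frame vs. a fixed threshold."""
    return float(np.sqrt(np.mean(pcm ** 2))) > threshold


audio = whisper.load_audio("call.wav")  # resampled to 16 kHz mono float32
frames = np.array_split(audio, max(1, len(audio) // 16000))  # ~1 s frames
voiced_frames = [f for f in frames if is_speech(f)]
voiced = np.concatenate(voiced_frames) if voiced_frames else audio

result = model.transcribe(voiced, fp16=False)
print(result["text"])
```

A production pipeline would replace the RMS gate with a trained VAD and stream chunks over WebSocket or gRPC rather than reading a file, but the gate-then-transcribe shape is the same.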
Posted 1 week ago
0 years
6 - 8 Lacs
Hyderābād
On-site
Role Responsibilities:
You will partner with business and IT operations stakeholders to define, prioritize, and execute ServiceNow CMDB operational needs and enhancements to successfully deliver CMDB capabilities, including CMDB administration and Discovery. Additional responsibilities include:
- Build and develop ServiceNow Discovery and Service Mapping patterns to support business and IT.
- Administer operational support of ServiceNow ITOM to ensure it is run and maintained in keeping with industry standards and best practices and meets business and IT needs.
- Establish, review, and optimize new and existing processes to improve customer experience and the overall delivery of business outcomes.
- Forge strong, collaborative relationships and build consensus among competing stakeholders across business and IT peers, leveraging exceptional communication and interpersonal skills.

Required Education, Qualifications, and Experience:
- Expertise and experience with the ServiceNow CMDB product is required, including ServiceNow Discovery experience.
- Broad general knowledge of IT infrastructure topology, including typical application, server, and networking configurations and how they can be documented using a common data model.
- Good knowledge of IT Service Management practices.

Job Description:
- Experience with various ServiceNow modules, including Normalization Data Services, ITOM Discovery, Configuration Management Lifecycle, Common Services Data Model, and Service Mapping.
- Understanding of ServiceNow CMDB, the Identification and Reconciliation Engine, and workflow capabilities.
- Familiarity with platform security and integration best practices.
- Create, maintain, and enhance integrations between ServiceNow and other systems (an illustrative REST sketch follows this posting).
- Experience in implementing Discovery and Service Mapping solutions.
- Ability to perform assessments and provide standard processes for Discovery and Service Mapping.
- Understanding of Discovery scheduling and potential network impacts.
- Experience with Configuration Item blueprints and CMDB Health dashboard configuration, including remediation of duplicate and stale CI items.
- Expert understanding of import sets, transform maps, and ServiceNow data mapping.
- Knowledge of the ServiceNow CMDB class hierarchy and its relation to Asset and Configuration Management.
- Fundamental knowledge of the Common Services Data Model (CSDM).
- Experience with additional product lines on the ServiceNow platform, such as ITSM, ITBM, and/or ITAM.

Qualifications, Experience & Education:
- Experience with the ServiceNow CMDB and Discovery.
- Detailed understanding of, and experience with, ITIL processes.
- Excellent oral, written, presentation, and communication skills.
- Ability to effectively prioritize and execute in a high-pressure environment.
- Ability to independently set priorities and meet deadlines in a fast-paced environment; a self-starter.

Our purpose
Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively.
It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits to help you thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 307566
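For illustration only, not drawn from the posting: a minimal Python sketch of the kind of ServiceNow integration this role maintains, reading computer CIs through the platform's standard Table API. The instance URL and credentials are placeholders.

```python
# Hedged sketch: pull a few computer CIs from the ServiceNow CMDB via the
# Table API. Instance, user, and password are hypothetical placeholders.
import requests

INSTANCE = "https://example.service-now.com"  # placeholder instance URL

resp = requests.get(
    f"{INSTANCE}/api/now/table/cmdb_ci_computer",
    params={
        "sysparm_limit": 10,
        "sysparm_fields": "name,sys_class_name,install_status",
    },
    auth=("api_user", "api_password"),  # placeholder credentials
    headers={"Accept": "application/json"},
    timeout=30,
)
resp.raise_for_status()
for ci in resp.json()["result"]:
    print(ci["name"], ci["sys_class_name"], ci["install_status"])
```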
Posted 1 week ago
5.0 - 8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
As a MongoDB Data Engineer, you will be a key contributor in architecting, modelling, and developing data solutions using MongoDB to support our document and metadata workflows. You will collaborate closely with cross-functional teams to deliver scalable, performant, and secure data platforms, with exposure to Azure cloud infrastructure. You will play a central role in modelling document and transactional data, building aggregation and reporting pipelines, and ensuring best practices in database performance and reliability, including deploying, configuring, and tuning self-hosted MongoDB environments. You will work in a start-up-like environment but with the scale and mission of a global business behind you.

The Role:
- Design, develop, and optimize MongoDB data models for various business and analytics use cases.
- Implement and maintain efficient MongoDB CRUD operations, indexes, and schema evolution strategies.
- Manage self-hosted MongoDB deployments, including installation, configuration, scaling, backup/restore, and monitoring.
- Build and maintain reporting and analytics pipelines using the MongoDB reporting suite.
- Develop, monitor, and tune MongoDB deployments (both self-hosted and cloud-managed) for scalability, reliability, and security.
- Collaborate with engineering and product teams to translate requirements into MongoDB-backed solutions.
- Support integration with Azure cloud services (e.g., Azure Cosmos DB for MongoDB, Azure Functions, Blob Storage).
- Maintain documentation and contribute to database standards and best practices.
- (Nice to have) Support data ingestion and automation tasks using Python.

Qualifications:
- Bachelor’s or master’s degree in computer science, engineering, or a related quantitative discipline.

Experience:
- 5 to 8 years of hands-on experience in data engineering or backend development with MongoDB.
- Demonstrated experience with self-hosted MongoDB, including cluster setup, maintenance, and troubleshooting.

Technical Competencies:
- Deep hands-on experience with MongoDB data modelling, schema design, and normalization/denormalization strategies.
- Strong proficiency in MongoDB development: aggregation pipelines, CRUD, performance tuning, and index management (a short aggregation sketch follows this posting).
- Experience building reporting and analytics using the MongoDB reporting suite.
- Experience with self-hosted MongoDB deployments (e.g., sharding, replication, monitoring, security configuration).
- Working knowledge of Azure cloud services (Azure Cosmos DB, VMs, App Service, networking for secure deployments).
- (Nice to have) Experience in Python for backend integration, data processing, or scripting.
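As a hedged illustration of the aggregation-pipeline work described above (not part of the posting), here is a short PyMongo sketch; the connection string, database, collection, and field names are assumptions.

```python
# Illustrative sketch: a reporting-style aggregation with PyMongo,
# assuming a local MongoDB and a hypothetical `docs.orders` collection.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # placeholder deployment
orders = client["docs"]["orders"]

pipeline = [
    {"$match": {"status": "complete"}},                               # filter
    {"$group": {"_id": "$customer_id", "total": {"$sum": "$amount"}}},# roll up
    {"$sort": {"total": -1}},                                         # rank
    {"$limit": 10},                                                   # top 10
]
for row in orders.aggregate(pipeline):
    print(row["_id"], row["total"])

# A compound index supporting the $match stage above:
orders.create_index([("status", 1), ("amount", -1)])
```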
Posted 1 week ago
3.0 years
6 - 10 Lacs
Gurgaon
On-site
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Positions in this function are responsible for the management and manipulation of mostly structured data, with a focus on building business intelligence tools, conducting analysis to distinguish patterns and recognize trends, performing normalization operations, and assuring data quality. Depending on the specific role and business line, example responsibilities in this function could include creating specifications to bring data into a common structure, creating product specifications and models, developing data solutions to support analyses, performing analysis, interpreting results, developing actionable insights, and presenting recommendations for use across the company. Roles in this function could partner with stakeholders to understand data requirements and develop tools and models such as segmentation, dashboards, data visualizations, decision aids, and business case analysis to support the organization. Other roles could include producing and managing the delivery of activity and value analytics to external stakeholders and clients. Team members will typically use business intelligence, data visualization, query, analytic, and statistical software to build solutions, perform analysis, and interpret data. Positions in this function work on predominantly descriptive and regression-based analytics and tend to leverage subject matter expert views in the design of their analytics and algorithms. This function is not intended for employees performing the following work: production of standard or self-service operational reporting, causal-inference-led (healthcare analytics) or data pattern recognition (data science) analysis, and/or image or unstructured data analysis using sophisticated theoretical frameworks.

Primary Responsibilities:
- Analyze data and extract actionable insights, findings, and recommendations
- Develop data validation strategies to ensure accuracy and reliability of data (a small validation sketch follows this posting)
- Communicate data and findings effectively to internal and external senior executives with clear supporting evidence
- Relate analysis to the organization's overall business objectives
- Implement generative AI techniques to reduce manual efforts and automate processes
- Analyze and investigate issues; provide explanations and interpretations within your area of expertise
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment).
The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- Undergraduate degree or equivalent experience
- Experience in creating summarized reports of findings and recommendations
- Proficiency in SQL, Snowflake, and AI techniques
- Solid skills in Microsoft Excel, Word, PowerPoint, and Visio
- Ability to multitask, take initiative, and adapt to changing priorities
- Proven self-motivated team player with solid problem-solving and analytical skills

Preferred Qualifications:
- 3+ years of work experience
- 2+ years of experience working with a healthcare consulting firm
- Experience in data analytics and hands-on experience in Python programming, SQL, and Snowflake
- Proven creative, strategic thinker with excellent critical thinking skills

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
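By way of illustration (not part of the posting), a minimal sketch of the "data validation strategies" responsibility above, using pandas; the file and column names are hypothetical rather than an actual Optum schema.

```python
# Hedged sketch: basic quality checks on a structured extract with pandas.
# File name and columns (member_id, claim_id, ...) are illustrative only.
import pandas as pd

df = pd.read_csv("claims_extract.csv")  # placeholder source file

checks = {
    "no_null_member_id": df["member_id"].notna().all(),
    "unique_claim_id": df["claim_id"].is_unique,
    "amount_non_negative": (df["paid_amount"] >= 0).all(),
    "valid_service_date": pd.to_datetime(
        df["service_date"], errors="coerce"
    ).notna().all(),
}
failed = [name for name, ok in checks.items() if not ok]
if failed:
    raise ValueError(f"Data quality checks failed: {failed}")
print("All validation checks passed.")
```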
Posted 1 week ago
3.0 - 6.0 years
0 Lacs
Noida
On-site
AI/ML Engineer – Multilingual Voice AI Systems
Location: Noida (on-site)
Experience: 3–6 years
Type: Full-time / Contractual
Reporting To: Solution Architect / Technical Director

We’re building a multilingual, AI-driven voice assistant for phone-based booking via telephony APIs (e.g., Ozonetel, Exotel, Twilio, Knowlarity). The system must interact naturally in multiple global languages, making it a first-of-its-kind solution in our product stack.

Role Overview
We're seeking a hands-on AI/ML Engineer to lead the speech AI and multilingual NLP components of a voice automation platform. You’ll design and integrate Speech-to-Text, LLM/NLU, and Text-to-Speech systems that support dynamic phone conversations in multiple languages.

Key Responsibilities
- Design, train, or integrate multilingual NLP pipelines to support dynamic conversations using LLMs (OpenAI GPT, LLaMA 3, Ollama, or Azure OpenAI) and STT/TTS services (Google Cloud, Azure Speech, Whisper, etc.)
- Configure language detection, intent recognition, and fallback flows (a sketch of this routing follows the posting)
- Create multilingual prompt engineering logic and fine-tune models if needed
- Ensure STT/LLM/TTS components work effectively with real-time voice streams via telephony APIs (e.g., Ozonetel, Exotel, Twilio, Knowlarity)
- Support integration of fallback IVR-style flows when the AI fails
- Work closely with .NET engineers and the backend team to wire up API-based workflows
- Assist in evaluating trade-offs between cloud APIs and self-hosted models
- Optimize voice latency, transcription quality, and conversational UX
- Help set up language-specific voice personas with TTS (tone, pitch, accent)

Must-Have Skills
- Experience building or integrating conversational AI or voicebots
- Strong knowledge of NLP, multilingual text handling, and LLM usage
- Hands-on integration with OpenAI / Hugging Face / Ollama / LangChain and Google Speech / Azure STT & TTS / Whisper
- Good understanding of international language encoding, accents, and noise handling
- Python fluency (NLTK, transformers, speech SDKs, etc.)
- Ability to work with APIs (REST, JSON, async jobs)

Good-to-Have Skills
- Knowledge of telephony APIs (Ozonetel, Exotel, Twilio, Knowlarity)
- Familiarity with audio pipelines (e.g., audio capture, normalization, latency tuning)
- Experience with language-specific nuances such as Arabic TTS tokenization, Hindi/Urdu word splitting, and Nigerian Pidgin vs. standard English handling
- Prior experience with voice UX design or multilingual chatbot building
- Familiarity with Whisper fine-tuning or custom STT model training

Qualifications
- Bachelor’s or Master’s in CS, Data Science, AI, or equivalent
- 3–6 years in AI/ML product development with a focus on NLP, STT, or TTS
- Past experience with a multilingual bot or voice system is highly preferred

Job Types: Full-time, Contractual / Temporary, Freelance
Contract length: 3 months
Work Location: In person
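As referenced in the responsibilities above, here is a minimal sketch of language detection with an IVR-style fallback. It assumes the open-source langdetect package and an invented set of supported deployment languages; the routing labels are placeholders.

```python
# Hedged sketch of a language-detection/fallback router. Assumes
# `pip install langdetect`; SUPPORTED and "fallback_ivr" are invented.
from langdetect import detect
from langdetect.lang_detect_exception import LangDetectException

SUPPORTED = {"en", "hi", "ar"}  # assumed deployment languages


def route(transcript: str) -> str:
    """Pick a language pipeline, falling back to an IVR-style flow."""
    try:
        lang = detect(transcript)
    except LangDetectException:  # empty or featureless input
        return "fallback_ivr"
    return lang if lang in SUPPORTED else "fallback_ivr"


print(route("मुझे कल की बुकिंग बदलनी है"))  # expected: "hi"
print(route(""))                            # expected: "fallback_ivr"
```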
Posted 1 week ago
0 years
4 - 6 Lacs
Noida
On-site
Posted On: 24 Jul 2025
Location: Noida, UP, India
Company: Iris Software

Why Join Us?
Are you inspired to grow your career at one of India’s Top 25 Best Workplaces in the IT industry? Do you want to do the best work of your life at one of the fastest-growing IT services companies? Do you aspire to thrive in an award-winning work culture that values your talent and career aspirations? It’s happening right here at Iris Software.

About Iris Software
At Iris Software, our vision is to be our client’s most trusted technology partner, and the first choice for the industry’s top professionals to realize their full potential. With over 4,300 associates across India, U.S.A., and Canada, we help our enterprise clients thrive with technology-enabled transformation across financial services, healthcare, transportation & logistics, and professional services. Our work covers complex, mission-critical applications with the latest technologies, such as high-value complex Application & Product Engineering, Data & Analytics, Cloud, DevOps, Data & MLOps, Quality Engineering, and Business Automation.

Working at Iris
Be valued, be inspired, be your best. At Iris Software, we invest in and create a culture where colleagues feel valued, can explore their potential, and have opportunities to grow. Our employee value proposition (EVP) is about “Being Your Best” – as a professional and person. It is about being challenged by work that inspires us, being empowered to excel and grow in your career, and being part of a culture where talent is valued. We’re a place where everyone can discover and be their best version.

Job Description
- Ability to write complex queries (outer joins, inline queries, subqueries, analytical queries, hierarchical queries); a small hierarchical-query sketch follows this posting
- Knowledge of normalization, schema design, and data modeling
- Experience with stored procedures and functions, including exception handling in stored procedures
- Transaction control and batch handling in stored procedures for large data sets
- Skills in indexing, query optimization, and troubleshooting performance issues
- Understanding of encryption, access control, and security best practices
- Knowledge of backup strategies, disaster recovery, and data restoration
- Experience with Extract, Transform, Load (ETL) processes and tools
- Familiarity with cloud-based database services (e.g., AWS RDS, Azure SQL Database)
- Ability to diagnose and resolve database-related issues
- Clear and effective communication with team members and stakeholders
- Working effectively within a team environment
- Ability to learn new technologies and adapt to changing requirements

Mandatory Competencies
- Database - Database Programming - SQL
- Database - Oracle - Data Modelling
- Database - SQL Server - DBA
- Database - SQL Server - SQL Packages
- Beh - Communication

Perks and Benefits for Irisians
At Iris Software, we offer world-class benefits designed to support the financial, health, and well-being needs of our associates to help achieve harmony between their professional and personal growth. From comprehensive health insurance and competitive salaries to flexible work arrangements and ongoing learning opportunities, we're committed to providing a supportive and rewarding work environment. Join us and experience the difference of working at a company that values its employees' success and happiness.
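As referenced in the job description above, here is a small self-contained sketch of a hierarchical query: a recursive CTE walking an employee/manager tree. It runs on in-memory SQLite so it needs no server; the table and names are illustrative, not from the posting.

```python
# Illustrative sketch: a recursive CTE (hierarchical query) on SQLite.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE emp (id INTEGER PRIMARY KEY, name TEXT, manager_id INTEGER);
INSERT INTO emp VALUES (1,'Asha',NULL),(2,'Ravi',1),(3,'Meena',2);
""")

rows = conn.execute("""
WITH RECURSIVE chain(id, name, depth) AS (
    SELECT id, name, 0 FROM emp WHERE manager_id IS NULL  -- tree root
    UNION ALL
    SELECT e.id, e.name, c.depth + 1                      -- walk downward
    FROM emp e JOIN chain c ON e.manager_id = c.id
)
SELECT name, depth FROM chain ORDER BY depth;
""").fetchall()

for name, depth in rows:
    print("  " * depth + name)  # indented org chart
```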
Posted 1 week ago
0 years
1 - 2 Lacs
India
On-site
A skilled and enthusiastic Database Developer (fresher or experienced) is needed to join the technology team at Anantdv. The role involves designing, developing, and maintaining database systems to support applications and processes.

Responsibilities
1. Design efficient database structures, write and optimize SQL queries, and ensure data integrity and security.
2. Collaborate with developers, perform maintenance, troubleshoot issues, and document processes.
3. Assist with data migration and support data analytics.

Qualifications
1. Bachelor's degree in a relevant field.
2. For freshers: a basic understanding of SQL, relational databases (e.g., MySQL, PostgreSQL, Oracle, or SQL Server), and database design principles is required.
3. For experienced candidates: proven experience as a Database Developer with a portfolio of projects is required.
4. Proficiency in SQL and experience with a DBMS like MySQL, Oracle, or SQL Server are required.
5. Knowledge of database design, normalization, and data modeling is required.
6. Familiarity with data warehousing and ETL is required.
7. An understanding of database security and compliance is required.
8. Strong analytical, problem-solving, and communication skills are required.

Job Type: Full-time
Pay: ₹10,040.70 - ₹20,000.00 per month
Benefits: Cell phone reimbursement; paid sick time
Schedule: Day shift
Supplemental Pay: Performance bonus; yearly bonus
Work Location: In person
Posted 1 week ago
3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Company Description
Yoeki Soft Private Limited is committed to becoming one of the most innovative and agile business solution providers. Our mission is to empower a better, innovative, digitized, and smart world through every service we offer. We are proud to collaborate with both renowned brands and emerging startups, sharing common values and the desire to drive digital innovation and agility. Based in Noida, we support our clients in staying relevant and achieving growth amidst rapid technological change.

Role Description
This is a full-time on-site role for an SQL Developer located in Noida. The SQL Developer will be responsible for designing, developing, and maintaining database systems. Key tasks include developing and optimizing SQL queries, implementing ETL processes, data modeling, and ensuring database performance and efficiency. The role also involves analyzing and troubleshooting database-related issues and collaborating with other teams to support project requirements.

Technical Skills and Qualifications:
- Education: Bachelor’s degree in Computer Science, Information Technology, or a related field.
- 3+ years of experience as an SQL Developer or in a similar role.
- Proven experience with Microsoft SQL Server 2017, 2019, and 2022.
- Strong knowledge of T-SQL, SSIS (SQL Server Integration Services), and SSRS (SQL Server Reporting Services); a short stored-procedure sketch follows this posting.
- Experience with database performance tuning and optimization.
- Familiarity with ETL tools and processes.
- Experience in the FinTech industry or working with financial data is a plus.

Technical Proficiency:
- Advanced knowledge of SQL Server Management Studio (SSMS).
- Experience with database design, normalization, and indexing.
- Knowledge of data warehousing concepts and practices.
- Familiarity with cloud-based SQL solutions (e.g., Azure SQL Database) is a plus.

Location: Noida 63
Mode: Work from office
Note: This job description is a general guide and may be subject to change based on the specific needs and requirements of the organization.
Please share your CV at naina.c@yoekisoft.in
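For illustration (not part of the posting), a hedged sketch of calling a T-SQL stored procedure from Python via pyodbc; the server, database, credentials, procedure name, and ODBC driver version are all placeholders that depend on the environment.

```python
# Hedged sketch: invoking a T-SQL stored procedure with pyodbc.
# Connection details and dbo.usp_LoadDailyTrades are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"  # driver version varies
    "SERVER=sql-host;DATABASE=FinDB;UID=etl_user;PWD=secret"
)
cur = conn.cursor()
cur.execute("{CALL dbo.usp_LoadDailyTrades (?)}", "2025-07-24")  # ODBC call syntax
conn.commit()
print(cur.rowcount, "rows affected")
```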
Posted 1 week ago
15.0 years
0 Lacs
Pune, Maharashtra, India
On-site
HCL Software (hcl-software.com) delivers software that fulfils the transformative needs of clients around the world. We build award-winning software across AI, Automation, Data & Analytics, Security, and Cloud. The HCL Unica+ Marketing Platform enables our customers to deliver precision, high-performance marketing campaigns across multiple channels like social media, AdTech platforms, mobile applications, websites, etc. The Unica+ Marketing Platform is a data- and AI-first platform that enables our clients to deliver hyper-personalized offers and messages for customer acquisition, product awareness, and retention.

We are seeking a Senior Architect Developer with strong data science and machine learning skills and experience to deliver AI-driven marketing campaigns.

Responsibilities

Designing and Architecting End-to-End AI/ML Solutions for Marketing: The architect is responsible for designing robust, scalable, and secure AI/ML solutions specifically tailored to marketing challenges. This includes defining data pipelines; selecting appropriate machine learning algorithms and frameworks (e.g., for predictive analytics, customer segmentation, personalization, campaign optimization, sentiment analysis); designing model deployment strategies; and integrating these solutions seamlessly with existing marketing tech stacks and enterprise systems. They must consider the entire lifecycle, from data ingestion to model monitoring and retraining.

Technical Leadership: The AI/ML architect acts as a technical leader, providing guidance and mentorship to data scientists, ML engineers, and other development teams. They evaluate and select the most suitable AI/ML tools, platforms, and cloud services (AWS, GCP, Azure) for marketing use cases. The architect is also responsible for establishing and promoting best practices for MLOps (Machine Learning Operations), model versioning, and continuous integration/continuous deployment (CI/CD) for ML models, and for ensuring data quality, ethical AI principles (e.g., bias, fairness), and regulatory compliance (e.g., data privacy laws).

Python Programming & Libraries: Proficient in Python, with extensive experience using Pandas for data manipulation, NumPy for numerical operations, and Matplotlib/Seaborn for data visualization.

Statistical Analysis & Modelling: Strong understanding of statistical concepts, including descriptive statistics, inferential statistics, hypothesis testing, regression analysis, and time series analysis.

Data Cleaning & Preprocessing: Expertise in handling messy real-world data, including dealing with missing values, outliers, data normalization/standardization, feature engineering, and data transformation.

SQL & Database Management: Ability to query and manage data efficiently from relational databases using SQL, and ideally some familiarity with NoSQL databases.

Exploratory Data Analysis (EDA): Skill in visually and numerically exploring datasets to understand their characteristics and identify patterns, anomalies, and relationships.

Machine Learning Algorithms: In-depth knowledge and practical experience with a wide range of ML algorithms, such as linear models, tree-based models (Random Forests, Gradient Boosting), SVMs, K-means, and dimensionality reduction techniques (PCA).

Deep Learning Frameworks: Proficiency with at least one major deep learning framework like TensorFlow or PyTorch, including understanding of neural network architectures (CNNs, RNNs, Transformers) and their application to various problems.
Model Evaluation & Optimization: Ability to select appropriate evaluation metrics (e.g., precision, recall, F1-score, AUC-ROC, RMSE) for different problem types, diagnose model performance issues (bias-variance trade-off), and apply optimization techniques.

Deployment & MLOps Concepts: Deploy machine learning models into production environments, including API creation, containerization (Docker), version control for models, and monitoring.

Qualifications & Skills
- At least 15 years of experience across data architecture, data science, and machine learning.
- Experience in delivering AI/ML models for marketing outcomes like Customer Acquisition, Customer Churn, and Next Best Product or Offer; this is a mandatory requirement (a minimal churn-model sketch follows this posting).
- Experience with Customer Data Platforms (CDP) and marketing platforms like Unica, Adobe, Salesforce, Braze, TreasureData, Epsilon, and Tealium is mandatory.
- Experience with AWS SageMaker is advantageous.
- Experience with LangChain and RAG for generative AI is advantageous.
- Experience with ETL processes and tools like Apache Airflow is advantageous.
- Expertise in integration tools and frameworks like Postman, Swagger, and API gateways.
- Ability to work well within an agile team environment and apply the related working methods.
- Excellent communication and interpersonal skills.
- A 4-year degree in Computer Science or IT is a must.

Travel: 30% +/- travel required
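As a hedged illustration of the churn-modelling requirement above (not HCL's actual pipeline), a minimal train-and-evaluate sketch with scikit-learn; synthetic data stands in for a real CDP extract.

```python
# Illustrative churn-propensity sketch: synthetic features, a gradient
# boosting classifier, and AUC-ROC evaluation. Nothing here is real data.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))  # stand-ins for recency, frequency, spend...
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = GradientBoostingClassifier().fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print("AUC-ROC:", round(auc, 3))
```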
Posted 1 week ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
HCL Software (hcl-software.com) delivers software that fulfils the transformative needs of clients around the world. We build award-winning software across AI, Automation, Data & Analytics, Security, and Cloud. The HCL Unica+ Marketing Platform enables our customers to deliver precision, high-performance marketing campaigns across multiple channels like social media, AdTech platforms, mobile applications, websites, etc. The Unica+ Marketing Platform is a data- and AI-first platform that enables our clients to deliver hyper-personalized offers and messages for customer acquisition, product awareness, and retention.

We are seeking a Senior Python Developer with strong data science and machine learning skills and experience to deliver AI-driven marketing campaigns.

Responsibilities

Python Programming & Libraries: Proficient in Python, with extensive experience using Pandas for data manipulation, NumPy for numerical operations, and Matplotlib/Seaborn for data visualization.

Statistical Analysis & Modelling: Strong understanding of statistical concepts, including descriptive statistics, inferential statistics, hypothesis testing, regression analysis, and time series analysis.

Data Cleaning & Preprocessing: Expertise in handling messy real-world data, including dealing with missing values, outliers, data normalization/standardization, feature engineering, and data transformation (a small preprocessing sketch follows this posting).

SQL & Database Management: Ability to query and manage data efficiently from relational databases using SQL, and ideally some familiarity with NoSQL databases.

Exploratory Data Analysis (EDA): Skill in visually and numerically exploring datasets to understand their characteristics and identify patterns, anomalies, and relationships.

Machine Learning Algorithms: In-depth knowledge and practical experience with a wide range of ML algorithms, such as linear models, tree-based models (Random Forests, Gradient Boosting), SVMs, K-means, and dimensionality reduction techniques (PCA).

Deep Learning Frameworks: Proficiency with at least one major deep learning framework like TensorFlow or PyTorch, including understanding of neural network architectures (CNNs, RNNs, Transformers) and their application to various problems.

Model Evaluation & Optimization: Ability to select appropriate evaluation metrics (e.g., precision, recall, F1-score, AUC-ROC, RMSE) for different problem types, diagnose model performance issues (bias-variance trade-off), and apply optimization techniques.

Deployment & MLOps Concepts: Understanding of how to deploy machine learning models into production environments, including API creation, containerization (Docker), version control for models, and monitoring.

Qualifications & Skills
- At least 8–10 years of Python development experience, with at least 4 years in data science and machine learning.
- Experience with Customer Data Platforms (CDP) like TreasureData, Epsilon, Tealium, Adobe, and Salesforce is advantageous.
- Experience with AWS SageMaker is advantageous.
- Experience with LangChain and RAG for generative AI is advantageous.
- Expertise in integration tools and frameworks like Postman, Swagger, and API gateways.
- Knowledge of REST, JSON, XML, and SOAP is a must.
- Ability to work well within an agile team environment and apply the related working methods.
- Excellent communication and interpersonal skills.
- A 4-year degree in Computer Science or IT is a must.

Travel: 30% +/- travel required
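As referenced in the preprocessing bullet above, a minimal sketch of imputation, outlier capping, and standardization with pandas and scikit-learn; the column and values are synthetic, purely for illustration.

```python
# Illustrative cleaning/preprocessing sketch on a synthetic column.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler

df = pd.DataFrame({"spend": [120.0, np.nan, 95.0, 4000.0, 110.0]})

df["spend"] = df["spend"].fillna(df["spend"].median())         # impute missing
lo, hi = df["spend"].quantile([0.05, 0.95])
df["spend"] = df["spend"].clip(lo, hi)                         # cap outliers
df["spend_z"] = StandardScaler().fit_transform(df[["spend"]])  # standardize
print(df)
```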
Posted 1 week ago
6.0 years
0 Lacs
India
Remote
Job Title: Data Modeler
Location: Remote
Experience: 6+ years
Mode: 6-month contract + extension

Key Responsibilities:
- Design and maintain conceptual, logical, and physical data models aligned with business requirements and technical specifications.
- Build efficient star and snowflake schemas for analytical and reporting use cases.
- Apply normalization and denormalization techniques to optimize data structures for various workloads.
- Design and manage Snowflake schemas and objects, including: secure and materialized views; streams and tasks; Time Travel and cloning; performance tuning strategies.
- Write and optimize complex SQL queries using window functions, common table expressions (CTEs), and other advanced features (a small window-function sketch follows this posting).
- Automate and optimize data transformation pipelines and data model deployments using scripting or orchestration tools.
- Collaborate with data engineers, BI developers, and business stakeholders to understand data needs and translate them into scalable models.

Required Skills and Qualifications:
- 6+ years of experience in data modeling across large data environments.
- Proven expertise in conceptual, logical, and physical modeling techniques.
- Strong knowledge of Snowflake architecture and features.
- Deep understanding of SQL and query performance optimization.
- Experience with automation and scripting for data workflows.
- Strong analytical and communication skills.
- Familiarity with data governance, security, and compliance best practices is a plus.

Preferred Qualifications:
- Experience with data modeling tools (e.g., ER/Studio, Erwin, dbt, SqlDBM).
- Exposure to cloud data platforms like AWS, Azure, or GCP.
- Knowledge of CI/CD for data pipeline deployments.
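As referenced above, a small self-contained sketch of a window function layered on a CTE: a per-customer running total. It runs on in-memory SQLite (window functions require SQLite 3.25+, bundled with modern Python); table and column names are illustrative, not Snowflake-specific.

```python
# Illustrative sketch: CTE + window function (running total per customer).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (customer TEXT, day INTEGER, amount REAL);
INSERT INTO sales VALUES ('A',1,10),('A',2,15),('B',1,7),('B',3,20);
""")

query = """
WITH daily AS (
    SELECT customer, day, SUM(amount) AS amt
    FROM sales GROUP BY customer, day
)
SELECT customer, day, amt,
       SUM(amt) OVER (PARTITION BY customer ORDER BY day) AS running_total
FROM daily;
"""
for row in conn.execute(query):
    print(row)
```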
Posted 1 week ago
7.5 years
0 Lacs
Jaipur, Rajasthan, India
On-site
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: SAP ABAP Development for HANA
Good-to-have skills: NA
Minimum experience required: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Lead the effort to design, build, and configure applications
- Act as the primary point of contact
- Manage the team and ensure successful project delivery

Professional & Technical Skills:
- Must-have skills: proficiency in SAP ABAP Development for HANA, S/4HANA implementation knowledge, and life sciences experience
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP ABAP Development for HANA
- This position is based at our Hyderabad office
- A 15-year full-time education is required
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Summary

Role Responsibilities:
You will partner with business and IT operations stakeholders to define, prioritize, and execute ServiceNow CMDB operational needs and enhancements to successfully deliver CMDB capabilities, including CMDB administration and Discovery. Additional responsibilities include:
- Build and develop ServiceNow Discovery and Service Mapping patterns to support business and IT.
- Administer operational support of ServiceNow ITOM to ensure it is run and maintained in keeping with industry standards and best practices and meets business and IT needs.
- Establish, review, and optimize new and existing processes to improve customer experience and the overall delivery of business outcomes.
- Forge strong, collaborative relationships and build consensus among competing stakeholders across business and IT peers, leveraging exceptional communication and interpersonal skills.

Required Education, Qualifications, and Experience:
- Expertise and experience with the ServiceNow CMDB product is required, including ServiceNow Discovery experience.
- Broad general knowledge of IT infrastructure topology, including typical application, server, and networking configurations and how they can be documented using a common data model.
- Good knowledge of IT Service Management practices.

Job Description:
- 3+ years of experience with various ServiceNow modules, including Normalization Data Services, ITOM Discovery, Configuration Management Lifecycle, Common Services Data Model, and Service Mapping.
- Understanding of ServiceNow CMDB, the Identification and Reconciliation Engine, and workflow capabilities.
- Familiarity with platform security and integration best practices.
- Create, maintain, and enhance integrations between ServiceNow and other systems.
- Experience in implementing Discovery and Service Mapping solutions.
- Ability to perform assessments and provide standard processes for Discovery and Service Mapping.
- Understanding of Discovery scheduling and potential network impacts.
- Experience with Configuration Item blueprints and CMDB Health dashboard configuration, including remediation of duplicate and stale CI items.
- Expert understanding of import sets, transform maps, and ServiceNow data mapping.
- Knowledge of the ServiceNow CMDB class hierarchy and its relation to Asset and Configuration Management.
- Fundamental knowledge of the Common Services Data Model (CSDM).
- Experience with additional product lines on the ServiceNow platform, such as ITSM, ITBM, and/or ITAM.

Qualifications, Experience & Education:
- Experience with the ServiceNow CMDB and Discovery.
- Detailed understanding of, and experience with, ITIL processes.
- Excellent oral, written, presentation, and communication skills.
- Ability to effectively prioritize and execute in a high-pressure environment.
- Ability to independently set priorities and meet deadlines in a fast-paced environment; a self-starter.

Our purpose
Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively.
It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits to help you thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 307566
Posted 1 week ago
0.0 - 8.0 years
0 Lacs
Gurugram, Haryana
On-site
202505104 | Gurugram, Haryana, India

Description

Job Responsibility:
- Design, develop, and optimize MongoDB data models for various business and analytics use cases.
- Implement and maintain efficient MongoDB CRUD operations, indexes, and schema evolution strategies.
- Manage self-hosted MongoDB deployments, including installation, configuration, scaling, backup/restore, and monitoring.
- Build and maintain reporting and analytics pipelines using the MongoDB reporting suite.
- Develop, monitor, and tune MongoDB deployments (both self-hosted and cloud-managed) for scalability, reliability, and security.
- Collaborate with engineering and product teams to translate requirements into MongoDB-backed solutions.
- Support integration with Azure cloud services (e.g., Azure Cosmos DB for MongoDB, Azure Functions, Blob Storage).
- Maintain documentation and contribute to database standards and best practices.
- (Nice to have) Support data ingestion and automation tasks using Python.

Qualifications:
- Bachelor’s or master’s degree in computer science, engineering, or a related quantitative discipline.

Experience:
- 5 to 8 years of hands-on experience in data engineering or backend development with MongoDB.
- Demonstrated experience with self-hosted MongoDB, including cluster setup, maintenance, and troubleshooting.

Technical Competencies:
- Deep hands-on experience with MongoDB data modelling, schema design, and normalization/denormalization strategies.
- Strong proficiency in MongoDB development: aggregation pipelines, CRUD, performance tuning, and index management.
- Experience in building reporting and analytics using the MongoDB reporting suite.
- Experience with self-hosted MongoDB deployments (e.g., sharding, replication, monitoring, security configuration).
- Working knowledge of Azure cloud services (Azure Cosmos DB, VMs, App Service, networking for secure deployments).
- (Nice to have) Experience in Python for backend integration, data processing, or scripting.
Posted 1 week ago
3.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Positions in this function are responsible for the management and manipulation of mostly structured data, with a focus on building business intelligence tools, conducting analysis to distinguish patterns and recognize trends, performing normalization operations, and assuring data quality. Depending on the specific role and business line, example responsibilities in this function could include creating specifications to bring data into a common structure, creating product specifications and models, developing data solutions to support analyses, performing analysis, interpreting results, developing actionable insights, and presenting recommendations for use across the company. Roles in this function could partner with stakeholders to understand data requirements and develop tools and models such as segmentation, dashboards, data visualizations, decision aids, and business case analysis to support the organization. Other roles could include producing and managing the delivery of activity and value analytics to external stakeholders and clients. Team members will typically use business intelligence, data visualization, query, analytic, and statistical software to build solutions, perform analysis, and interpret data. Positions in this function work on predominantly descriptive and regression-based analytics and tend to leverage subject matter expert views in the design of their analytics and algorithms. This function is not intended for employees performing the following work: production of standard or self-service operational reporting, causal-inference-led (healthcare analytics) or data pattern recognition (data science) analysis, and/or image or unstructured data analysis using sophisticated theoretical frameworks.

Primary Responsibilities
- Analyze data and extract actionable insights, findings, and recommendations
- Develop data validation strategies to ensure accuracy and reliability of data
- Communicate data and findings effectively to internal and external senior executives with clear supporting evidence
- Relate analysis to the organization's overall business objectives
- Implement generative AI techniques to reduce manual efforts and automate processes
- Analyze and investigate issues; provide explanations and interpretations within your area of expertise
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment).
The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications
- Undergraduate degree or equivalent experience
- Experience in creating summarized reports of findings and recommendations
- Proficiency in SQL, Snowflake, and AI techniques
- Solid skills in Microsoft Excel, Word, PowerPoint, and Visio
- Ability to multitask, take initiative, and adapt to changing priorities
- Proven self-motivated team player with solid problem-solving and analytical skills

Preferred Qualifications
- 3+ years of work experience
- 2+ years of experience working with a healthcare consulting firm
- Experience in data analytics and hands-on experience in Python programming, SQL, and Snowflake
- Proven creative, strategic thinker with excellent critical thinking skills

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Position Overview
Job Title: SQL Developer
Location: Pune, India
Corporate Title: AS

Role Description
This role sits within the Legal CIO sub-domain, on the DExTR programme. DExTR is an enterprise contract management system that supports the legal department in negotiating, drafting, managing, and storing its body of contracts. The project objectives are to streamline the creation and amendment of legal documents; automate contract generation, enrich metadata, and enforce data quality through simple document assembly technology; and integrate negotiation workflow, document assembly, and document automation with standard desktop applications (MS Word, MS Outlook) to enhance the user experience and streamline user adoption of the application.

Deutsche Bank is a client-centric global universal bank. One that is leading change and innovation in the industry – championing integrity, sustainable performance and innovation with our clients, and redefining our culture and relationships with each other.

The CIO Chief Administrative Office (CAO) function brings together the IT services for the Group CAO functions: Human Resources, Legal, and Corporate Communications & CSR. Legal is the principal manager of legal risk of the Deutsche Bank Group and guardian of the Group’s culture, integrity, and reputation. DB’s Legal Department is fully independent from the Business Divisions and has a direct reporting line into the Management Board, not into any Business Division. The Legal CIO department has a broad change portfolio that is in some cases regulatory-driven and therefore visible at Board level. The Legal department has been undergoing significant business and technology transformation in recent years, covering critical aspects of the department: Risk Advisory, Litigation, and COO. A range of technology change initiatives are now running, covering critical topics: legal document management, reference data distribution, enterprise legal management, spend analytics, global governance, and contract management.

What We’ll Offer You
As part of our flexible scheme, here are just some of the benefits that you’ll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those aged 35 years and above

Your Key Responsibilities
- Design, develop, and optimize complex SQL queries, procedures, and views for data extraction and transformation.
- Develop and maintain dashboards and reports to track key performance metrics using advanced analytical tools like Tableau and SAP BO, and generate actionable insights.
- Collaborate with business analysts and stakeholders to understand reporting requirements and translate them into technical solutions.
- Build and manage query pipelines and ETL processes as needed for Tableau and SAP BO data sources.
- Perform data validation and quality assurance on SQL outputs and visualizations.
- Troubleshoot and resolve issues related to data inconsistencies, data quality, or performance.
- Support ad-hoc reporting requests and provide documentation to end users.

Your Skills and Experience
- Strong analytical, problem-solving, and communication skills.
- Extensive experience in SQL development, relational databases (Oracle), and data distribution techniques.
- Strong understanding of data modelling, normalization, and query performance tuning.
- Hands-on experience creating data visualizations and reports in Tableau and SAP BO.
- Knowledge of cloud technologies, preferably GCP.
- Good knowledge of Java programming is a plus.
- Experience working with agile methodologies such as Scrum and Kanban.
- Working experience with version control tools like Git and cloud code repositories like GitHub/Bitbucket.

How We’ll Support You
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About Us and Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair, and inclusive work environment.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Hi, we are looking for an experienced Data Engineer to join our IT team. The ideal candidate will have 4–8 years of hands-on experience. Kindly share your profile at career@mezash.com if the job description below matches your profile.

Note: This is an urgent hiring; share your profile only if you are an immediate joiner (able to join within 5–10 days).

Shift Timings: General shift
Office: Noida
Working Mode: Work from office (100%)
Job Title: Data Engineer (Python Expert)
Location: Noida
Experience: 4–8 years

Job Description:
We are seeking an experienced and highly skilled Data Engineer to join our dynamic team. The ideal candidate will have a strong background in building scalable data pipelines, analyzing complex datasets, and delivering actionable insights through visualizations and automated workflows.

Key Responsibilities:
- Design and develop robust ETL pipelines to extract, transform, and load data from various sources (a small end-to-end sketch follows this posting).
- Write efficient Python scripts to parse structured and unstructured data from APIs, databases, and flat files.
- Build and maintain data models that feed into interactive Power BI dashboards and reports.
- Automate data workflows and reporting processes to improve efficiency and accuracy.
- Conduct exploratory data analysis (EDA) and data mining to derive business insights.
- Collaborate with cross-functional teams, including analysts, engineers, and business stakeholders, to define data requirements.
- Ensure data quality, consistency, and compliance with organizational and industry standards.

Required Skills and Qualifications:
- 4–8 years of professional experience as a Data Scientist or Data Engineer.
- Advanced proficiency in Python for data manipulation (e.g., Pandas, NumPy).
- Strong experience building and managing ETL pipelines using tools like Airflow, Azure Data Factory, or custom Python scripts.
- Hands-on expertise in Power BI, including dataset creation, dashboard design, and DAX expressions.
- Solid knowledge of SQL and relational databases such as PostgreSQL, MySQL, or MS SQL Server.
- Familiarity with REST APIs, JSON, and data normalization techniques.
- Experience with cloud platforms (Azure, AWS, GCP) is a plus.
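As referenced in the responsibilities above, an illustrative mini-ETL in Python: pull JSON from a REST endpoint, normalize it with pandas, and load it to SQLite. The endpoint URL, field names, and target table are hypothetical, not part of the posting.

```python
# Hedged mini-ETL sketch: extract (REST), transform (pandas), load (SQLite).
# https://api.example.com and the order_id/amount fields are placeholders.
import sqlite3

import pandas as pd
import requests

resp = requests.get("https://api.example.com/v1/orders", timeout=30)
resp.raise_for_status()

df = pd.json_normalize(resp.json())  # flatten nested JSON records
df["amount"] = pd.to_numeric(df["amount"], errors="coerce").fillna(0)
df = df.drop_duplicates(subset=["order_id"])

with sqlite3.connect("warehouse.db") as conn:
    df.to_sql("orders", conn, if_exists="replace", index=False)
```

In practice the same extract-transform-load shape would be wrapped in an Airflow task or Azure Data Factory activity rather than run as a standalone script.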
Posted 1 week ago
0 years
0 Lacs
Bengaluru East, Karnataka, India
On-site
Data Engineer - ETL
Bangalore, India
AXA XL recognizes data and information as critical business assets, both in terms of managing risk and enabling new business opportunities. This data should not only be high quality, but also actionable - enabling AXA XL's executive leadership team to maximize benefits and facilitate sustained competitive advantage.
Our Chief Data Office, also known as our Innovation, Data Intelligence & Analytics (IDA) team, is focused on driving innovation by optimizing how we leverage data to drive strategy and create a new business model - disrupting the insurance market. As we develop an enterprise-wide data and digital strategy that moves us toward a greater focus on the use of data and data-driven insights, we are seeking a Data Engineer.
The role will support the team's efforts towards creating, enhancing, and stabilizing the enterprise data lake through the development of data pipelines. This role requires a person who is a team player and can work well with team members from other disciplines to deliver data in an efficient and strategic manner.
What You'll Be Doing
What will your essential responsibilities include?
Act as a data engineering expert and partner to Global Technology and data consumers in controlling the complexity and cost of the data platform, while enabling performance, governance, and maintainability of the estate.
Understand current and future data consumption patterns and architecture at a granular level, and partner with Architects to ensure optimal design of the data layers.
Apply best practices in data architecture: for example, the balance between materialization and virtualization, the optimal level of de-normalization, caching and partitioning strategies, choice of storage and querying technology, and performance tuning.
Lead and execute hands-on research into new technologies, formulating frameworks for assessing new technology against business benefit and implications for data consumers.
Act as a best-practice expert and blueprint creator for ways of working such as testing, logging, CI/CD, observability, and release, enabling rapid growth in data inventory and utilization of the Data Science Platform.
Design prototypes and work in a fast-paced, iterative solution delivery model.
Design, develop, and maintain ETL pipelines using PySpark in Azure Databricks with Delta tables (see the sketch after this listing). Use Harness for the deployment pipeline.
Monitor the performance of ETL jobs, resolve any issues that arise, and improve performance metrics as needed.
Diagnose system performance issues related to data processing and implement solutions to address them.
Collaborate with other teams to ensure successful integration of data pipelines into the larger system architecture.
Maintain integrity and quality across all pipelines and environments.
Understand and follow secure coding practices to ensure code is not vulnerable.
You will report to the Application Manager.
What You Will Bring
We're looking for someone who has these abilities and skills:
Required Skills And Abilities
Effective communication skills.
Bachelor's degree in computer science, mathematics, statistics, finance, a related technical field, or equivalent work experience.
Relevant years of extensive work experience in various data engineering and modeling techniques (relational, data warehouse, semi-structured, etc.), application development, and advanced data querying.
Relevant years of programming experience using Databricks.
Relevant years of experience using the Microsoft Azure suite of products (ADF, Synapse, and ADLS).
Solid knowledge of network and firewall concepts.
Solid experience writing, optimizing, and analyzing SQL.
Relevant years of experience with Python.
Ability to break down complex data requirements and architect solutions into achievable targets.
Robust familiarity with Software Development Life Cycle (SDLC) processes and workflow, especially Agile.
Experience using Harness.
Technical lead responsible for both individual and team deliveries.
Desired Skills And Abilities
Worked on big data migration projects.
Worked on performance tuning at both the database and big data platform levels.
Ability to interpret complex data requirements and architect solutions.
Distinctive problem-solving and analytical skills combined with robust business acumen.
Excellent grasp of the basics of Parquet files and Delta files.
Effective knowledge of the Azure cloud computing platform.
Familiarity with reporting software - Power BI is a plus.
Familiarity with dbt is a plus.
Passion for data and experience working within a data-driven organization.
You care about what you do, and what we do.
Who We Are
AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals and even some inspirational individuals we don't just provide re/insurance, we reinvent it. How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business − property, casualty, professional, financial lines and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com
What We Offer
Inclusion
AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture and a diverse workforce enable business growth and are critical to our success. That's why we have made a strategic commitment to attract, develop, advance and retain the most diverse workforce possible, and create an inclusive culture where everyone can bring their full selves to work and can reach their highest potential. It's about helping one another - and our business - to move forward and succeed.
Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability and inclusion, with 20 chapters around the globe
Robust support for flexible working arrangements
Enhanced family-friendly leave benefits
Named to the Diversity Best Practices Index
Signatory to the UK Women in Finance Charter
Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer.
Total Rewards
AXA XL's Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle and financial security. It provides dynamic compensation and personalized, inclusive benefits that evolve as you do. We're committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence.
Sustainability
At AXA XL, sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 sustainability strategy, called "Roots of resilience", focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations.
Our Pillars
Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems - the foundation of a sustainable planet and society - are essential to our future. We're committed to protecting and restoring nature - from mangrove forests to the bees in our backyard - by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans.
Addressing climate change: The effects of a changing climate are far-reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption. We're building a net-zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions.
Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We're training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting.
AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL's "Hearts in Action" programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day - the Global Day of Giving.
For more information, please see axaxl.com/sustainability.
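A minimal sketch of the PySpark-on-Databricks pattern named in the responsibilities above. The landing path, column names, and target table are invented placeholders; on Databricks the SparkSession and Delta support are already provided.

from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession already exists as `spark`; the builder
# call keeps the sketch self-contained if run elsewhere.
spark = SparkSession.builder.appName("policy_etl").getOrCreate()

# Invented landing path and columns -- placeholders for the real data lake.
raw = spark.read.json("/mnt/landing/policies/")

cleaned = (
    raw.dropDuplicates(["policy_id"])
       .withColumn("bound_date", F.to_date("bound_date"))
       .filter(F.col("premium_usd") > 0)
)

# Delta is the default table format on Databricks; partitioning by date
# supports the pruning and performance tuning the listing mentions.
(cleaned.write
        .format("delta")
        .mode("append")
        .partitionBy("bound_date")
        .saveAsTable("ida.policies_clean"))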
Posted 1 week ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Role – Senior Python Developer (Data Science, AI/ML)
HCL Software (hcl-software.com) delivers software that fulfils the transformative needs of clients around the world. We build award-winning software across AI, Automation, Data & Analytics, Security and Cloud.
The HCL Unica+ Marketing Platform enables our customers to deliver precision, high-performance marketing campaigns across multiple channels like social media, AdTech platforms, mobile applications, websites, etc. The Unica+ Marketing Platform is a data- and AI-first platform that enables our clients to deliver hyper-personalized offers and messages for customer acquisition, product awareness and retention.
We are seeking a Senior Python Developer with strong data science and machine learning skills and experience to deliver AI-driven marketing campaigns.
Responsibilities
1. Python Programming & Libraries: Proficient in Python with extensive experience using Pandas for data manipulation, NumPy for numerical operations, and Matplotlib/Seaborn for data visualization.
2. Statistical Analysis & Modelling: Strong understanding of statistical concepts, including descriptive statistics, inferential statistics, hypothesis testing, regression analysis, and time series analysis.
3. Data Cleaning & Preprocessing: Expertise in handling messy real-world data, including dealing with missing values, outliers, data normalization/standardization, feature engineering, and data transformation.
4. SQL & Database Management: Ability to query and manage data efficiently from relational databases using SQL, and ideally some familiarity with NoSQL databases.
5. Exploratory Data Analysis (EDA): Skill in visually and numerically exploring datasets to understand their characteristics and identify patterns, anomalies, and relationships.
6. Machine Learning Algorithms: In-depth knowledge and practical experience with a wide range of ML algorithms such as linear models, tree-based models (Random Forests, Gradient Boosting), SVMs, K-means, and dimensionality reduction techniques (PCA).
7. Deep Learning Frameworks: Proficiency with at least one major deep learning framework like TensorFlow or PyTorch, including an understanding of neural network architectures (CNNs, RNNs, Transformers) and their application to various problems.
8. Model Evaluation & Optimization: Ability to select appropriate evaluation metrics (e.g., precision, recall, F1-score, AUC-ROC, RMSE) for different problem types, diagnose model performance issues (bias-variance trade-off), and apply optimization techniques (see the sketch after this listing).
9. Deployment & MLOps Concepts: Understanding of how to deploy machine learning models into production environments, including concepts of API creation, containerization (Docker), version control for models, and monitoring.
Qualifications & Skills
1. At least 8-10 years of Python development experience, with at least 4 years in data science and machine learning.
2. Experience with Customer Data Platforms (CDPs) like Treasure Data, Epsilon, Tealium, Adobe, or Salesforce is advantageous.
3. Experience with AWS SageMaker is advantageous.
4. Experience with LangChain and RAG for generative AI is advantageous.
5. Expertise in integration tools and frameworks like Postman, Swagger, and API gateways.
6. Knowledge of REST, JSON, XML, and SOAP is a must.
7. Ability to work well within an agile team environment, applying the related working methods.
8. Excellent communication and interpersonal skills.
9. A 4-year degree in Computer Science or IT is a must.
Travel: approximately 30% travel required
Location: India (Pune preferred)
Compensation: Base salary, plus bonus
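A brief sketch of the evaluation workflow described in point 8, using scikit-learn on synthetic data as a stand-in for a real campaign-response dataset; the model and parameters are illustrative, not a prescribed setup.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report, roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic, deliberately imbalanced stand-in for a campaign-response set.
X, y = make_classification(n_samples=5000, n_features=20,
                           weights=[0.9, 0.1], random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Precision/recall/F1 and AUC-ROC -- the metrics the listing names --
# are far more informative than accuracy on an imbalanced target.
proba = model.predict_proba(X_test)[:, 1]
print(classification_report(y_test, model.predict(X_test)))
print("AUC-ROC:", round(roc_auc_score(y_test, proba), 3))

On a skewed response variable like campaign uptake, a model can score high accuracy by predicting "no response" for everyone, which is exactly why the listing calls out precision, recall, F1 and AUC-ROC.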
Posted 1 week ago
4.0 years
6 - 13 Lacs
India
On-site
Key Responsibilities
General Linguistic Tasks
Support linguistic data collection and the development of resources like lexicons and corpora.
Work with engineering teams to refine linguistic features and address language-specific challenges.
Provide linguistic analysis for model training, tuning, and error analysis.
ASR (Automatic Speech Recognition)
Transcribe and annotate Gujarati audio with high accuracy.
Conduct quality checks on speech transcriptions and audio-text alignments.
Contribute to pronunciation lexicons and assist in acoustic model validation.
MT (Machine Translation)
Translate text between Gujarati and English.
Review and post-edit machine-translated content for quality, accuracy, and fluency.
Develop and apply linguistic rules for improving translation output.
TTS (Text-to-Speech)
Evaluate synthesized Gujarati speech for naturalness, clarity, and accuracy.
Perform text normalization and phonetic transcription for Gujarati content (see the sketch after this listing).
Assist in voice data validation and mapping phonemes to sounds.
Skills & Qualifications
Bachelor's degree in Linguistics, Languages, or a related field.
Native or near-native fluency in Gujarati, with a strong command of English.
4-6 years of relevant experience in linguistics, NLP, or AI/ML language technologies.
Experience with tools for transcription, annotation, or translation is a plus.
Strong attention to detail and a passion for language technology.
Job Type: Permanent
Pay: ₹600,000.00 - ₹1,300,000.00 per year
Work Location: In person
Speak with the employer: +91 9877917759
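A toy sketch of one text normalization step of the kind the TTS duties mention: expanding digits into spoken words, here reading a phone number digit by digit (a common front-end convention). The Gujarati word forms are shown for illustration only; a production lexicon would be curated by the linguist this role describes.

# Gujarati digit words -- illustrative, to be validated by a native linguist.
GUJARATI_DIGITS = {
    "0": "શૂન્ય", "1": "એક", "2": "બે", "3": "ત્રણ", "4": "ચાર",
    "5": "પાંચ", "6": "છ", "7": "સાત", "8": "આઠ", "9": "નવ",
}

def normalize_phone(text: str) -> str:
    """Expand each ASCII digit into its spoken Gujarati word."""
    return " ".join(GUJARATI_DIGITS.get(ch, ch)
                    for ch in text if not ch.isspace())

print(normalize_phone("080 1234"))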
Posted 1 week ago
0 years
0 Lacs
Pune/Pimpri-Chinchwad Area
On-site
Company Description
Ubisoft is a leading developer and publisher of video games worldwide whose brand portfolio covers blockbusters such as Assassin's Creed, Prince of Persia and Splinter Cell, as well as games for the whole family, from Imagine and Petz to Rayman Raving Rabbids. To continue building on its achievements for the future, Ubisoft is looking for new talent for its growing Indian studio in Pune! We favor diversity, creativity, drive and team spirit.
Job Description
Job Summary:
The Associate Director – QC HR will be responsible for developing and implementing strategic HR plans and policies to support the company's mission and HR strategy. The role will ensure the smooth running of HR activities in its perimeter, aligned with business goals. The person will also co-create with the management team, oversee and execute HR operating models, guide and coach managers, analyse teams to ensure organizational efficiency, and act as a pillar of change management in the perimeter.
Job Responsibilities:
Performance Management
Understand the business needs and the various roles in the department in order to ensure the right goal setting for individuals on the floor.
Be a core part of the annual and mid-year performance reviews for the team; coach and discipline employees.
Manage the performance evaluation and career progression processes.
Manage the overall KPI and goal-setting implementation and evaluation.
Institutionalize the performance management framework within business lines and monitor completion within the given timeline; troubleshoot on normalization; monitor to ensure that promotions are in line with defined policies; liaise with unit managers to drive closure.
Develop development plans for employees in order to achieve the business objectives.
Optimize the processes and flows between the department, managers and tools.
Come up with recommendations to drive process improvements in order to achieve high productivity each year.
Talent Development
Analyse training needs on the floor.
Design and drive the implementation of training programs in coordination with Corporate T&D. For external training, identify training vendors and select them based on content and cost, in collaboration with the T&D team.
Monitor the effectiveness of training programmes and measure their impact on the business and on the performance of individuals.
Employee Engagement / Productivity Measurement
Design and drive the engagement activities and associated roadmap.
Act as a bridge between management and employees and create a network to ensure that the values and culture of the studio are respected and encouraged.
Talent Acquisition
Responsible for achieving the annual recruitment plan for the QC structure.
Participate in forecasting manpower requirements for the year and per month/quarter based on business needs, projected attrition and expected movements.
Liaise with unit managers to ensure that manpower is in line with the pyramid structure.
Provide inputs into recruitment plan development, including the fresher to non-fresher mix, channels to be used, etc.
Interview candidates for positions and assess candidate fitment into the role and organization; review the weekly reports and seek inputs in case of major abnormalities; resolve escalated recruitment-related issues.
Planning
Prepare HR key imperatives for the year, covering individual plans for relevant HR processes based on historical data, business requirements and priorities.
Provide inputs on policy changes required within the business line.
Provide inputs for the HR budget to the Director – HR.
Track and review the scores across various metrics. Track adherence to the budget and take corrective actions in case of deviations.
Others
Conduct the orientation programme for new joiners.
Complete joining formalities and documentation.
Ensure employees' inputs are entered in the organization's HRIS on a regular basis.
Maintain various metrics and global reports to monitor HR KPIs such as attrition, availability, level changes, promotions, contract renewals, exits, etc.
Hear and resolve employee grievances and conduct counseling sessions.
Liaise with Group HR for central activities.
These responsibilities are not exhaustive and can be modified in order to reach the company's goals and objectives as well as personal performance.
Qualifications
Experience as an HRBP in technology or gaming companies.
Strong business acumen, the ability to consult on complex organizational challenges, and the ability to perform hands-on in the event of stretch/stabilisation efforts.
Proven experience in managing the HR function of a mid-sized organization.
Experience in handling the end-to-end PMS cycle for a mid-sized organisation.
Minimum of an MBA in HR; qualifications in labour laws and financial management are an additional advantage.
Additional Information
Ubisoft India is an equal opportunity employer and welcomes applications from all interested parties. The studio welcomes and encourages applications from people with disabilities. We thank you for your interest; however, only those candidates selected for an interview will be contacted. No agencies please.
Posted 1 week ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
SQL Lead - COE
Location: Pune
Experience: 6-10 years
We are looking to hire a Data Engineer with strong hands-on experience in SQL and PL/SQL.
Required Skills & Abilities:
6+ years of experience with databases (MS SQL / Oracle / Teradata / Netezza).
4+ years of experience managing teams and client calls.
Strong expertise in writing complex SQL queries, joins, subqueries, and analytical functions.
Hands-on experience with stored procedures, functions, triggers, packages, and cursors.
Understanding of database design principles, normalization, and partitioning.
Knowledge of ETL processes, data migration, and data transformation.
Experience working with Oracle SQL Developer or other database tools.
Ability to analyze requirements and translate them into efficient database solutions.
Familiarity with UNIX/Linux shell scripting for automation (preferred).
Strong problem-solving and debugging skills.
Good communication skills and the ability to work in a collaborative environment.
Key Responsibilities:
Develop, optimize, and maintain PL/SQL stored procedures, functions, triggers, and packages.
Write complex SQL queries, views, and indexes for data manipulation and reporting.
Optimize SQL queries and database performance using indexing, partitioning, and query tuning techniques (see the sketch after this listing).
Ensure data integrity and security by implementing constraints, validations, and best practices.
Work with cross-functional teams to understand business requirements and design efficient database solutions.
Troubleshoot database issues, debug PL/SQL code, and improve query performance.
Implement ETL processes using SQL and PL/SQL.
Perform database schema design, normalization, and optimization.
Collaborate with DBA teams on database backup, recovery, and maintenance.
Develop and maintain database documentation, coding standards, and best practices.
Preferred Qualifications:
Experience with cloud databases (GCP or any other cloud) is a plus.
Exposure to big data technologies like Hadoop and Spark (optional).
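A small sketch of the indexing side of query tuning mentioned in the responsibilities, shown with SQLite via Python so it runs anywhere. The table and column names are invented; the principle (turn a full table scan into an index search on the filter column) carries over to Oracle and the other engines listed.

import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY, customer_id INTEGER,
        amount REAL, created_at TEXT
    )
""")

query = "SELECT SUM(amount) FROM orders WHERE customer_id = ?"

# Before indexing: the planner reports a full table scan.
print(con.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())

# An index on the filter column turns the scan into an index search --
# the core move in the query tuning the listing describes.
con.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
print(con.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())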
Posted 1 week ago
6.0 years
0 Lacs
Delhi, India
Remote
Job Title: Senior Data Modeler
Experience Required: 6+ years
Location: Remote
Employment Type: Full-time / Contract (Remote)
Domain: Data Engineering / Analytics / Data Warehousing
---
Job Summary:
We are seeking an experienced and detail-oriented Data Modeler with a strong background in conceptual, logical, and physical data modeling. The ideal candidate will have in-depth knowledge of Snowflake architecture, data modeling best practices (Star/Snowflake schema), and advanced SQL scripting. You will be responsible for designing robust, scalable data models and working closely with data engineers, analysts, and business stakeholders.
---
Key Responsibilities:
1. Data Modeling:
Design conceptual, logical, and physical data models.
Create and maintain star and snowflake schemas for analytical reporting (see the sketch after this listing).
Perform normalization and denormalization based on performance and reporting requirements.
Work closely with business stakeholders to translate requirements into optimized data structures.
Maintain data model documentation and the data dictionary.
2. Snowflake Expertise:
Design and implement Snowflake schemas with optimal partitioning and clustering strategies.
Perform performance tuning for complex queries and storage optimization.
Implement Time Travel, Streams, and Tasks for data recovery and pipeline automation.
Manage and secure data using Secure Views and Materialized Views.
Optimize usage of Virtual Warehouses and storage costs.
3. SQL & Scripting:
Write and maintain advanced SQL queries, including Common Table Expressions (CTEs), window functions, and recursive queries.
Build automation scripts for data loading, transformation, and validation.
Troubleshoot and optimize SQL queries for performance and accuracy.
Support data migration and integration projects.
---
Required Skills & Qualifications:
6+ years of experience in data modeling and data warehouse design.
Proven experience with the Snowflake platform (minimum 2 years).
Strong hands-on experience in dimensional modeling (star/snowflake schemas).
Expert in SQL and scripting for automation and performance optimization.
Familiarity with tools like Erwin, PowerDesigner, or similar data modeling tools.
Experience working in Agile/Scrum environments.
Strong analytical and problem-solving skills.
Excellent communication and stakeholder engagement skills.
---
Preferred Skills (Nice to Have):
Experience with ETL/ELT tools like dbt, Informatica, or Talend.
Exposure to cloud platforms like AWS, Azure, or GCP.
Familiarity with data governance and data quality frameworks.
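A compact sketch of the star schema the listing refers to: one fact table joined to conformed dimensions by surrogate keys. Table and column names are invented, and the DDL is written generically (exercised here through SQLite) rather than in Snowflake dialect.

import sqlite3

# Invented names; generic dimensional-model DDL, not a Snowflake script.
ddl = """
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240131
    full_date TEXT NOT NULL,
    month     INTEGER NOT NULL,
    year      INTEGER NOT NULL
);
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,  -- surrogate key
    customer_id  TEXT NOT NULL,        -- natural/business key
    segment      TEXT
);
CREATE TABLE fact_sales (
    date_key     INTEGER NOT NULL REFERENCES dim_date (date_key),
    customer_key INTEGER NOT NULL REFERENCES dim_customer (customer_key),
    quantity     INTEGER NOT NULL,
    amount_usd   REAL NOT NULL
);
"""
con = sqlite3.connect(":memory:")
con.executescript(ddl)
print("star schema created")

Keeping the dimensions denormalized keeps analytical joins shallow; snowflaking them back out (for example, splitting segment into its own table) trades that simplicity for reduced redundancy, which is the normalization/denormalization judgment the role calls for.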
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As an Analyst specializing in MySQL database administration, you will be responsible for performing technical research, analysis, design, and architecting of IT database systems. Your primary objective will be to ensure that the documented business and technical requirements are met and implemented to industry best-practice standards. This includes driving short- and long-term architecture strategy for IT database systems, communicating and validating database architecture with technology services and development management teams, and leading technical design discussions with development and project team members to ensure compliance with documented requirements.
Your key responsibilities will involve installing, configuring, maintaining, and managing database systems, schemas, tables, indexes, procedures, and permissions. You will also be tasked with creating backups and performing recovery of MySQL databases, ensuring successful SQL transactional replication, tracking performance metrics of maintained databases, identifying potential capacity and performance issues, and conducting regular performance tuning, indexing, and normalization of database systems. Additionally, you will develop and test SQL scripts for task automation and reporting (see the sketch after this listing), provide monitoring support for critical database servers and supporting services, and meet project deliverables for new technology or changes to existing technology within project requirements.
Furthermore, you will be expected to install, migrate, and administer MySQL to MariaDB, ensure minimal server downtime by scheduling maintenance and improvements during non-production hours, identify opportunities to reduce costs while maintaining functionality and support, improve technology performance and stability, and enhance existing processes to increase productivity. Other responsibilities include managing database patching, database upgrades and migrations, resource monitoring, and requirement analysis, developing and documenting an issue-resolution knowledge base, and providing second- and third-level support for all database systems. Availability for 24/7 on-call support is also a requirement for this role.
To qualify for this position, you should possess a Bachelor's degree in Information Technology, Computer Science, or a related discipline, along with a minimum of five years of experience in database administration. Additionally, you should have at least three years of administrator experience in an IT enterprise organization that follows ITIL methodologies. Proficiency in installing, configuring, integrating, and administering MySQL within the Linux/LAMP stack is essential, along with hands-on experience supporting highly available (HA) database environments with clustering solutions. Experience in systems engineering design, analysis, integration, and life-cycle engineering for large information systems projects is also required.
Preferred qualifications include experience in a multi-tiered architecture environment supporting B2B and/or hosting operations, familiarity with open-source software including the LAMP software bundle, and working knowledge of NoSQL/MongoDB. Strong written and verbal communication skills, the ability to work effectively in a team, and excellent project management abilities are also desirable traits for this role.
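A short sketch of the backup-automation scripting the role describes, wrapping mysqldump from Python. The database name, user, and backup path are placeholders, and in practice credentials would come from an option file or vault rather than the script.

import datetime
import pathlib
import subprocess

# Placeholders -- substitute the real database, user, and backup location.
DB, USER = "appdb", "backup_user"
BACKUP_DIR = pathlib.Path("/var/backups/mysql")

def dump() -> pathlib.Path:
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    stamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
    target = BACKUP_DIR / f"{DB}_{stamp}.sql"
    # --single-transaction takes a consistent InnoDB snapshot without
    # locking tables; --routines includes stored procedures and functions.
    with target.open("w") as out:
        subprocess.run(
            ["mysqldump", "--single-transaction", "--routines",
             f"--user={USER}", DB],
            stdout=out, check=True,
        )
    return target

if __name__ == "__main__":
    print("backup written to", dump())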
Posted 1 week ago