7.0 years
0 Lacs
India
Remote
Job Title: Senior Azure Data Engineer – Remote Contract
Location: Remote
Contract Type: Full-time Contract
Experience Required: 7+ years (including healthcare domain experience)
Duration: 6 months
Start Date: Immediate

About the Role
We are seeking an experienced Azure Data Engineer with a proven track record in healthcare domain projects to join our remote team on a contract basis. The ideal candidate will have strong expertise in Microsoft Azure data services, big data processing, and ETL pipeline development. You will work closely with our analytics, BI, and cloud architecture teams to design, implement, and optimize secure, compliant, and scalable data solutions for healthcare applications.

Key Responsibilities
- Design, develop, and maintain Azure Data Factory pipelines for ETL workflows.
- Build and optimize PySpark/Databricks scripts for large-scale healthcare data processing (an illustrative sketch follows this listing).
- Create and manage data lake and data warehouse solutions using Azure Data Lake Storage Gen2 and Azure Synapse Analytics.
- Integrate data from healthcare-specific sources such as EHR/EMR systems, HL7/FHIR APIs, and other medical data feeds.
- Implement Delta Lake for optimized big data storage and querying.
- Ensure data security, HIPAA compliance, and governance in all data workflows.
- Collaborate with BI teams to deliver analytical dashboards in Power BI for healthcare insights and reporting.
- Participate in Agile/Scrum ceremonies and maintain detailed technical documentation.

Required Skills & Qualifications
- 7+ years of experience in Data Engineering, with at least 3 years in Azure Cloud.
- Mandatory: minimum 2 years of experience in healthcare domain projects, with exposure to healthcare standards (HIPAA, HL7, FHIR, ICD codes).
- Proficiency in Azure Data Factory, Azure Synapse Analytics, Azure Databricks, and Azure Data Lake Storage.
- Strong programming skills in SQL, Python, and PySpark.
- Experience in data modeling (Star/Snowflake schema) and data warehousing concepts.
- Hands-on experience with Delta Lake, Apache Spark, and distributed data processing.
- Familiarity with CI/CD tools like Azure DevOps or GitHub Actions.
- Strong problem-solving skills and ability to work independently in a remote environment.
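As a rough illustration of the pipeline work described above, here is a minimal PySpark sketch of one ETL step that de-duplicates raw records and writes them to a Delta Lake table. The storage paths, column names, and schema are hypothetical, and the sketch assumes a Spark environment (such as Azure Databricks) with Delta Lake and data lake access already configured; it is not the employer's actual pipeline.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims-etl-sketch").getOrCreate()

# Read raw claim records landed in the lake (container, path, and columns are hypothetical).
raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/claims/")

# Basic cleansing: de-duplicate on claim_id and derive a partition column from the service date.
cleaned = (
    raw.dropDuplicates(["claim_id"])
       .withColumn("service_date", F.to_date("service_date"))
       .withColumn("service_year", F.year("service_date"))
)

# Write the curated data as a Delta table, partitioned for efficient downstream queries.
(
    cleaned.write.format("delta")
           .mode("overwrite")
           .partitionBy("service_year")
           .save("abfss://curated@examplelake.dfs.core.windows.net/claims_delta/")
)
```

In a typical Azure Data Factory setup, a notebook or job activity would orchestrate a step like this alongside validation checks and access controls appropriate for regulated healthcare data.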
Posted 8 hours ago
0 years
0 Lacs
India
Remote
Data Science Intern (Paid)
Company: WebBoost Solutions by UM
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship

About WebBoost Solutions by UM
WebBoost Solutions by UM provides aspiring professionals with hands-on experience in data science, offering real-world projects to develop and refine their analytical and machine learning skills for a successful career.

Responsibilities
✅ Collect, preprocess, and analyze large datasets.
✅ Develop predictive models and machine learning algorithms (an illustrative sketch follows this listing).
✅ Perform exploratory data analysis (EDA) to extract meaningful insights.
✅ Create data visualizations and dashboards for effective communication of findings.
✅ Collaborate with cross-functional teams to deliver data-driven solutions.

Requirements
🎓 Enrolled in or a graduate of a program in Data Science, Computer Science, Statistics, or a related field.
🐍 Proficiency in Python for data analysis and modeling.
🧠 Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred).
📊 Familiarity with data visualization tools (Tableau, Power BI, or Matplotlib).
🧐 Strong analytical and problem-solving skills.
🗣 Excellent communication and teamwork abilities.

Stipend & Benefits
💰 Stipend: ₹7,500 – ₹15,000 (performance-based).
✔ Hands-on experience in data science projects.
✔ Certificate of Internship & Letter of Recommendation.
✔ Opportunity to build a strong portfolio of data science models and applications.
✔ Potential for full-time employment based on performance.

How to Apply
📩 Submit your resume and a cover letter with the subject line "Data Science Intern Application."
📅 Deadline: 14th August 2025

Equal Opportunity
WebBoost Solutions by UM is committed to fostering an inclusive and diverse environment and encourages applications from all backgrounds.
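For a sense of the predictive-modelling and EDA tasks listed above, here is a minimal sketch using pandas and scikit-learn. The CSV file name, the numeric feature columns, and the "churned" target are hypothetical placeholders, not part of the internship brief.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Load a tabular dataset (file name and target column are placeholders; features are assumed numeric).
df = pd.read_csv("customer_data.csv")

# Quick exploratory checks before modelling.
print(df.describe())
print(df.isna().sum())

# Minimal preprocessing: drop incomplete rows and separate features from the target.
df = df.dropna()
X = df.drop(columns=["churned"])
y = df["churned"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# Train a baseline classifier and report holdout performance.
model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```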
Posted 8 hours ago
2.0 years
0 Lacs
Delhi, Delhi
On-site
Job Title: Subject Matter Expert (SME)

Job Overview:
The Subject Matter Expert (SME) is responsible for supporting business development and operational efficiency through expert consultation and service delivery. The SME drives revenue growth, ensures customer satisfaction, and enhances internal coordination. Success in this role involves timely client handling, clear communication, and high-quality data management. This role is integral to the Research Support Department, ensuring strategic alignment with the organization's goals.

Responsibilities and Duties:
- Conduct pre-sale calls with clients to understand their requirements and propose suitable services.
- Clarify queries raised by Counsellors and CRMs regarding service offerings.
- Review and validate client documents to ensure accuracy and completeness.
- Address and resolve client concerns or doubts regarding services.
- Clearly explain all aspects of proposed work to clients.
- Assess the feasibility of proposed projects.
- Provide necessary tools or services for the successful implementation of projects.
- Encourage clients to opt for hypothetical data when real data is unavailable.
- Engage in cross-selling of services based on client requirements.
- Promote the organization's complete range of services during client interactions.
- Maintain accurate, error-free data records of all client interactions and pre-sale activities.
- Coordinate with the PRM team to assess the technical viability of proposed work.
- Share vendor approvals and updates with Counsellors and CRMs.
- Explain feedback and comments on writing assignments to clients.
- Coordinate internally with various departments to fulfill project requirements.
- Support team members and assist with publication-related tasks when needed.
- Ensure smooth coordination between Counsellors, CRMs, and clients.
- Resolve operational difficulties faced by Counsellors and CRMs.

Eligibility Requirements
- Qualification: Ph.D.
- Expertise: Biotech, Mathematics, Python, Computer Science, or Medical
- Minimum 2 years of experience in an SME role, with knowledge of biotech, technology, Python, medical topics, and related tools

Please share your CV or referrals with HR:
Sakshi Bhardwaj, Human Resource Department
9821322533 | sakshi.bhardwaj@aimlay.com
408, 4th Floor, D Mall, Sector-10, Rohini, Delhi - 110085

Job Types: Full-time, Permanent
Pay: Up to ₹45,000.00 per month
Benefits: Cell phone reimbursement, internet reimbursement, leave encashment, Provident Fund
Work Location: In person
Posted 8 hours ago
0 years
0 Lacs
India
Remote
About the Company: ZeTheta Algorithms Private Limited is a FinTech start-up which has been recently set up and is developing innovative AI tools. https://www.instagram.com/zetheta.official

About the Role: The Back-End Web Developer Intern will be responsible for server-side application logic, database design, and building secure and efficient APIs. This includes password authentication systems, data processing logic, and integration with AI features.

Responsibilities:
- Design, build, and implement secure RESTful APIs using Node.js + Express.js
- Develop and maintain server-side applications using frameworks such as Django, Flask, Express (Node.js), or Laravel (PHP) (an illustrative Flask sketch follows this listing)
- Implement authentication and authorization systems (Firebase free tier / PostgreSQL)
- Design and manage database schemas using MongoDB Atlas (free tier) or PostgreSQL (Supabase) – an essential skill requirement
- Create efficient data models and database queries
- Develop a blockchain-based certificate verification system
- Implement server-side validation and security measures
- Troubleshoot and optimize backend performance
- Integrate third-party services and external APIs when needed
- Document API endpoints and backend functionality for front-end developers

Qualifications:
- Basic proficiency in one or more backend frameworks (Node.js/Express, Django, Flask, Laravel)
- Programming skills in JavaScript, Python, PHP, or Ruby
- Familiarity with MongoDB, PostgreSQL (Supabase), SQLite, or NoSQL databases
- Understanding of RESTful API design principles
- Basic knowledge of authentication and authorization mechanisms
- Awareness of security concerns and best practices
- Ability to work independently in a remote setting

Benefits:
- Opportunity to build practical skills with modern backend technologies
- Experience working with databases and API design
- Exposure to blockchain technology applications
- Enhance your professional portfolio with real-world projects
- Remote work flexibility

Internship Details:
Duration: Self-paced, with options of 1, 2, 3, 4, or 6 months.
Type: Unpaid
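As a small illustration of the RESTful API and token-based authorization work described above, here is a sketch using Flask, one of the frameworks the posting lists. The token store, certificate records, and endpoint path are in-memory stand-ins for a real database and auth provider; this is not ZeTheta's actual API.

```python
from functools import wraps

from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical in-memory stores standing in for a real database and auth provider.
API_TOKENS = {"demo-token": "intern@example.com"}
CERTIFICATES = {"CERT-001": {"holder": "Jane Doe", "status": "valid"}}


def require_token(view):
    """Reject requests that do not carry a recognised bearer token."""
    @wraps(view)
    def wrapper(*args, **kwargs):
        auth_header = request.headers.get("Authorization", "")
        token = auth_header.removeprefix("Bearer ").strip()
        if token not in API_TOKENS:
            return jsonify({"error": "unauthorised"}), 401
        return view(*args, **kwargs)
    return wrapper


@app.route("/api/certificates/<cert_id>", methods=["GET"])
@require_token
def get_certificate(cert_id):
    """Return a certificate record, mimicking a verification lookup."""
    cert = CERTIFICATES.get(cert_id)
    if cert is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(cert)


if __name__ == "__main__":
    app.run(debug=True)
```

With the server running locally, a request such as `curl -H "Authorization: Bearer demo-token" http://localhost:5000/api/certificates/CERT-001` would return the sample record, while a missing or unknown token receives a 401 response.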
Posted 8 hours ago
7.0 years
0 Lacs
India
On-site
We're hiring: Senior Full Stack Engineer | ReactJS + GenAI

Do you code like a pro AND think like an innovator? We're on the hunt for a Senior Full Stack Engineer who can blend ReactJS mastery with GenAI-powered development to build next-gen, high-performance apps.

Why this role is exciting:
You'll design, build, and scale apps end-to-end, using GenAI tools at every stage — from architecture & prototyping to testing & deployment. Think cutting-edge, cloud-native, microservices, serverless, RAG & multi-agent magic.

You bring:
✅ 7+ years of full stack development experience
✅ Strong ReactJS skills & at least one backend tech (Java / .NET / Python / Node.js)
✅ Cloud (AWS / Azure / GCP) + Docker/Kubernetes + SQL & NoSQL
✅ Hands-on experience with GenAI frameworks (LlamaIndex, LangChain a plus)

Apply now → Send your resume to misbah@jaconsulting.in with the details below:
- Total Experience
- Current CTC
- Expected CTC
- Notice Period

If AI is your co-pilot and code is your craft, this is your runway. 🛫
Posted 8 hours ago
0 years
0 Lacs
India
On-site
About the Company
ZeTheta Algorithms Private Limited (www.zetheta.com) is a FinTech start-up which has been recently set up and is developing innovative AI tools. https://www.instagram.com/zetheta.official/

About the Role
We are seeking a talented and motivated student intern to develop AI models from scratch within a two-month timeframe. This is an extraordinary opportunity for a self-driven, technically skilled student to build a complete enterprise-grade system that incorporates cutting-edge AI technologies, including deep learning.

Responsibilities
- Design and implement the complete AI models following provided specifications
- Develop both frontend and backend components of the platform
- Create and integrate AI modules
- Design and optimize database schemas and queries
- Ensure system security, especially for sensitive identity verification data
- Deploy the solution using zero-cost cloud services
- Document code, APIs, and system architecture
- Complete the entire project within the two-month timeline

Qualifications
A student who is currently pursuing or has recently completed a degree in Computer Science, AI/ML, or a related field.

Required Skills
- Strong programming skills in Python and JavaScript
- Experience with React.js for frontend development
- Knowledge of FastAPI or similar Python web frameworks
- Familiarity with database design and SQL (particularly PostgreSQL)
- Understanding of basic machine learning concepts
- Self-motivated with the ability to work independently

Preferred Skills
- Sound knowledge of AI/ML libraries (TensorFlow, Keras, HuggingFace Transformers)
- Familiarity with computer vision techniques using OpenCV
- Knowledge of natural language processing concepts
- Sound knowledge of facial recognition or emotion detection systems
- Understanding of identity verification methodologies
- Docker containerization experience
- Prior work with cloud deployment (AWS)

Who Should Apply?
- Student or fresh graduate from any academic discipline
- Strong analytical and problem-solving skills
- Basic knowledge of Microsoft Office
- Willingness to self-learn and work in a fast-paced environment

Internship Details
Duration: Self-paced, with options of 1, 2, 3, 4, or 6 months.
Type: Unpaid
Posted 8 hours ago
5.0 years
0 Lacs
India
Remote
Job Title: Azure Data Engineer
Experience: 5+ Years
Location: Remote (1–2 days/month at any client office – PAN India)
Occasional travel: 1–2 days/month to any nearby client office location – Noida, Gurgaon, Indore, Bangalore, Hyderabad, Pune, or Chennai

Job Description:
We are seeking experienced Azure Data Engineers with strong expertise in modern data platforms and cloud technologies. The ideal candidate should have hands-on experience with Azure Databricks, Azure Data Factory (ADF), SQL, Snowflake, Python, and traditional ETL/Data Warehousing concepts.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines using Azure Data Factory and Databricks.
- Perform data ingestion, transformation, and orchestration using Spark and Python.
- Develop and optimize complex SQL queries for data transformation and reporting.
- Integrate with Snowflake and other cloud data platforms for advanced analytics.
- Collaborate with data architects, analysts, and business teams to deliver end-to-end solutions.
- Monitor and troubleshoot performance and reliability issues in data pipelines.
- Ensure data quality, consistency, and security across the data lifecycle.

Required Skills:
- 5+ years of experience in data engineering or related roles.
- Strong hands-on experience with:
  - Azure Databricks (Spark-based transformations)
  - Azure Data Factory (ADF pipelines, triggers, integration runtime)
  - SQL (T-SQL/PL-SQL)
  - Python (for scripting and data processing)
  - ETL tools and Data Warehousing (DWH) concepts
  - Snowflake
- Experience with version control tools like Git and CI/CD practices.
- Good understanding of data modeling, partitioning, and performance tuning.

Nice to Have:
- Knowledge of Azure Synapse, Azure Data Lake Storage (ADLS), Power BI.
- Experience working in Agile delivery environments.
- Azure Data Engineer Associate or equivalent certification.
Posted 8 hours ago
0 years
0 Lacs
India
Remote
Job Title: Software Engineer Associate (Freshers - Internship)
Company: TechKnowledgeHub.org
Location: Remote

About TechKnowledgeHub.org:
TechKnowledgeHub.org is a dynamic and innovative organization dedicated to empowering individuals through technology education. We are committed to bridging the digital divide and providing learning opportunities for all. As we continue to grow, we are excited to offer an excellent learning opportunity for aspiring software developers.

Position Overview:
We are seeking highly motivated and passionate individuals to join us as Software Developer Associates. This is an unpaid internship position designed for freshers who are eager to kickstart their careers in software development. Successful candidates will have the chance to work on real projects, gain hands-on experience, and receive mentorship from industry professionals.

Key Responsibilities:
- Collaborate with the development team to create and maintain software solutions
- Participate in the entire software development life cycle
- Write well-designed, efficient, and testable code
- Assist in troubleshooting, debugging, and optimizing software applications
- Keep abreast of the latest industry trends and technologies

Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- Strong problem-solving skills and attention to detail
- Knowledge of programming languages such as Python, Java, or JavaScript
- Good understanding of software development principles and practices
- Excellent communication and teamwork skills

Internship Duration:
The initial internship period will be unpaid for the first few months. Based on your performance, dedication, and contribution to the team, there is the potential for a transition to a permanent job role with a competitive salary.

Benefits:
- Hands-on experience in software development
- Mentorship from experienced professionals
- Exposure to real-world projects and challenges
- Opportunity for career growth within the organization

How to Apply:
Interested candidates are invited to submit their resumes and cover letters to resume@techknowledgehub.org with the subject line "Software Developer Associate Application - [Your Full Name]". Please include a brief statement about why you are passionate about software development and how this internship aligns with your career goals.

Application Deadline:

TechKnowledgeHub.org is an equal opportunity employer. We encourage candidates from all backgrounds to apply.

Note: Only shortlisted candidates will be contacted for interviews.
Posted 8 hours ago
2.0 - 5.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Company Description
Experian unlocks the power of data to create opportunities for consumers, businesses and society. We gather and analyze data in ways others can't. We help individuals take financial control and access financial services, businesses make smarter decisions and excel, lenders lend more responsibly, and organizations prevent identity fraud and crime. For more than 125 years, we've helped consumers and clients prosper, and economies and communities flourish – and we're not done. Our 17,800 people in 45 countries believe the possibilities for you, and our world, are growing. We're investing in new technologies, accomplished people and progress so we can help create a better tomorrow.

Job Description
Key Responsibilities:
- Design, monitor, and maintain batch data processing pipelines.
- Analyze and validate large volumes of data from multiple sources.
- Ensure data accuracy, consistency, and timely delivery to internal and external stakeholders.
- Collaborate with cross-functional teams to understand data requirements and implement solutions.
- Troubleshoot and resolve issues in batch jobs and data workflows.
- Document data flow processes, logic, and business rules.
- Support automation and optimization of batch processes using scripting and tools.
- Perform root cause analysis for data discrepancies and recommend corrective actions.
- Maintain compliance with data governance and security standards.

Qualifications
Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Statistics, or a related field.
- 2–5 years of experience in data analysis and batch processing.
- Proficiency in SQL and scripting languages (e.g., Python, Shell).
- Excellent communication and documentation skills.
- Ability to work independently and manage multiple tasks simultaneously.

Preferred Qualifications
- Experience with cloud platforms (e.g., AWS, Azure, GCP).
- Knowledge of data visualization tools (e.g., Power BI, Tableau).
- Understanding of data warehousing concepts and tools.

Additional Information
Benefits package includes:
- Great compensation package and discretionary bonus plan
- Core benefits including pension, Bupa healthcare, Sharesave scheme and more
- 25 days annual leave with 8 bank holidays and 3 volunteering days; you can also purchase additional annual leave

Experian Careers - Creating a better tomorrow together
Find out what it's like to work for Experian by clicking here.
Posted 8 hours ago
0 years
0 Lacs
India
Remote
About the Company
ZeTheta Algorithms Private Limited is a FinTech start-up which has been recently set up and is developing innovative AI tools. https://www.instagram.com/zetheta.official

About the Role
As a Data Scientist intern, you will work on cutting-edge projects involving financial data analysis, investment research, and risk modelling. You will have the opportunity to engage in multiple mini-projects or take up a focused innovation-based research project. The project experience is designed to provide practical exposure to data science in the context of asset management, trading, and financial technology. We provide problem statements and methodology; after you submit your solution or model, we share a sample solution that you can use to refine and expand your submission based on its suggestions. You may also opt to develop your own research-based data science solution or model.

Responsibilities
- Conduct data cleaning, wrangling, and pre-processing for financial datasets.
- Assist investment teams in equity research, fixed income research, portfolio management, and economic analysis.
- Apply statistical techniques to financial problems such as credit risk modelling, probability of default, and value-at-risk estimation (an illustrative sketch follows this listing).
- Work with big data sources including financial reports, macroeconomic datasets, and alternative investment data.
- Use Python, Excel, or R to analyse, visualize, and model financial data.
- Participate in research projects related to quantitative trading, financial derivatives, and portfolio optimization.

Who Should Apply?
- Any student, even without coding skills, can upskill through self-learning to develop data science solutions. Some basic knowledge of Excel, Python, or R scripting can help complete the projects quicker. We permit the use of LLMs/NLP tools to help students develop the solutions.
- Strong problem-solving and analytical skills.
- Able to self-learn and work independently in a remote, flexible environment.

Internship Details
Duration: Option of 1, 2, 3, 4, or 6 months
Timing: Self-paced.
Type: Unpaid
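Since one of the responsibilities above mentions value-at-risk estimation, here is a minimal historical-simulation sketch in Python. The price series and position size are invented for illustration only; a real project would use the datasets the internship provides.

```python
import numpy as np
import pandas as pd

# Hypothetical daily closing prices for a single position.
prices = pd.Series(
    [100.0, 101.2, 100.5, 102.3, 101.8, 103.0, 102.1, 104.5, 103.9, 105.2]
)

# Daily simple returns.
returns = prices.pct_change().dropna()

# One-day 95% historical-simulation VaR: the loss level exceeded on only 5% of days.
confidence = 0.95
var_95 = -np.percentile(returns, 100 * (1 - confidence))

position_value = 1_000_000  # hypothetical position size
print(f"1-day 95% VaR: {var_95:.4%} of position value, "
      f"about {var_95 * position_value:,.0f} on a {position_value:,.0f} position")
```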
Posted 8 hours ago
0 years
0 Lacs
India
On-site
About the Company
ZeTheta Algorithms Private Limited is a FinTech start-up which has been recently set up and is developing innovative AI tools. https://www.instagram.com/zetheta.official/

About the Role
We are seeking a talented and motivated student intern for a Data Analyst role. This is an extraordinary opportunity for a self-driven, analytically minded student with a passion for extracting meaningful insights from complex datasets and contributing to data-driven decision making.

About the Internship:
As a Data Analyst intern, you will work on cutting-edge projects involving data collection, processing, analysis, and visualization. You will have the opportunity to engage in multiple mini-projects across various functional areas of the company. The internship is designed to provide practical exposure to real-world data analysis and business intelligence.

Key Responsibilities
As part of the internship, you will work through a structured set of assignments designed to enhance your understanding of data analytics methodologies and applications. Your primary responsibilities will include:

Data Collection & Processing
- Gather data from various sources and databases
- Clean and pre-process data to ensure quality and usability
- Develop and maintain data pipelines
- Implement data validation procedures and quality checks

Data Analysis & Modelling
- Apply statistical methods to interpret datasets
- Identify patterns, trends, and correlations in complex data
- Develop predictive models and conduct hypothesis testing
- Create data-driven solutions for business problems

Data Visualization & Reporting
- Create interactive dashboards and visual reports
- Develop compelling data visualizations to communicate findings
- Present analytical insights to stakeholders
- Automate regular reporting processes

Business Intelligence
- Transform raw data into actionable business insights
- Support decision-making processes with data-backed recommendations
- Monitor key performance indicators and metrics
- Identify opportunities for process optimization

Capstone Project
Complete an end-to-end data analysis project including: problem definition, data collection and preparation, analysis and modelling, visualization of results, and recommendations based on findings.

What You Will Learn
- Practical experience with data analysis tools and techniques
- Hands-on skills in data visualization and dashboard creation
- Strong analytical and problem-solving abilities
- Effective communication of technical findings to non-technical audiences
- Knowledge of AI and machine learning applications in data analytics

Who Should Apply?
- Student or fresh graduate from any academic discipline
- Strong analytical and quantitative skills
- Basic knowledge of data analysis tools (Excel, SQL, Python, or R)
- Interest in working with data and deriving insights
- Willingness to self-learn and work in a fast-paced environment

Internship Details
Duration: Self-paced. Option of 1 month or 2 months within a period of 4 months provided.
Type: Unpaid
Posted 9 hours ago
0.0 - 2.0 years
0 Lacs
India
Remote
Role: Junior DevOps Engineer
Location: Remote, India
Timings: Full Time (as per company timings)
Notice Period: Within 15 days or immediate joiner
Experience: 0–2 years

About The Role
Must-have skills:
- Good with Linux/Windows
- Good understanding of Azure/AWS services
- Good knowledge of and experience in Ansible
- Good knowledge of Docker and Kubernetes

Roles And Responsibilities
- You will be responsible for maintaining DevOps practices in the data centre and cloud.
- As a DevOps Engineer, you will need to understand DevOps practices, IT, microservices, Kubernetes, Docker, Jenkins, and monitoring strategies.
- Scripting experience is an added advantage, i.e. Dockerfiles, CI/CD pipelines, shell, and Python scripting.
- Assist in tasks for CI/CD tools like Jenkins, GitLab, and many others.
- Support automation and infrastructure-as-code (IaC) tools.
- Work with the Linux operating system and cloud environments.
- Work on ways to automate and improve development and release processes.
- Ensure that systems are secure against cybersecurity threats.

Other Personal Characteristics
- Dynamic, engaging, self-reliant developer
- Ability to deal with ambiguity
- Collaborative and analytical approach
- Self-confident and humble
- Open to continuous learning
- Intelligent, rigorous thinker who can operate successfully amongst bright people
- Equally comfortable and capable interacting with technologists as with business executives
Posted 9 hours ago
6.0 - 10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
IMMEDIATE JOINERS ONLY

Job Title: Product Owner
Location: Hyderabad (Hybrid)
Experience: 6–10 years
Mandatory Domain: Securities settlements
Mandatory Domain Knowledge: Capital Market/OTC
Please note that recent, strong experience in both the securities and capital markets domains is mandatory.

Roles and Responsibilities
- Working in agile environments with a strong understanding of agile delivery frameworks and product management
- Delivery of complex products to clients within the financial industry
- Experience as a Product Manager in the financial sector, ideally in the Investment Bank and Securities domain
- Experience working with applications supporting securities settlements & confirmations, preferably US securities settlements through DTCC etc.
- Experience working with settlement & confirmations business/operations stakeholders, documenting as-is and to-be business & process flows, process engineering etc.
- Setting and delivering on outcome-focused goals, such as using OKRs (Objectives and Key Results)
- Team player with an enthusiastic and proactive personality, a true agile mindset, and strong analytical and problem-solving skills
- Strong active listening and communication skills to build networks and partnerships at all levels

Data Analytics
- Alteryx, SQL, and Python experience
- Experience working with and understanding data
Posted 9 hours ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Company Description
Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, create digital marketing solutions, and gain deeper insights into the automotive market, all using our unique combination of data, analytics and software. We also assist millions of people to realise their financial goals and help them to save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agri finance, insurance, and many more industry segments. We invest in experienced people and new advanced technologies to unlock the power of data and to innovate. A FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 23,300 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com.

Job Description
Data Engineer
We are looking for a passionate Data Engineer to join one of our experienced IT teams within Experian.

The Team:
As a member of one of our core development product teams, you will develop products that underpin the BI Commercial Credit offering. You will be involved in multiple tasks, from database maintenance to application development, using several languages and tools. You will work with your team colleagues and will also support other parts of Experian, such as teams from other departments, client delivery analysts, database administrators, and other development teams working on projects that use your data. You will manage the security of that data and the platforms within the team. You will report to a Manager.

Our Tech:
We're a proud tech team that maintains large databases serving the Experian BI Delivery Team. Our expertise spans not just data management but also application development to package and publish this data. We use Oracle and Exadata to host our data, and PL/SQL alongside Python in our ETL processes. Our applications are developed in Python or Java and hosted in the AWS cloud. We are involved in every stage of development, from design to testing and final deployment. We use the tools provided by Experian (Jenkins, Harness) to ensure an efficient DevOps approach to automated deployment.

Qualifications
Technical Skills:
- Oracle SQL/PL/SQL experience on enterprise databases
- Data engineering for ETL of large datasets
- Java and Python knowledge
- Knowledge of AWS Cloud for data storage and API hosting
- CoaaS/Kubernetes
- CI/CD tools, i.e. Jenkins, Harness, Artifactory

Experience/Skills:
- Establish trust-based relationships
- Design solutions and communicate with partners to make the best decision
- Coach, help, and review code for other developers or teams
- Willingness to develop and share the knowledge acquired with junior colleagues
- Improve existing data engineering processes or methodologies to increase effectiveness
- Knowledge of working with and modelling financial or credit data (desirable)
- Experience working in a financially regulated environment (desirable)

Keywords: Oracle PL/SQL, data engineer, database developer

Additional Information
Our uniqueness is that we celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what matters: DEI, work/life balance, development, authenticity, engagement, collaboration, wellness, reward & recognition, volunteering... the list goes on.
Experian's people-first approach is award-winning: Great Place To Work™ in 24 countries, FORTUNE Best Companies to Work For, and Glassdoor Best Places to Work (globally 4.4 stars), to name a few. Check out Experian Life on social or our Careers Site to understand why.
Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, color, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity.

Experian Careers - Creating a better tomorrow together

Benefits
Experian cares for employees' work-life balance, health, safety and wellbeing. To support this endeavour, we offer the best family wellbeing benefits, enhanced medical benefits and paid time off.

Find out what it's like to work for Experian by clicking here.
Posted 9 hours ago
0 years
0 Lacs
Gwalior, Madhya Pradesh, India
On-site
Required on an Immediate Basis
Location: Gwalior
Trainee – College Students

Key Responsibilities:
- Curriculum Development: Designing and developing comprehensive training programs covering various aspects of Full Stack Development, including front-end (HTML, CSS, JavaScript, frameworks like React, Angular, or Vue), back-end (Node.js, Python, Java, databases like PostgreSQL or MongoDB), and deployment strategies.
- Training Delivery: Conducting engaging and effective training sessions, both in-person and online, using a variety of teaching methodologies to cater to different learning preferences.
- Training Material Creation: Developing and updating training materials, including presentations, hands-on exercises, quizzes, and assessments, ensuring they are relevant, up-to-date, and aligned with industry best practices.
- Progress Tracking and Assessment: Monitoring and evaluating the progress of trainees, providing constructive feedback, and identifying areas for improvement.
- Staying Updated: Staying abreast of the latest trends and technologies in Full Stack Development, participating in relevant workshops and conferences, and incorporating new knowledge into training programs.
- Collaboration: Working closely with other teams, such as development, product, and QA, to understand training needs and ensure alignment with project requirements.
- Mentorship: Providing mentorship and guidance to trainees, helping them overcome technical challenges and fostering their growth as developers.
- Technical Support: Assisting with technical roadblocks faced by the development team, ensuring smooth project execution.

Required Skills and Qualifications:
- Technical Expertise: Strong proficiency in Full Stack Development, including front-end and back-end technologies, databases, and deployment strategies.
- Communication Skills: Excellent verbal and written communication skills, with the ability to explain complex technical concepts clearly and concisely.
- Presentation Skills: Ability to deliver engaging and informative presentations to diverse audiences.
- Instructional Design: Experience in designing and developing training materials, including lesson plans, presentations, and assessments.
- Problem-Solving Skills: Ability to identify and resolve technical issues encountered by trainees.
- Adaptability: Ability to adapt training methodologies and materials to different learning styles and technical backgrounds.
- Mentorship Skills: Ability to guide and mentor trainees, fostering their professional development.
Posted 9 hours ago
10.0 years
0 Lacs
Kochi, Kerala, India
On-site
What You'll Do
- Architect and deliver high-performance, scalable, and maintainable software systems across cloud and on-prem environments.
- Design end-to-end solutions leveraging modern microservices, hybrid monolithic architectures, and reactive patterns.
- Champion best practices in backend engineering using Java, Python, Node.js, and PHP, and build seamless integrations with Angular/React frontends.
- Lead and mentor development teams, helping them adopt new technologies, tools, and frameworks through PoCs and hands-on guidance.
- Ensure application resilience, performance, and fault tolerance during deployment and scaling.
- Drive excellence in DevOps, CI/CD pipelines, and cloud infrastructure (AWS, GCP, Azure).
- Own software architecture artifacts, from HLD to LLD and SAD documentation.
- Collaborate across teams to build cloud-native SaaS/PaaS platforms, implement SSO/identity management, and maintain high code quality through automation.
- Architect robust data systems, from SQL and NoSQL to graph DBs, and manage asynchronous/event-driven communication.

What We're Looking For
- 10+ years of experience in software architecture, backend systems, and full-stack development.
- Proven expertise in designing distributed, load-bearing systems and migrating monoliths to microservices.
- Strong knowledge of Domain-Driven Design (DDD), TDD/BDD, CQRS, and container orchestration.
- Hands-on experience with DevOps, cloud platforms, and implementing robust code quality measures.
- Strategic thinker with the ability to estimate, plan, and drive technical execution.
- Solid experience in implementing scalable, secure, and resilient enterprise systems.

Preferred Profile
- Bachelor's degree in Engineering or equivalent.
- A consistent track record of technical leadership and delivery success.
- Passion for continuous learning, innovation, and mentoring teams.
- Strong communication and stakeholder management skills.
Posted 9 hours ago
3.0 years
0 Lacs
Kozhikode, Kerala, India
On-site
🔍 Job Title: Data Analyst (Mid-Senior Level)
📍 Location: Tiruppur/Perundurai
🕒 Experience: 2–3 Years
🗣️ Language: Hindi (Preferred)
📄 Employment Type: Full-Time

🚀 About the Role
Are you passionate about turning data into powerful business insights? We're looking for a Data Analyst who thrives on numbers, loves automation, and can create impact through clear, data-driven decisions. If you have 2–3 years of experience and a sharp eye for detail, this is your chance to shine! 🌟

📊 What You'll Do
🔍 Analyze large datasets to identify trends, patterns, and actionable insights
📈 Build interactive dashboards and reports using Power BI & Google Sheets
🤖 Automate tasks and streamline workflows with Python scripts & Excel macros
🤝 Collaborate with teams to gather requirements and translate them into data solutions
🧹 Ensure data accuracy, consistency, and integrity across systems
🗣 Present findings clearly to both technical and non-technical stakeholders

💡 What You Bring
✅ Advanced Google Sheets – complex formulas, data validation, pivot tables, scripts
✅ Advanced Excel – functions, lookups, pivot charts, VBA/macros
✅ Power BI (Intermediate) – dashboards, DAX, data modeling
✅ Python (Intermediate) – data analysis with Pandas, NumPy, etc.
🎯 Strong analytical thinking and attention to detail
🗣 Fluent in English; Hindi proficiency preferred

🎓 Qualifications
- Bachelor's degree in Statistics, Math, Computer Science, or a related field
- 2–3 years of hands-on experience in a data-focused role

🙌 Why Join Us?
🚀 Fast-paced, collaborative work environment
🧠 Opportunities to grow and learn new tools

Ready to make your next big career move? Apply now and let data tell your story! 📩
Posted 9 hours ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Work Level: Individual
Core: Communication Skills, Problem Solving, Execution
Leadership: Decisive, Team Alignment, Working Independently
Industry Type: IT Services & Consulting
Function: Data Analyst
Key Skills: MySQL, Python, Big Data, Data Science, Data Analytics, Data Analysis, Cloud, AWS, Business Intelligence (BI), Statistical Modeling, R, Big Data Platforms, Tableau
Education: Graduate

Note: This is a requirement for one of Workassist's hiring partners.

Requirements:
- Currently pursuing or recently completed a degree in Data Science, Statistics, Mathematics, Computer Science, or a related field.
- Strong analytical and problem-solving skills.
- Proficiency in Excel and SQL for data analysis.
- Experience with data visualization tools like Power BI, Tableau, or Google Data Studio.
- Basic knowledge of Python or R for data analysis is a plus.
- Understanding of statistical methods and data modeling concepts.
- Strong attention to detail and ability to work independently.
- Excellent communication skills to present insights clearly.

Preferred Skills:
- Experience with big data technologies (Google BigQuery, AWS, etc.).
- Familiarity with machine learning techniques and predictive modeling.
- Knowledge of business intelligence (BI) tools and reporting frameworks.

Company Description
Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000 recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job-seeking experience by leveraging technology and matching job seekers with the right employers.

For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2
(Note: There are many more opportunities apart from this on the portal. Depending on your skills, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 9 hours ago
1.0 years
1 - 5 Lacs
Thevara, Kochi, Kerala
On-site
Job Title: Full Stack Developer
Company: Matrix Sec Cyber Solutions LLP
Location: Kochi, Kerala
Employment Type: Full-Time (On-site)
Experience Required: Minimum 1 Year in Full Stack Development

About Matrix Sec
Matrix Sec Cyber Solutions LLP is a Kochi-based company specializing in cybersecurity, application and web development, analytics, and digital reputation management. We merge innovation with reliability to deliver secure, scalable, and high-performance digital ecosystems for clients across industries.
Website: www.matrixsec.in

Position Overview
We are seeking a skilled and detail-oriented Full Stack Developer to join our technical team. The ideal candidate will be proficient in both front-end and back-end development, capable of building robust, scalable, and secure applications from start to finish.

Key Responsibilities
- Design, develop, and maintain both front-end and back-end components of applications
- Build responsive, user-friendly UIs using HTML, CSS, JavaScript, and frameworks such as React, Angular, or Vue.js
- Develop and manage back-end logic using Node.js, Python, PHP, or Java
- Create and manage RESTful and/or GraphQL APIs for seamless data flow
- Handle database design and queries (MySQL, PostgreSQL, MongoDB, etc.)
- Optimize applications for performance, scalability, and security
- Collaborate with designers, developers, and project managers to deliver end-to-end solutions
- Implement security best practices, including authentication, authorization, and data protection
- Troubleshoot and resolve cross-platform issues
- Maintain clear documentation of code, modules, and architecture

Required Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field
- Minimum 1 year of professional full stack development experience
- Proficiency in front-end technologies (HTML, CSS, JavaScript) and frameworks (React, Angular, or Vue.js)
- Proficiency in at least one back-end language (Node.js, Python, PHP, Java)
- Experience with SQL/NoSQL databases and ORM frameworks
- Familiarity with Git or other version control tools
- Strong problem-solving and debugging skills
- Understanding of API security (JWT, OAuth2) and deployment tools

Bonus Skills:
- Experience with Docker, Kubernetes
- Knowledge of CI/CD pipelines
- Familiarity with cloud services (AWS, Azure, GCP)

Interview Details
Date: Wednesday, 10 September 2025
Time: 11:00 AM IST
Venue: Matrix Sec Cyber Solutions LLP
Address: 2nd Floor, Zareen Complex, Luiz Lane, near Thevara Market, Perumanoor, Kochi, Ernakulam, Kerala – 682015
Note: This is an in-person interview. Please arrive at the scheduled time.
Contact: +91 9746970442

How to Apply
Send your resume to: info@matrixsec.in
Subject Line: Application – Full Stack Developer – September 2025

Job Type: Full-time
Pay: ₹100,000.00 - ₹500,000.00 per year
Work Location: In person
Posted 9 hours ago
11.0 - 15.0 years
0 Lacs
Pune/Pimpri-Chinchwad Area
Remote
Our Mission
At Palo Alto Networks® everything starts and ends with our mission: Being the cybersecurity partner of choice, protecting our digital way of life. Our vision is a world where each day is safer and more secure than the one before. We are a company built on the foundation of challenging and disrupting the way things are done, and we're looking for innovators who are as committed to shaping the future of cybersecurity as we are.

Who We Are
We take our mission of protecting the digital way of life seriously. We are relentless in protecting our customers and we believe that the unique ideas of every member of our team contribute to our collective success. Our values were crowdsourced by employees and are brought to life through each of us every day – from disruptive innovation and collaboration, to execution. From showing up for each other with integrity to creating an environment where we all feel included.

As a member of our team, you will be shaping the future of cybersecurity. We work fast, value ongoing learning, and we respect each employee as a unique individual. Knowing we all have different needs, our development and personal wellbeing programs are designed to give you choice in how you are supported. This includes our FLEXBenefits wellbeing spending account with over 1,000 eligible items selected by employees, our mental and financial health resources, and our personalized learning opportunities - just to name a few!

At Palo Alto Networks, we believe in the power of collaboration and value in-person interactions. This is why our employees generally work full time from our office, with flexibility offered where needed. This setup fosters casual conversations, problem-solving, and trusted relationships. Our goal is to create an environment where we all win with precision.

Job Description
Your Career
Prisma Access™ (formerly GlobalProtect Cloud Service) provides protection straight from the cloud to make access to the cloud secure. It combines the connectivity and security you need and delivers it everywhere you need it, using cutting-edge public and private cloud technologies to extend next-generation security protection to all cloud services, customers' on-premise remote networks, and mobile users.

We are seeking an experienced Software Engineer to design, develop and deliver next-generation technologies within our Prisma Access team. We want passionate engineers who love to code and build great products, engineers who bring new ideas to all facets of software development. We are looking for leaders who take ownership of their areas of focus and who are driven to solve problems at every level. Collaboration and teamwork are at the foundation of our culture, and we need engineers who can communicate at a high level and work well with others towards achieving a common goal.

Your Impact
- Design and implement new features and integrations for virtualization features across diverse cloud environments and deployments.
- Engage in all phases of the product development cycle, from concept definition and design through implementation and testing.
- Develop comprehensive functional specifications, evaluate task requirements and timelines, and contribute to design, development, debugging, and support processes.
- Hands-on experience with virtualization technologies, various hypervisors, system software, and networking.
- A customer-first mindset is required, along with being a very good team player.
- Be a cultural champion and role model for others, showcasing the org values.
- Work with different development and quality assurance groups to achieve the best quality.
- Work with DevOps and technical support teams to troubleshoot and fix customer-reported issues.
- Work with other software development teams to apply PAN-OS features on Prisma Access.

Qualifications
Your Experience
- Bachelors/Masters in Computer Science or a related field required
- 11-15 years of experience in developing data-plane applications
- Proficiency in C/C++ programming languages, with a strong emphasis on Linux
- Strong data structures/algorithms and debugging skills
- Strong knowledge in network security fields like stateful firewall, packet processing, and network ACLs
- Experience with building applications in the cloud
- Experience developing modules based on the network stack, OSI and TCP/IP models, L4 - L7
- Nice to have: hands-on programming experience in Python and Go
- In-depth understanding of Operating System principles and OSs like Linux/Unix
- In-depth understanding of networking concepts and the TCP/IP stack, TLS
- Exposure to building microservices
- Experience with virtualization platforms (e.g., VMware, OpenStack, Kubernetes) is a plus
- Experience with deployment on cloud environments (OCI/AWS/GCP)
- Enjoys working with many different teams, with strong collaboration and communication skills
- Solid foundation in design, data structures, and algorithms, and strong analytical and debugging skills
- Experience in mentoring and guiding junior team members in high-performing teams

Additional Information
The Team
Our engineering team is at the core of our products – connected directly to the mission of preventing cyberattacks. We are constantly innovating – challenging the way we, and the industry, think about cybersecurity. Our engineers don't shy away from building products to solve problems no one has pursued before. We define the industry, instead of waiting for directions. We need individuals who feel comfortable in ambiguity, excited by the prospect of a challenge, and empowered by the unknown risks facing our everyday lives that are only enabled by a secure digital environment.

Our Commitment
We're problem solvers that take risks and challenge cybersecurity's status quo. It's simple: we can't accomplish our mission without diverse teams innovating, together. We are committed to providing reasonable accommodations for all qualified individuals with a disability. If you require assistance or accommodation due to a disability or special need, please contact us at accommodations@paloaltonetworks.com.

Palo Alto Networks is an equal opportunity employer. We celebrate diversity in our workplace, and all qualified applicants will receive consideration for employment without regard to age, ancestry, color, family or medical care leave, gender identity or expression, genetic information, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran status, race, religion, sex (including pregnancy), sexual orientation, or other legally protected characteristics.

All your information will be kept confidential according to EEO guidelines.
Posted 9 hours ago
13.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Title: Senior Data Engineer (AWS, Python, SQL, Big Data)
Experience: 8 – 13 Years
Location: Gurgaon
Interview Mode: Face-to-Face Drive — 30th August
Domain Preference: Financial services industry experience preferred
Notice Period: 60 to 90 Days
Joining: Immediate joiners will also be considered if available

Position Overview
We are seeking an experienced Senior Data Engineer with strong expertise in AWS, Python, SQL, and Big Data technologies, along with modern data pipeline development experience. The successful candidate will design, develop, and maintain scalable data engineering solutions, contribute to CI/CD delivery pipelines, and play a key role in analytics and data-driven development within a fast-paced enterprise environment.

Key Capabilities
- Passion for technology and keeping up with the latest trends
- Ability to articulate complex technical issues and system enhancements
- Proven analytical and evidence-based decision-making skills
- Strong problem-solving, troubleshooting, and documentation abilities
- Excellent written and verbal communication skills
- Effective collaboration and interpersonal skills
- High delivery focus with commitment to quality and auditability
- Ability to self-manage and work in a fast-paced environment
- Agile software development practices

Desired Skills & Experience
- Hands-on experience in SQL and Big Data SQL variants (HiveQL, Snowflake ANSI, Redshift SQL)
- Expertise in Python, Spark (PySpark, Spark SQL, Scala), and Bash/Shell scripting
- Experience with source code control tools (GitHub, VSTS, BitBucket)
- Familiarity with Big Data technologies: the Hadoop stack (HDFS, Hive, Impala, Spark) and cloud warehouses (AWS Redshift, Snowflake)
- Unix/Linux command-line experience
- AWS services exposure: EMR, Glue, Athena, Data Pipeline, Lambda
- Knowledge of Data Models (Star Schema, Data Vault 2.0)

Essential Experience
- 8–13 years of technical experience, preferably in the financial services industry
- Strong background in Data Engineering/BI/Software Development, ELT/ETL, and data transformation in Data Lake / Data Warehouse / Lake House environments
- Programming with Python, SQL, Unix Shell scripts, and PySpark in enterprise-scale environments
- Experience in configuration management (Ansible, Jenkins, Git)
- Cloud design and development experience with AWS and Azure
- Proficiency with AWS services (S3, EC2, EMR, SNS, SQS, Lambda, Redshift)
- Building data pipelines on Databricks Delta Lake from databases, flat files, and streaming sources
- CI/CD pipeline automation (Jenkins, Docker)
- Experience with Terraform, Kubernetes, and Docker
- RDBMS experience: Oracle, MS SQL, DB2, PostgreSQL, MySQL – including performance tuning and stored procedures
- Knowledge of Power BI (recommended)

Qualification Requirements
Bachelor's or Master's degree in a technology-related discipline (Computer Science, IT, Data Engineering, etc.)

Key Accountabilities
- Design, develop, test, deploy, maintain, and improve software and data solutions
- Create technical documentation, flowcharts, and layouts to define solution requirements
- Write clean, high-quality, testable code
- Integrate software components into fully functional platforms
- Apply best practices for CI/CD and cloud-based deployments
- Mentor other team members and share data engineering best practices
- Troubleshoot, debug, and upgrade existing solutions
- Ensure compliance with industry standards and regulatory requirements

Interview Drive Details
Mode: Face-to-Face (F2F)
Date: 30th August
Location: Gurgaon
Notice Period: 60 to 90 Days (immediate joiners considered as a plus)
Posted 9 hours ago
4.0 - 11.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Hello,

Greetings from Quess Corp! Hope you are doing well. We have a job opportunity with one of our clients.

Designation: Data Engineer
Location: Gurugram
Experience: 8 to 15 years
Qualification: Graduate / PG (IT)
Skill Set: Data Engineer, Python, AWS, SQL

Essential Capabilities
- Enthusiasm for technology, keeping up with the latest trends
- Ability to articulate complex technical issues and desired outcomes of system enhancements
- Proven analytical skills and evidence-based decision making
- Excellent problem solving, troubleshooting & documentation skills
- Strong written and verbal communication skills
- Excellent collaboration and interpersonal skills
- Strong delivery focus with an active approach to quality and auditability
- Ability to work under pressure and excel within a fast-paced environment
- Ability to self-manage tasks
- Agile software development practices

Desired Experience
- Hands-on in SQL and its Big Data variants (Hive-QL, Snowflake ANSI, Redshift SQL)
- Python and Spark and one or more of its APIs (PySpark, Spark SQL, Scala), Bash/Shell scripting
- Experience with source code control – GitHub, VSTS, etc.
- Knowledge of and exposure to Big Data technologies: the Hadoop stack such as HDFS, Hive, Impala, Spark, etc., and cloud Big Data warehouses – RedShift, Snowflake, etc.
- Experience with UNIX command-line tools
- Exposure to AWS technologies including EMR, Glue, Athena, Data Pipeline, Lambda, etc.
- Understanding and ability to translate/physicalise Data Models (Star Schema, Data Vault 2.0, etc.)

Essential Experience
It is expected that the role holder will most likely have the following qualifications and experience:
- 4–11 years of technical experience (within the financial services industry preferred)
- Technical domain experience (subject matter expertise in technology or tools)
- Solid experience, knowledge and skills in Data Engineering and BI/software development, such as ELT/ETL, data extraction and manipulation in Data Lake/Data Warehouse/Lake House environments
- Hands-on programming experience writing Python, SQL, Unix shell scripts, and PySpark scripts in a complex enterprise environment
- Experience in configuration management using Ansible/Jenkins/Git
- Hands-on cloud-based solution design, configuration and development experience with Azure and AWS
- Hands-on experience using AWS services – S3, EC2, EMR, SNS, SQS, Lambda functions, Redshift
- Hands-on experience building data pipelines to ingest and transform on the Databricks Delta Lake platform from a range of data sources – databases, flat files, streaming, etc.
- Knowledge of data modelling techniques and practices used for a Data Warehouse/Data Mart application
- Quality engineering development experience (CI/CD – Jenkins, Docker)
- Experience in Terraform, Kubernetes and Docker
- Experience with source control tools – GitHub or BitBucket
- Exposure to relational databases – Oracle, MS SQL or DB2 (SQL/PLSQL, database design, normalisation, execution plan analysis, index creation and maintenance, stored procedures), PostgreSQL/MySQL
- Skilled in querying data from a range of data sources that store structured and unstructured data
- Knowledge or understanding of Power BI (recommended)

Key Accountabilities
- Design, develop, test, deploy, maintain and improve software
- Develop flowcharts, layouts and documentation to identify requirements & solutions
- Write well-designed, high-quality, testable code
- Produce specifications and determine operational feasibility
- Integrate software components into a fully functional platform
- Apply proactively and perform hands-on design and implementation of best-practice CI/CD
- Coach and mentor other Service Team members
- Develop/contribute to software verification plans and quality assurance procedures
- Document and maintain software functionality
- Troubleshoot, debug and upgrade existing systems, including participating in DR tests
- Deploy programs and evaluate customer feedback
- Contribute to team estimation for delivery and expectation management for scope
- Comply with industry standards and regulatory requirements
Posted 9 hours ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Back End Developer / Full Stack Developer
- Hands-on experience working with insurance claims modeling, preferably P&C.
- Good knowledge of insurance terminologies.
- Knowledge of statistical distributions and their application.
- Experience with various statistical modeling techniques (stochastic modeling, Monte Carlo simulation, regression, etc.).
- Skilled in MS Office tools such as Excel and PowerPoint.
- Basic knowledge of VBA and Python.
- Strong problem-solving and communication skills.
- Positive attitude to execute work with quality, and flexibility with changing priorities.
Posted 9 hours ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Front End - Web UI Developer
- Hands-on experience working with insurance claims modeling, preferably P&C.
- Good knowledge of insurance terminologies.
- Knowledge of statistical distributions and their application.
- Experience with various statistical modeling techniques (stochastic modeling, Monte Carlo simulation, regression, etc.).
- Skilled in MS Office tools such as Excel and PowerPoint.
- Basic knowledge of VBA and Python.
- Strong problem-solving and communication skills.
- Positive attitude to execute work with quality, and flexibility with changing priorities.
Posted 9 hours ago
5.0 - 8.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
This position is for a Manager role in the Insurance Predictive Analytics domain for one of our clients. The key responsibility will be to support the client's Insurance Analytics team from Gurgaon in undertaking various analytics and data science projects.

The Job Responsibilities Include The Following
- Clearly setting project objectives with the client – take the initiative to identify opportunities and develop problem statements to be worked upon.
- Data extraction, cleansing and manipulation – handle large volumes of data, research variables, and work with structured/unstructured data.
- Predictive modelling – development of models using appropriate predictive analytics techniques.
- Model documentation – clear and detailed documentation of the modelling procedures.
- Participate in various other analytics projects and work on ad-hoc requests relating to data extraction and visualization as per client needs.

Modeling / Data Science:
- Model training and development, model monitoring, refitting models, diagnostics/analysis and publishing the model
- Model pipeline automation via Airflow, and documentation
- Development of dashboards containing model validation and monitoring KPIs

We are also looking for a go-getter who can define:
- Analytic strategies to meet the demands of business requirements
- Technical requirements of the analytic solutions
- Data requirements of the analytic solution processes

The person will be part of the Data Science team for a major insurance client. He/she will work with different stakeholders as an SME for data science, engage in technical problem solving across multiple technologies, and will often need to develop new solutions. A typical work day will involve working with stakeholders as an individual contributor.

A suitable candidate should have 5–8 years of experience in a similar role and should possess a go-getter attitude. He/she should be able to deal with ambiguity. Experience in insurance is preferred but not mandatory.

Personal Qualifications
- Deep understanding of analytics
- Strong communicator and team player; should have experience in client interactions
- Experience working in a multi-cultural environment
- Questions existing processes and suggests improvements

Required Education And Skills
- Academic background in science (Mathematics, Physics, Engineering, Statistics, Economics, Actuarial Science, etc.)
- Strong IT skills – hands-on experience with machine learning, data analysis and data preparation tools like Python, PySpark, SQL and Airflow, and good knowledge of advanced analytics / machine learning / statistical / data mining / text mining techniques in regression or classification
- Experience working with generalized linear models and the machine learning model development process is essential (an illustrative sketch follows this listing)
- Working knowledge of AWS services like S3, EC2, EMR, Airflow
- Working knowledge of Snowflake and Power BI
- Experience in the P&C insurance industry is preferable
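As an illustrative sketch of the generalized linear modelling experience called out above, here is a small Poisson claim-frequency GLM in Python using statsmodels. The rating factors, simulated data, and coefficients are hypothetical; they stand in for the real policy and claims data such a role would work with.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated policy-level data: exposure in years, two rating factors, claim counts.
rng = np.random.default_rng(0)
n = 5_000
policies = pd.DataFrame({
    "exposure": rng.uniform(0.2, 1.0, n),
    "vehicle_age": rng.integers(0, 15, n),
    "urban": rng.integers(0, 2, n),
})
true_rate = np.exp(-2.0 + 0.05 * policies["vehicle_age"] + 0.4 * policies["urban"])
policies["claims"] = rng.poisson(true_rate * policies["exposure"])

# Poisson GLM for claim frequency, with log(exposure) as an offset.
X = sm.add_constant(policies[["vehicle_age", "urban"]])
freq_model = sm.GLM(
    policies["claims"],
    X,
    family=sm.families.Poisson(),
    offset=np.log(policies["exposure"]),
)
print(freq_model.fit().summary())
```

The offset term keeps the model on a per-unit-of-exposure basis, which is the usual convention for frequency modelling in P&C pricing.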
Posted 9 hours ago