2.0 - 6.0 years
0 Lacs
gwalior, madhya pradesh
On-site
As a Team Lead for a team of developers, you will provide guidance, conduct code reviews, and offer mentorship to ensure the team's success. Your role will involve architecting, designing, developing, and maintaining web applications using PHP and relevant frameworks. Collaboration with project managers, designers, and QA teams is essential to deliver high-quality products within specified timelines. In addition to team management and project collaboration, you will manage server configurations, deployments, and backups, and ensure application security and performance meet industry standards. You will also establish coding standards and best practices to uphold code quality and reusability. Troubleshooting and debugging existing applications, and identifying areas for enhancement, will be part of your routine tasks.

You should possess a Bachelor's degree in Computer Science or a related field, or equivalent experience, along with a minimum of 4 years of PHP development experience, including at least 2 years in a team lead or senior role. Proficiency in PHP, MySQL, and modern frameworks such as Laravel, CodeIgniter, Symfony, and OpenCart is required. Strong familiarity with REST APIs, AJAX, and third-party integrations is essential, as is a good grasp of front-end technologies like HTML5, CSS3, JavaScript, jQuery, and Bootstrap. Experience in server management (Linux, Apache/Nginx, VPS, cPanel, SSL, firewalls, etc.) and knowledge of domain booking, renewal, and DNS administration are expected. Familiarity with cloud platforms like AWS, DigitalOcean, or similar would be advantageous. Proficiency in Git, version control systems, and deployment tools, plus strong problem-solving skills, are necessary for this role. The ability to work both independently and collaboratively within a team environment is crucial for the success of critical web applications with 24/7 uptime requirements.
Posted 1 day ago
8.0 years
0 Lacs
Gujarat, India
On-site
Job Summary
We are seeking a highly skilled and motivated Lead DevOps Engineer with Solution Architect expertise to manage end-to-end infrastructure projects across cloud, hybrid, and dedicated server environments. This role demands hands-on experience with WHM/cPanel, OpenPanel, and load balancers, and deep knowledge of modern DevOps practices. The ideal candidate will also lead a team of DevOps engineers, drive technical excellence, and serve as the go-to expert for scalable, secure, and high-availability infrastructure solutions.

Key Responsibilities:

DevOps & Infrastructure Management
- Architect, implement, and maintain scalable infrastructure solutions across cloud and dedicated server environments.
- Manage hosting infrastructure including WHM/cPanel, OpenPanel, Apache/Nginx, MySQL, DNS, mail servers, and firewalls.
- Design and configure load-balancing strategies using HAProxy, NGINX, or cloud-native load balancers.
- Automate provisioning, configuration, deployment, and monitoring using tools like Ansible, Terraform, and CI/CD (Jenkins, GitLab CI).
- Ensure infrastructure reliability, security, and disaster-recovery processes are in place.

Solution Architecture
- Translate business and application requirements into robust infrastructure blueprints.
- Lead design reviews and architectural discussions for client and internal projects.
- Create documentation and define architectural best practices for hosting and DevOps.

Team Management & Leadership
- Lead and mentor a team of DevOps engineers across multiple projects.
- Allocate resources, manage project timelines, and ensure successful delivery.
- Foster a culture of innovation, continuous improvement, and collaboration.
- Conduct performance reviews, provide training, and support career development of team members.

Monitoring, Security & Optimization
- Set up and maintain observability systems (e.g., Prometheus, Grafana, Zabbix).
- Conduct performance tuning, cost optimization, and environment hardening.
- Ensure compliance with internal policies and external standards (ISO, GDPR, SOC 2, etc.).

Required Skills & Experience:
- 8+ years of experience in DevOps, systems engineering, or cloud infrastructure management.
- 3+ years of experience in team leadership or technical management.
- Proven expertise in hosting infrastructure, including WHM/cPanel, OpenPanel, Plesk, DNS, and mail configurations.
- Strong experience with Linux servers, networking, security, and automation scripting (Bash, Python).
- Hands-on experience with cloud platforms (AWS, Azure, GCP) and hybrid environments.
- Deep understanding of CI/CD pipelines, Docker/Kubernetes, and version control (Git).
- Familiarity with load balancing, high availability, and failover strategies.

Preferred Qualifications:
- Certifications such as AWS Solutions Architect, RHCE, CKA, or Linux Foundation Certified Engineer.
- Experience in IT services or hosting/cloud consulting environments.
- Knowledge of compliance frameworks (e.g., ISO 27001, SOC 2, PCI-DSS).
- Familiarity with agile methodologies and DevOps lifecycle management tools.
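The load-balancing strategies named in the posting above (HAProxy, NGINX, cloud-native balancers) all start from the same idea: rotate incoming requests across a pool of backends. A minimal, dependency-free sketch of round-robin selection in plain Python; the backend addresses are invented for illustration and not tied to any real environment:

```python
from itertools import cycle

class RoundRobinBalancer:
    """Minimal round-robin backend selector, the default
    distribution strategy in HAProxy and NGINX upstream pools."""

    def __init__(self, backends):
        # cycle() yields backends in order, wrapping around forever.
        self._pool = cycle(backends)

    def next_backend(self):
        return next(self._pool)

# Hypothetical backend pool.
lb = RoundRobinBalancer(["10.0.0.1:80", "10.0.0.2:80", "10.0.0.3:80"])
order = [lb.next_backend() for _ in range(4)]
# The fourth request wraps back to the first backend.
```

Real balancers layer health checks, weights, and least-connections logic on top of this core rotation, but the request-distribution principle is the same.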
Posted 1 day ago
170.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Summary
Strong hands-on developer / DevOps engineer for the Credit Grading Hive in PE. This is a #1 priority for the FFG program, and strong engineering talent is required to drive the rebuild of the CreditMate legacy platform. The required skillset is to completely overhaul the platform and develop an in-house solution on the latest technology stack. The person will be part of the team developing the new CreditMate, aligned with the CC-wide unified UI/UX strategy.

Key Responsibilities

Strategy
- Advise on future technology capabilities and architecture design, considering business objectives, technology strategy, trends, and regulatory requirements.
- Awareness and understanding of the Group's business strategy and model appropriate to the role.

Business
- Awareness and understanding of the wider business, economic, and market environment in which the Group operates.
- Understand and recommend business flows and translate them into an API ecosystem.

Processes
- Responsible for executing and supervising microservices development to facilitate business capabilities and orchestrate them to achieve business outcomes.

People & Talent
- Lead through example and build the appropriate culture and values. Set appropriate tone and expectations for the team and work in collaboration with risk and control partners.
- Ensure the provision of ongoing training and development of people, and ensure that holders of all critical functions are suitably skilled and qualified for their roles, with effective supervision in place to mitigate any risks.

Risk Management
- Interpret the portfolio's key risks, identify key issues based on this information, and put appropriate controls and measures in place.

Governance
- Awareness and understanding of the regulatory framework in which the Group operates, and the regulatory requirements and expectations relevant to the role.

Regulatory & Business Conduct
- Display exemplary conduct and live by the Group's Values and Code of Conduct.
- Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines and the Group Code of Conduct.
- Effectively and collaboratively identify, escalate, mitigate and resolve risk, conduct and compliance matters.
- Lead to achieve the outcomes set out in the Bank's Conduct Principles: Fair Outcomes for Clients; Effective Financial Markets; Financial Crime Compliance; The Right Environment.
- Serve as a Director of the Board.
- Exercise authorities delegated by the Board of Directors and act in accordance with the Articles of Association (or equivalent).

Key Stakeholders
Product Owners, Hive Leads, Client Coverage Tech and Biz Stakeholders

Qualifications
- Education: B.Tech in Computer Science or IT
- Certifications: Java, Kubernetes
- Languages: Java, Quarkus, Spring, SQL, Python

Skills And Experience
- Participates in development of multiple or large software products; estimates and monitors development costs based on functional and technical requirements.
- Delivery experience as a tech project manager, plus analysis skills.
- Contrasts advantages and drawbacks of different development languages and tools.
- Expertise in RDBMS solutions (Oracle, PostgreSQL) and NoSQL offerings (Cassandra, MongoDB, etc.).
- Experience in distributed technologies, e.g. Kafka, Apache MQ, RabbitMQ, is an added advantage.
- Strong knowledge of application integration using web services (SOAP/REST/gRPC) or messaging using JMS.

About Standard Chartered
We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before.
If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents, and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion.

Together We:
- Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do
- Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well
- Are better together: we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term

What We Offer
In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing.
- Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations.
- Time-off including annual leave, parental/maternity (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holidays, combined to a minimum of 30 days.
- Flexible working options based around home and office locations, with flexible working patterns.
- Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders and a range of self-help toolkits.
- A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning.
- Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity across our teams, business functions and geographies, where everyone feels respected and can realise their full potential.
Posted 1 day ago
3.0 years
0 Lacs
Bhubaneswar, Odisha, India
On-site
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: Python (Programming Language), Apache Airflow
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve application performance and user experience.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Good To Have Skills: Experience with Apache Airflow, Python (Programming Language).
- Strong understanding of data integration and ETL processes.
- Experience with cloud-based data solutions and architectures.
- Familiarity with data governance and management best practices.

Additional Information:
- The candidate should have minimum 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Kolkata office.
- A 15 years full time education is required.
Posted 1 day ago
3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Apache Spark
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function seamlessly within the business environment. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking ways to improve processes and solutions.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute in providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Apache Spark.
- Strong understanding of distributed computing principles.
- Experience with data processing frameworks and tools.
- Familiarity with programming languages such as Java or Scala.
- Knowledge of cloud platforms and services for application deployment.

Additional Information:
- The candidate should have minimum 3 years of experience in Apache Spark.
- This position is based at our Noida office.
- A 15 years full time education is required.
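The Apache Spark proficiency the posting above calls for centers on its transformation pipeline of map/flatMap/reduceByKey steps over partitioned data. A dependency-free sketch of the classic word-count pattern, written in plain Python as a stand-in for a PySpark RDD chain, purely for illustration (the sample lines are invented):

```python
from collections import Counter
from functools import reduce

# Simulated partitions of a distributed text dataset.
partitions = [
    ["spark makes big data simple", "spark is fast"],
    ["data pipelines run on spark"],
]

# "flatMap" stage: split each line into words, counted per partition.
mapped = [Counter(word for line in part for word in line.split())
          for part in partitions]

# "reduceByKey" stage: merge per-partition counts, as Spark's
# shuffle-and-reduce would across executors.
word_counts = reduce(lambda a, b: a + b, mapped)
# word_counts["spark"] == 3
```

In real Spark the partitions live on different executors and the merge happens over the network; the per-partition-then-merge structure shown here is what makes the computation distributable.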
Posted 1 day ago
7.5 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Databricks Unified Data Analytics Platform, Informatica Intelligent Cloud Services
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team through the development process while maintaining a focus on quality and efficiency. You will also engage in strategic planning to align application development with organizational goals, ensuring that all stakeholders are informed and involved throughout the project lifecycle.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate training and development opportunities for team members to enhance their skills.
- Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform, Informatica Intelligent Cloud Services.
- Good To Have Skills: Experience with cloud-based data integration tools.
- Strong understanding of data engineering principles and practices.
- Experience with big data technologies such as Apache Spark and Hadoop.
- Familiarity with data governance and data quality frameworks.

Additional Information:
- The candidate should have minimum 7.5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based in Mumbai.
- A 15 years full time education is required.
Posted 1 day ago
0 years
0 Lacs
India
Remote
CSQ326R35

Mission
The AI Forward Deployed Engineering (AI FDE) team is a highly specialized customer-facing AI team at Databricks. We deliver professional services engagements to help our customers build and productionize first-of-its-kind AI applications. We work cross-functionally to shape long-term strategic priorities and initiatives alongside engineering, product, and developer relations, as well as support internal subject matter expert (SME) teams. We view our team as an ensemble: we look for individuals with strong, unique specializations to improve the overall strength of the team. This team is the right fit for you if you love working with customers and teammates, and fueling your curiosity for the latest trends in GenAI, LLMOps, and ML more broadly. This role can be remote.

The Impact You Will Have
- Develop cutting-edge GenAI solutions, incorporating the latest techniques from our Mosaic AI research to solve customer problems
- Own production rollouts of consumer and internally facing GenAI applications
- Serve as a trusted technical advisor to customers across a variety of domains
- Present at conferences such as Data + AI Summit, recognized as a thought leader internally and externally
- Collaborate cross-functionally with the product and engineering teams to influence priorities and shape the product roadmap

What We Look For
- Experience building GenAI applications, including RAG, multi-agent systems, Text2SQL, fine-tuning, etc., with tools such as HuggingFace, LangChain, and DSPy
- Expertise in deploying production-grade GenAI applications, including evaluation and optimization
- Extensive hands-on industry data science experience with common machine learning and data science tools, e.g. pandas, scikit-learn, PyTorch, etc.
- Experience building production-grade machine learning deployments on AWS, Azure, or GCP
- Graduate degree in a quantitative discipline (Computer Science, Engineering, Statistics, Operations Research, etc.) or equivalent practical experience
- Experience communicating and/or teaching technical concepts to non-technical and technical audiences alike
- Passion for collaboration, life-long learning, and driving business value through AI
- [Preferred] Experience using the Databricks Intelligence Platform and Apache Spark™ to process large-scale distributed datasets

About Databricks
Databricks is the data and AI company. More than 10,000 organizations worldwide — including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 — rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of Lakehouse, Apache Spark™, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook.

Benefits
At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please visit https://www.mybenefitsnow.com/databricks.

Our Commitment to Diversity and Inclusion
At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics.

Compliance
If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.
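The RAG applications this posting mentions hinge on one core step: embedding a query and retrieving the most similar documents by cosine similarity before the LLM ever sees them. A toy, dependency-free sketch of that retrieval step; the document names and 3-dimensional vectors are hypothetical stand-ins for real embedding-model output:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical 3-d "embeddings" standing in for a real model's output.
docs = {
    "pricing guide": [0.9, 0.1, 0.0],
    "api reference": [0.1, 0.9, 0.2],
    "onboarding faq": [0.2, 0.2, 0.9],
}
query = [0.85, 0.15, 0.05]  # embedding of the user's question

# Retrieve the most similar document to feed into the LLM prompt.
best = max(docs, key=lambda name: cosine(query, docs[name]))
# best == "pricing guide"
```

Production systems swap the dictionary for a vector database and the hand-written similarity for an approximate-nearest-neighbor index, but the retrieve-then-generate shape is the same.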
Posted 1 day ago
5.0 - 7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
FactSet creates flexible, open data and software solutions for over 200,000 investment professionals worldwide, providing instant access to financial data and analytics that investors use to make crucial decisions. At FactSet, our values are the foundation of everything we do. They express how we act and operate, serve as a compass in our decision-making, and play a big role in how we treat each other, our clients, and our communities. We believe that the best ideas can come from anyone, anywhere, at any time, and that curiosity is the key to anticipating our clients' needs and exceeding their expectations.

Senior Software Engineer

Group Description
Data Solutions - Platforms and Environments is the industry-leading content delivery platform. Clients seamlessly access organized and connected content that is easily discoverable, explorable, and procured via the FactSet Marketplace. Data is delivered via a variety of technologies and formats that meet the needs of our clients' workflows. By enabling our clients to utilize their preferred choice of industry-standard databases, programming languages, and data visualization tools, we empower them to focus on the core competencies needed to drive their business. The Data Solutions - Platforms and Environments solutions portfolio includes Standard DataFeed, Data Exploration, OnDemand (API), Views, Cornerstone, Exchange DataFeed, Benchmark Feeds, the Open:FactSet Marketplace, DataDictionary, Navigator and other non-workstation initiatives.

Job Description
The Data Solutions - Platforms and Environments team is looking for a talented, highly motivated Senior Software Engineer (Full Stack Engineer) to join our Navigator Application initiatives, an important part of one of FactSet's highest profile and most strategic areas of investment and development.
As the Full Stack Senior Software Engineer, you will design and develop applications, including UI, API, and database frameworks and data engineering pipelines; help implement improvements to existing pipelines and infrastructure; and provide production support. You will collaborate closely with a Product Developer/Business Analyst to capture technical requirements. FactSet is happy to set up an information session with an engineer working on this product to talk about the product, the team, and the interview process.

What You'll Do
- Implement new components and application features for a client-facing application as a Full Stack Developer.
- Maintain and resolve bugs in existing components.
- Contribute new features, fixes, and refactors to the existing code.
- Perform code reviews and coach engineers with respect to best practices.
- Work with other engineers in following the test-driven methodology in an agile environment.
- Collaborate with other engineers and Product Developers in a Scrum Agile environment using Jira and Confluence.
- Work as part of a geographically diverse team.
- Create and review documentation and test plans.
- Estimate task sizes and regularly communicate progress in daily standups and biweekly Scrum meetings.
- Coordinate with other teams across offices and departments.

What We're Looking For
- Bachelor's degree in Engineering or a relevant field required.
- 5 to 7 years of relevant experience.
- Expert-level proficiency writing and optimizing code in Python.
- Proficiency in frontend technologies such as Vue.js (preferred) or ReactJS, and experience with JavaScript, CSS, HTML.
- Good knowledge of REST API development, preferably Python Flask and OpenAPI.
- Good knowledge of relational databases, preferably MSSQL or Postgres.
- Good knowledge of GenAI and vector databases is a plus.
- Good understanding of general database design and architecture principles.
- A realistic, pragmatic approach; can deliver functional prototypes that can be enhanced and optimized in later phases.
- Strong written and verbal communication skills.
- Working experience with AWS services: Lambda, EC2, S3, AWS Glue, etc.
- Strong working experience with any container/PaaS technology (Docker or Heroku).
- ETL and data pipelines experience a plus.
- Working experience with Apache Spark, Apache Airflow, GraphQL is a plus.
- Experience developing event-driven distributed serverless infrastructure (AWS Lambda, SNS-SQS) is a plus.
- Must be a voracious learner.

What's In It For You
At FactSet, our people are our greatest asset, and our culture is our biggest competitive advantage. Being a FactSetter means:
- The opportunity to join an S&P 500 company with over 45 years of sustainable growth powered by the entrepreneurial spirit of a start-up.
- Support for your total well-being. This includes health, life, and disability insurance, as well as retirement savings plans and a discounted employee stock purchase program, plus paid time off for holidays, family leave, and company-wide wellness days.
- Flexible work accommodations. We value work/life harmony and offer our employees a range of accommodations to help them achieve success both at work and in their personal lives.
- A global community dedicated to volunteerism and sustainability, where collaboration is always encouraged, and individuality drives solutions.
- Career progression planning with dedicated time each month for learning and development.
- Business Resource Groups open to all employees that serve as a catalyst for connection, growth, and belonging.

Learn More About Our Benefits Here. Salary is just one component of our compensation package and is based on several factors including but not limited to education, work experience, and certifications.

Company Overview
FactSet (NYSE:FDS | NASDAQ:FDS) helps the financial community to see more, think bigger, and work better.
Our digital platform and enterprise solutions deliver financial data, analytics, and open technology to more than 8,200 global clients, including over 200,000 individual users. Clients across the buy-side and sell-side, as well as wealth managers, private equity firms, and corporations, achieve more every day with our comprehensive and connected content, flexible next-generation workflow solutions, and client-centric specialized support. As a member of the S&P 500, we are committed to sustainable growth and have been recognized among the Best Places to Work in 2023 by Glassdoor as a Glassdoor Employees’ Choice Award winner. Learn more at www.factset.com and follow us on X and LinkedIn. At FactSet, we celebrate difference of thought, experience, and perspective. Qualified applicants will be considered for employment without regard to characteristics protected by law.
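The REST API development this posting asks for (Python Flask, OpenAPI) boils down to mapping routes to handlers that return JSON. A framework-free sketch of that dispatch pattern; the route names and payloads are invented for illustration, and a real implementation would use Flask's decorators, but the idea is the same:

```python
import json

# Route table mapping (method, path) to a handler, as a web
# framework like Flask builds internally from its decorators.
ROUTES = {}

def route(method, path):
    """Decorator that registers a handler for a method/path pair."""
    def register(handler):
        ROUTES[(method, path)] = handler
        return handler
    return register

@route("GET", "/health")
def health():
    return {"status": "ok"}

@route("GET", "/items")
def list_items():
    return {"items": ["alpha", "beta"]}  # placeholder data

def dispatch(method, path):
    """Look up the handler and serialize its result, 404 if unknown."""
    handler = ROUTES.get((method, path))
    if handler is None:
        return 404, json.dumps({"error": "not found"})
    return 200, json.dumps(handler())

status, body = dispatch("GET", "/health")
```

An OpenAPI document then describes exactly this route table (paths, methods, and response schemas) in a machine-readable form.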
Posted 1 day ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
Key Responsibilities:
- Design and develop high-performance backend services using Java (18/21) and Spring Boot
- Build scalable and distributed data pipelines using Apache Spark
- Develop and maintain microservices-based architectures
- Work on cloud-native deployments, preferably on AWS (EC2, S3, EMR, Lambda, etc.)
- Optimize data processing systems for performance, scalability, and reliability
- Collaborate with data engineers, architects, and product managers to translate business requirements into technical solutions
- Ensure code quality through unit testing, integration testing, and code reviews
- Troubleshoot and resolve issues in production and non-production environments

Required Skills and Experience:
- 5+ years of professional experience in software engineering
- Strong programming expertise in Core Java (18/21)
- Hands-on experience with Apache Spark and distributed data processing
- Proven experience with Spring Boot and RESTful API development
- Solid understanding of microservices architecture and patterns
- Proficiency in cloud platforms, especially AWS (preferred)
- Experience with SQL/NoSQL databases and data lake/storage systems
- Familiarity with CI/CD tools and containerization (Docker/Kubernetes is a plus)

What We Offer:
- We offer a market-leading salary along with a comprehensive benefits package to support your well-being.
- Enjoy a hybrid or remote work setup that prioritizes work-life balance and personal well-being.
- We invest in your career through continuous learning and internal growth opportunities.
- Be part of a dynamic, inclusive, and vibrant workplace where your contributions are recognized and rewarded.
- We believe in straightforward policies, open communication, and a supportive work environment where everyone thrives.

About the Company:
https://predigle.com/
https://www.espergroup.com/
Predigle, an EsperGroup company, focuses on building disruptive technology platforms to transform daily business operations.
Predigle has expanded rapidly to offer various products and services. Predigle Intelligence (Pi) is a comprehensive portable AI platform that offers a low-code/no-code AI design solution for solving business problems.
Posted 1 day ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Gurgaon/Bangalore, India The Test Lead will be responsible for driving quality engineering strategies, processes, and standards for company’s diverse suite of business applications, ensuring they meet the high-quality standards. This role involves collaborating with cross-functional teams consisting of both AXA XL staff and vendors.. Additionally, they will promote the adoption of automation tools and best practices to optimize testing processes. This role requires excellent stakeholder management, technical expertise in test automation, and a keen focus on delivering high-quality, reliable software releases across complex insurance platforms The role holder will also demonstrate a best-in-class level of expertise in test management and testing best practices. What you’ll be DOING What will your essential responsibilities include? Develop and implement comprehensive testing strategies that align with organizational goals and industry best practices Take responsibility for all types of testing performed by the testing teams including functional and non-functional as well as user acceptance testing, when required. Collaborate with our testing vendor partners as well as our AXA XL delivery team members to make sure comprehensive testing coverage and the timely delivery of software changes/enhancements Act as the primary point of contact for all testing-related inquiries, escalations and coordination with AXA XL stakeholders. Coordinate with project managers, business analysts, and development teams to define UAT scope and objectives, particularly in relation to property and casualty insurance products. Facilitate UAT testing, including training end-users and gathering feedback on the product and coordinate the UAT Cycle ensuring its timely completion Work with the TCoE team to understand best practices and effectively implement them on the applications to achieve our expected quality results. 
Oversee the planning, execution, and reporting of end-to-end system integration testing, ensuring thorough validation of software applications.
Drive test automation initiatives, including in-sprint and intelligent testing and automation frameworks, to improve coverage and reduce test cycle time.
Review testing artifacts created by testing teams, such as the test strategy, test plan, and test summary report, to ensure they meet industry standards.
Identify opportunities to improve operational effectiveness and efficiency using automation, virtualization, and integration with the CI/CD pipeline.
Identify, communicate, and track testing risks and issues, then help develop mitigation plans to bring them to closure.
Help estimate new testing work requests and manage estimates against actuals to ensure change controls are appropriately managed.
Define, collect, and analyze key performance indicators (KPIs) and metrics to evaluate testing effectiveness and drive improvements.
In collaboration with Procurement, manage RFIs/RFPs, contract negotiations, and delivery of contract terms, and validate them against the established KPIs/SLAs.
Oversee the QA budget and ensure high-quality products are delivered within it.

What You Will BRING

We're looking for someone who has these abilities and skills:

Bachelor's degree in Computer Science, Information Technology, or a related field.
Robust understanding of software development methodologies, including Agile.
Proven track record managing complex QE engagements with multiple teams, vendors, and releases.
Effective knowledge of the various types of software testing: static, smoke, system, system integration, regression, UAT, performance, compatibility, etc.
Effective hands-on technical background in test automation across UI, API, and performance testing.
Extensive experience in designing and developing test automation frameworks.
Understanding of cloud platforms (e.g., AWS, Azure, Google Cloud) and microservices architecture.
Experience with testing tools such as Selenium, Apache JMeter, Gatling, UFT, and Performance Center.
Well versed in the latest test automation tools and technologies, with an eye for continuous improvement.
Proven leadership and team management skills, with the ability to motivate and guide diverse teams.
Excellent interpersonal and communication skills to effectively collaborate with both technical and non-technical stakeholders.
Experience with property & casualty insurance lines of business and products is preferred.
Experience in implementing GenAI-based solutions to optimize testing processes (good to have).

You will report to the TCoE Lead.

Who WE Are

AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals and even some inspirational individuals we don't just provide re/insurance, we reinvent it. How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business − property, casualty, professional, financial lines and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com

What We OFFER

Inclusion

AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture and a diverse workforce enable business growth and are critical to our success. That's why we have made a strategic commitment to attract, develop, advance and retain the most diverse workforce possible, and create an inclusive culture where everyone can bring their full selves to work and can reach their highest potential.
It’s about helping one another — and our business — to move forward and succeed. Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability and inclusion with 20 Chapters around the globe Robust support for Flexible Working Arrangements Enhanced family friendly leave benefits Named to the Diversity Best Practices Index Signatory to the UK Women in Finance Charter Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer. Total Rewards AXA XL’s Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle and financial security. It provides dynamic compensation and personalized, inclusive benefits that evolve as you do. We’re committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence. Sustainability At AXA XL, Sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 Sustainability strategy, called “Roots of resilience”, focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations. Our Pillars Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems - the foundation of a sustainable planet and society - are essential to our future. We’re committed to protecting and restoring nature - from mangrove forests to the bees in our backyard - by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans. Addressing climate change: The effects of a changing climate are far reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption. 
We're building a net zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions. Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We’re training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting. AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL’s “Hearts in Action” programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day - the Global Day of Giving. For more information, please see axaxl.com/sustainability
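As a rough illustration of the smoke-versus-regression distinction this posting's skills list draws, here is a minimal Python `unittest` sketch. The `premium` function and its rounding rules are invented for illustration and are not an AXA XL API; the point is only the shape of the two test layers.

```python
import unittest

def premium(base_rate: float, risk_factor: float) -> float:
    """Hypothetical premium calculation for a P&C insurance product (invented)."""
    if base_rate <= 0 or risk_factor <= 0:
        raise ValueError("inputs must be positive")
    return round(base_rate * risk_factor, 2)

class PremiumSmokeTests(unittest.TestCase):
    """Smoke test: does the happy path work at all?"""
    def test_basic_calculation(self):
        self.assertEqual(premium(100.0, 1.5), 150.0)

class PremiumRegressionTests(unittest.TestCase):
    """Regression tests: pin down edge cases so later changes cannot silently break them."""
    def test_rejects_non_positive_inputs(self):
        with self.assertRaises(ValueError):
            premium(0.0, 1.5)

    def test_rounds_to_cents(self):
        self.assertEqual(premium(99.99, 1.333), 133.29)
```

Run with `python -m unittest`; in a CI/CD pipeline the smoke suite would gate every commit while the full regression suite runs per release candidate.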
Posted 1 day ago
0 years
0 Lacs
Pune/Pimpri-Chinchwad Area
On-site
Job Description

TRAINING DEVELOPER – NIQ, CHENNAI/Pune, India

About This Job

As Training Developer, you will be responsible for designing, developing, managing, organizing, and conducting technical workshops, and for evaluating workshop effectiveness.

Responsibilities

Plan, design, and create (or curate) training materials on company products, processes, and technologies.
Orient trainers on the content and scope of the training.
Deliver trainings when required.
Update the content on an ongoing basis.
Analyze course evaluations to judge the effectiveness of training sessions and implement suggestions for improvement.
Interpret data to judge progress and cost-effectiveness.
Help the Tech Competency Leader evaluate external learning vendors.

Qualifications

A LITTLE BIT ABOUT YOU

Proven experience as a tech expert (2 years), willing to learn training skills.
Proven ability to complete the full training cycle (assess needs, plan, develop, coordinate, monitor, and evaluate), willing to learn tech skills.
Proven track record as a trainer/developer in diverse technical competencies: Java full stack, Data Engineering with Python/PySpark/Databricks, Data Science and AI/ML, or the Data and Business Intelligence stack.
Working knowledge of Apache Airflow is an added advantage.
Proficiency with the M365 platform and familiarity with the various LMS platforms available in the market.
Excellent verbal and written communication skills.
Facilitation and organization skills.

Qualifications

Degree at university level, preferably in a technical/educational area.
Train the Trainer certification is a plus.

Additional Information

Enjoy a flexible and rewarding work environment with peer-to-peer recognition platforms. Recharge and revitalize with the help of wellness plans made for you and your family. Plan your future with financial wellness tools.
Stay relevant and upskill yourself with career development opportunities Our Benefits Flexible working environment Volunteer time off LinkedIn Learning Employee-Assistance-Program (EAP) About NIQ NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights—delivered with advanced analytics through state-of-the-art platforms—NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population. For more information, visit NIQ.com Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook Our commitment to Diversity, Equity, and Inclusion NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action-Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion
Posted 1 day ago
3.0 years
0 Lacs
India
On-site
Lucidworks is leading digital transformation for some of the world's biggest retailers, financial services firms, manufacturers, and B2B commerce organizations. We believe that the core to a great digital experience starts with search and browse. Our Deep Learning technology captures user behavior and utilizes machine learning to connect people with the products, content, and information they need. Brands including American Airlines, Lenovo, Red Hat, and Cisco Systems rely on Lucidworks' suite of products to power commerce, customer service, and workplace applications that delight customers and empower employees. Lucidworks believes in the power of diversity and inclusion to help us do our best work. We are an Equal Opportunity employer and welcome talent across a full range of backgrounds, orientation, origin, and identity in an inclusive and non-discriminatory way. About the Team The technical support team leverages their extensive experience supporting large-scale Solr clusters and the Lucene/Solr ecosystem. Their day might include troubleshooting errors and attempting to fix or develop workarounds, diagnosing network and environmental issues, learning your customer's infrastructure and technologies, as well as reproducing bugs and opening Jira tickets for the engineering team. Their primary tasks are break/fix scenarios where the diagnostics quickly bring network assets back online and prevent future problems--which has a huge impact on our customers’ business. About the Role As a Search Engineer in Technical Support, you will play a critical role in helping our clients achieve success with our products. You will be responsible for assisting clients directly in resolving any technical issues they encounter, as well as answering questions about the product and feature functionality. 
You will work closely with internal teams such as Engineering and Customer Success to resolve a variety of issues, including product defects, performance issues, and feature requests. This role requires excellent problem-solving skills and attention to detail, strong communication abilities, and a deep understanding of search technology. It also requires the ability to work independently and as part of a team, and comfort working with both technical and non-technical stakeholders. The successful candidate will demonstrate a passion for delivering an outstanding customer experience, balancing technical expertise with empathy for the customer's needs. This role is open to candidates in India and is expected to participate in weekend on-call rotations.

Responsibilities

Field incoming questions, help users configure Lucidworks Fusion and its components, and help them understand how to use the features of the product.
Troubleshoot complex search issues in and around Lucene/Solr.
Document solutions as knowledge base articles for use by our customer base in our knowledge center.
Identify opportunities to provide customers with additional value through follow-on products and/or services.
Communicate high-value use cases and customer feedback to our Product Development and Engineering teams.
Collaborate across teams internally to diagnose and resolve critical issues.
Participate in a 24/7/365 on-call rotation, which includes weekend and holiday shifts.

Skills & Qualifications

3+ years of hands-on experience with Lucene/Solr or other search technologies is required.
BS or higher in Engineering or Computer Science is preferred.
3+ years of professional experience in a customer-facing level 2-3 tech support role.
Experience with technical support CRM systems (Salesforce, Zendesk, etc.).
Ability to clearly communicate with customers by email and phone.
Proficiency with Java and one or more common scripting languages (Python, Perl, Ruby, etc.).
Proficiency with Unix/Linux systems (command-line navigation, file system permissions, system logs and administration, scripting, networking, etc.).
Exposure to other related open source projects (Mahout, Hadoop, Tika, etc.) and commercial search technologies.
Enterprise Search, eCommerce, and/or Business Intelligence experience.
Knowledge of data science and machine learning concepts.
Experience with cloud computing platforms (GCP, Azure, AWS, etc.) and Kubernetes.
Startup experience is preferred.

Our Stack

Apache Lucene/Solr, ZooKeeper, Spark, Pulsar, Kafka, Grafana
Java, Python, Linux, Kubernetes
Zendesk, Jira
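As a small illustration of the Solr troubleshooting work described above, the sketch below builds a request URL for Solr's standard `/select` query endpoint using only the Python standard library. The host, collection name, and field names are hypothetical; `q`, `fq`, `rows`, and `wt` are standard Solr query parameters.

```python
from urllib.parse import urlencode

def solr_select_url(base, collection, query, rows=10, fq=None):
    """Build a URL for Solr's /select handler (JSON response format)."""
    params = {"q": query, "rows": rows, "wt": "json"}
    if fq:
        # Filter queries restrict the result set and are cached
        # separately from the main query's relevance scoring.
        params["fq"] = fq
    return f"{base}/solr/{collection}/select?{urlencode(params)}"

# Hypothetical collection and fields, e.g. when reproducing a customer issue locally.
url = solr_select_url("http://localhost:8983", "products",
                      "title:laptop", rows=5, fq="in_stock:true")
```

A support engineer might paste such a URL into `curl` to isolate whether a reported relevance problem comes from the query itself or from a filter.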
Posted 1 day ago
0 years
0 Lacs
India
On-site
The ideal candidate will be responsible for developing high-quality applications and for designing and implementing testable and scalable code.

Responsibilities:

Lead backend Python development for innovative healthcare technology solutions.
Oversee a backend team to achieve product and platform goals in the B2B HealthTech domain.
Design and implement scalable backend infrastructures with seamless API integration.
Ensure availability on immediate or short notice for efficient onboarding and project ramp-up.
Optimize existing backend systems based on real-time healthcare data requirements.
Collaborate with cross-functional teams to ensure alignment between tech and business goals.
Review and refine code for quality, scalability, and performance improvements.

Ideal Candidate:

Experienced in building B2B software products using agile methodologies.
Strong proficiency in Python, with a deep understanding of backend system architecture.
Comfortable with fast-paced environments and quick onboarding cycles.
Strong communicator who fosters a culture of innovation, ownership, and collaboration.
Passionate about driving real-world healthcare impact through technology.

Skills Required:

Primary: TypeScript, AWS, Python, RESTful APIs, Backend Architecture
Additional: SQL/NoSQL databases, Docker/Kubernetes (preferred)

Good to have:

Prior experience in data engineering, especially in healthcare or real-time analytics.
Familiarity with ETL pipelines, data lake/warehouse solutions, and stream processing frameworks (e.g., Apache Kafka, Spark, Airflow).
Understanding of data privacy, compliance (e.g., HIPAA), and secure data handling practices.

Hiring Process

Profile Shortlisting
Tech Interview
Tech Interview
Culture Fit
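The secure data handling point above can be sketched in miniature: replacing direct identifiers with stable one-way hashes so records remain joinable without exposing the raw values. The field names are invented, and this is far from a full HIPAA de-identification scheme (which would involve salting or keyed hashing and the complete Safe Harbor identifier list); it only illustrates the idea.

```python
import hashlib

def pseudonymize(record, direct_identifiers=("name", "phone", "email")):
    """Return a copy of a record with direct identifiers replaced by
    truncated SHA-256 hashes: stable join keys, not the raw values."""
    masked = dict(record)
    for field in direct_identifiers:
        if masked.get(field) is not None:
            digest = hashlib.sha256(str(masked[field]).encode("utf-8")).hexdigest()
            masked[field] = digest[:12]  # same input always yields the same token
    return masked

# Hypothetical patient row: clinical fields pass through, identifiers are masked.
row = {"name": "A. Patel", "email": "a@example.com", "bp_systolic": 128}
safe = pseudonymize(row)
```

Because the mapping is deterministic, two masked records from the same patient can still be joined in analytics pipelines downstream.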
Posted 1 day ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
At Seismic, we're proud of our engineering culture where technical excellence and innovation drive everything we do. We're a remote-first data engineering team responsible for the critical data pipeline that powers insights for over 2,300 customers worldwide. Our team manages all data ingestion processes, leveraging technologies like Apache Kafka, Spark, various C# microservices, and a shift-left data mesh architecture to transform diverse data streams into the valuable reporting models that our customers rely on daily to make data-driven decisions. Additionally, we're evolving our analytics platform to include AI-powered agentic workflows.

Who You Are

Working knowledge of one OO language, preferably C#, but we won't hold your Java expertise against you (you're the type of person who's interested in learning and becoming an expert at new things). Additionally, we've been using Python more and more, and bonus points if you're familiar with Scala.
Experience with architecturally complex distributed systems.
Highly focused on operational excellence and quality: a passion for writing clean, well-tested code and a belief in the testing pyramid.
Outstanding verbal and written communication skills, with the ability to work with others at all levels and to work effectively with geographically remote and culturally diverse teams.
You enjoy solving challenging problems, all while having a blast with equally passionate team members.
Conversant in AI engineering. You've been experimenting with building AI solutions/integrations using LLMs, prompts, Copilots, Agentic ReAct workflows, etc.

At Seismic, we're committed to providing benefits and perks for the whole self. To explore our benefits available in each country, please visit the Global Benefits page. Please be aware we have noticed an increase in hiring scams potentially targeting Seismic candidates. Read our full statement on our Careers page.
Seismic is the global leader in AI-powered enablement, empowering go-to-market leaders to drive strategic growth and deliver exceptional customer experiences at scale. The Seismic Enablement Cloud™ is the only unified AI-powered platform that prepares customer-facing teams with the skills, content, tools, and insights needed to maximize every buyer interaction and strengthen client relationships. Trusted by more than 2,000 organizations worldwide, Seismic helps businesses achieve measurable outcomes and accelerate revenue growth. Seismic is headquartered in San Diego with offices across North America, Europe, Asia and Australia. Learn more at seismic.com.

Seismic is committed to building an inclusive workplace that ignites growth for our employees and creates a culture of belonging that allows all employees to be seen and valued for who they are. Learn more about DEI at Seismic here.

Collaborating with experienced software engineers, data scientists, and product managers to rapidly build, test, and deploy code to create innovative solutions and add value to our customers' experience.
Building large-scale platform infrastructure and REST APIs serving machine-learning-driven content recommendations to Seismic products.
Leveraging the power of context in third-party applications such as CRMs to drive machine learning algorithms and models.
Helping build next-gen agentic tooling for reporting and insights.
Processing large amounts of internal and external system data for analytics, caching, modeling, and more.
Identifying performance bottlenecks and implementing solutions for them.
Participating in code reviews, system design reviews, agile ceremonies, bug triage, and on-call rotations.

BS or MS in Computer Science, a similar technical field of study, or equivalent practical experience.
3+ years of software development experience within a SaaS business.
Familiarity with .NET Core and C# frameworks is a must.
Experience in data engineering: building and managing data pipelines and ETL processes, and familiarity with the technologies that drive them (Kafka, FiveTran (optional), Spark/Scala (optional), etc.).
Data warehouse experience with Snowflake or similar (AWS Redshift, Apache Iceberg, ClickHouse, etc.).
Familiarity with RESTful microservice-based APIs.
Experience with modern CI/CD pipelines and infrastructure (Jenkins, GitHub Actions, Terraform, Kubernetes, or equivalent) is a big plus.
Experience with the Scrum and Agile development process.
Familiarity developing in cloud-based environments.
Optional: experience with 3rd party integrations.
Optional: familiarity with meeting systems like Zoom, WebEx, MS Teams.
Optional: familiarity with CRM systems like Salesforce, Microsoft Dynamics 365, HubSpot.

If you are an individual with a disability and would like to request a reasonable accommodation as part of the application or recruiting process, please click here.

Headquartered in San Diego and with employees across the globe, Seismic is the global leader in sales enablement, backed by firms such as Permira, Ameriprise Financial, EDBI, Lightspeed Venture Partners, and T. Rowe Price. Seismic also expanded its team and product portfolio with the strategic acquisitions of SAVO, Percolate, Grapevine6, and Lessonly. Our board of directors is composed of several industry luminaries including John Thompson, former Chairman of the Board for Microsoft.

Seismic is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to gender, age, race, religion, or any other classification which is protected by applicable law.

Please note this job description is not designed to cover or contain a comprehensive listing of activities, duties or responsibilities that are required of the employee for this job. Duties, responsibilities and activities may change at any time with or without notice.
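The ingestion-pipeline work this team describes can be sketched in miniature with Python generators: each stage consumes and yields records, and malformed input is dropped rather than crashing the stream, a stand-in for routing to a dead-letter topic in a Kafka-based pipeline. The event shapes and topic semantics here are invented for illustration.

```python
import json

def parse_events(lines):
    """Decode raw JSON strings; skip malformed records instead of
    failing the whole stream."""
    for line in lines:
        try:
            yield json.loads(line)
        except json.JSONDecodeError:
            # In a production pipeline this record would be sent to a
            # dead-letter topic for later inspection.
            continue

def enrich(events, source="crm"):
    """Tag each event with its originating system before it reaches
    downstream reporting models."""
    for event in events:
        event["source"] = source
        yield event

# Hypothetical raw stream, including one bad record.
raw = ['{"id": 1, "action": "view"}', 'not json', '{"id": 2, "action": "buy"}']
processed = list(enrich(parse_events(raw)))
```

Because each stage is lazy, the same composition pattern scales from a unit test to a consumer loop reading from a real broker.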
Posted 1 day ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose

Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title And Summary

Senior Software Engineer Java Full Stack

Overview

The Franchise and Legal Solutions (FLS) Program contributes to the future of Mastercard by enabling growth in products, markets, services and innovations in an increasingly complex and competitive environment, while protecting our brand and business. We support, enable, and protect the franchise agreement by implementing the rules of the road for anyone interacting with our business. Our teams design, develop, and maintain comprehensive business solutions that connect the dots across different regions and functions. Our deep technology portfolio offers ample room for engineers to use innovation as a driver of strategy. Technologists working with us will grow in Agile, architecture, design, engineering, and strategy.

The Franchise and Legal Solutions team is looking for a Senior Software Development Engineer who can develop applications using Angular, React, and JavaScript, or the Java/J2EE stack. The ideal candidate is passionate about designing and developing high-quality code that is highly scalable, operable, and highly available, enjoys solving problems in a challenging environment, and has the desire to take their career to the next level. If any of these opportunities excite you, we would love to talk.

Role

Develop application logic for a multi-component system of applications.
Significant code development, code review, and day-to-day support duties.
Ensure the product deliverable is highly performant, responsive, and of high quality.
Deliver completed code on time and with minimal to no defects or failures.
Ensure all new logic maintains current unit test coverage standards or higher.
Help maintain code quality and enable automation.
Support testing resources as needed to remediate defects, answer questions, and assist with automation tasks.
Contribute to the development of applications, components, system-to-system interfaces, and complete software solutions.
Actively participate in agile ceremonies including Daily Scrum, Story Grooming, Sprint Planning, and Retrospectives.
Participate in team prioritization discussions with Product/Business stakeholders.
Estimate and own delivery tasks (design, dev, test, deployment, configuration, documentation) to meet the business requirements.
Automate build, operate, and run aspects of software.
Drive integration of services focusing on customer journey and experience.
Perform demos/acceptance discussions when interacting with Product Owners.
Drive adoption of technology standards and opinionated frameworks, and review coding, test, and automation work of team members.
Comply with the organization's processes and policies, and protect the organization's intellectual property. Also participate in organization-level process improvement and knowledge sharing.
Communicate, collaborate, and work effectively in a global environment, drawing on soft skills as well as technical skills; act as a public speaker and technology evangelist for Mastercard.
Confidently and quickly make decisions in a fast-paced environment in order to maintain rapid development and deployment of new code changes.

All About You

Bachelor's degree in Information Systems, Information Technology, Computer Science, Engineering, or equivalent work experience.
Tech leadership experience with development teams.
Expert experience performing code reviews, creating system designs, and mentoring junior-level developers.
As a recognized subject matter expert, lead planning, design, and implementation of technical solutions.
Knowledge of design patterns.
Strong communication skills working with business partners and other key project contacts.
Knowledge of cloud-native development, e.g., Pivotal Cloud Foundry.
Extensive experience in designing and developing consumer-facing products is a must.
Experience with software development methodologies, particularly Agile/Scrum.
Delivered scalable products through a CI/CD pipeline deploying on-prem or in a public cloud infrastructure.
Ensure quality across the full stack via rigorous CI/CD practices in all aspects of the SDLC (build, test, and deploy).

Preferred Skills

Extensive knowledge and experience with either Java/J2EE, Spring, Spring Boot, Hibernate, Web Services, and Oracle SQL development, or full-stack development with React, Angular, and UI technologies.
Experience with event-driven systems (Apache Kafka, NATS, etc.).
Hands-on experience with Angular/React is an added advantage.
Experience with testing frameworks and methodologies (GTest, JUnit, mocking, etc.).
Experience building platforms with microservice architecture and RESTful APIs.
Experience using cloud-native approaches running on Linux, leveraging Spring Boot.
Experience with virtualization platforms like Cloud Foundry (PCF), Kubernetes (PKS), Docker, etc.
Experience in CI/CD pipeline creation via Jenkins.

Corporate Security Responsibility

All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:

Abide by Mastercard's security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.
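The event-driven systems named in the preferred skills above can be illustrated with an in-process publish/subscribe sketch. Real brokers such as Apache Kafka or NATS add persistence, partitioning, and delivery guarantees that this toy omits; the topic and payload names are invented.

```python
from collections import defaultdict

class EventBus:
    """In-process stand-in for a broker like Kafka or NATS: publishers and
    subscribers are decoupled through named topics."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, payload):
        # Every handler registered on the topic receives the event;
        # the publisher knows nothing about who is listening.
        for handler in self._subscribers[topic]:
            handler(payload)

bus = EventBus()
received = []
bus.subscribe("franchise.rule.updated", received.append)  # hypothetical topic
bus.publish("franchise.rule.updated", {"rule_id": 42})
```

The decoupling is the point: new consumers of an event can be added without touching the producing service, which is what makes event-driven microservice architectures evolvable.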
Posted 1 day ago
5.0 - 12.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Data Software Engineer

Location: Chennai and Coimbatore
Mode: Hybrid
Interview: Walk-in

5-12 years of experience in Big Data and related technologies.
Expert-level understanding of distributed computing principles.
Expert-level knowledge of and experience with Apache Spark.
Hands-on programming with Python.
Proficiency with Hadoop v2, MapReduce, HDFS, and Sqoop.
Experience building stream-processing systems using technologies such as Apache Storm or Spark Streaming.
Good understanding of Big Data querying tools such as Hive and Impala.
Experience with integration of data from multiple data sources such as RDBMS (SQL Server, Oracle), ERP, and files.
Good understanding of SQL queries, joins, stored procedures, and relational schemas.
Experience with NoSQL databases such as HBase, Cassandra, and MongoDB.
Knowledge of ETL techniques and frameworks.
Performance tuning of Spark jobs.
Experience with Azure Databricks.
Ability to lead a team efficiently.
Experience with designing and implementing Big Data solutions.
Practitioner of the Agile methodology.
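The distributed computing principles this posting asks for can be illustrated without a cluster: the sketch below mimics Spark's per-partition map phase followed by a reduce-by-key merge (the shape of `flatMap`/`reduceByKey` in a word count), using only the Python standard library. The partitioned input is invented.

```python
from collections import Counter

def word_count(partitions):
    """MapReduce-style word count: each partition is counted independently
    (as a Spark executor would), then partial counts are merged by key."""
    # Map phase: runs in parallel per partition on a real cluster.
    mapped = [Counter(word.lower() for line in part for word in line.split())
              for part in partitions]
    # Reduce phase: merge partial counts; only (word, count) pairs cross
    # partition boundaries, which is what keeps the shuffle small.
    total = Counter()
    for partial in mapped:
        total.update(partial)
    return dict(total)

parts = [["the quick brown fox", "the lazy dog"], ["the fox jumps"]]
counts = word_count(parts)
```

Performance tuning a real Spark job largely means controlling how much data crosses that reduce boundary, e.g. by pre-aggregating within partitions exactly as the map phase does here.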
Posted 1 day ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Role:

We are seeking a highly skilled and experienced Data Architect with expertise in designing and building data platforms in cloud environments. The ideal candidate will have a strong background in either AWS Data Engineering or Azure Data Engineering, along with proficiency in distributed data processing systems like Spark. Additionally, proficiency in SQL, data modeling, building data warehouses, and knowledge of ingestion tools and data governance are essential for this role. The Data Architect will also need experience with orchestration tools such as Airflow or Dagster and proficiency in Python, with knowledge of Pandas being beneficial.

Why Choose Ideas2IT

Ideas2IT has all the good attributes of a product startup and a services company. Since we launch our own products, you will have ample opportunities to learn and contribute. Unlike single-product companies, which stagnate in the technologies they use, our multiple product initiatives and customer-facing projects give you the opportunity to work on various technologies. AGI is going to change the world. Big companies like Microsoft are betting heavily on this (see here and here). We are following suit.

What's in it for you?

You will get to work on impactful products instead of back-office applications, for customers like Facebook, Siemens, Roche, and more.
You will get to work on interesting projects like the Cloud AI platform for personalized cancer treatment.
Opportunity to continuously learn newer technologies.
Freedom to bring your ideas to the table and make a difference, instead of being a small cog in a big wheel.
Showcase your talent in Shark Tanks and Hackathons conducted in the company.

Here's what you'll bring

Experience in designing and building data platforms in any cloud.
Strong expertise in either AWS Data Engineering or Azure Data Engineering.
Develop and optimize data processing pipelines using distributed systems like Spark.
• Create and maintain data models to support efficient storage and retrieval
• Build and optimize data warehouses for analytical and reporting purposes, using technologies such as Postgres, Redshift, or Snowflake
• Knowledge of ingestion tools such as Apache Kafka, Apache NiFi, AWS Glue, or Azure Data Factory
• Establish and enforce data governance policies and procedures to ensure data quality and security
• Use orchestration tools like Airflow or Dagster to schedule and manage data workflows
• Develop scripts and applications in Python to automate tasks and processes
• Collaborate with stakeholders to gather requirements and translate them into technical specifications
• Communicate technical solutions effectively to clients and stakeholders
• Familiarity with multiple cloud ecosystems such as AWS, Azure, and Google Cloud Platform (GCP)
• Experience with containerization and orchestration technologies like Docker and Kubernetes
• Knowledge of machine learning and data science concepts
• Experience with data visualization tools such as Tableau or Power BI
• Understanding of DevOps principles and practices

About Us: Ideas2IT stands at the intersection of Technology, Business, and Product Engineering, offering high-caliber Product Development services. Initially conceived as a CTO consulting firm, we have evolved into thought leaders in cutting-edge technologies such as Generative AI, assisting our clients in embracing innovation. Our forte lies in applying technology to address business needs, demonstrated by our track record of developing AI-driven solutions for industry giants like Facebook, Bloomberg, Siemens, and Roche. Harnessing our product-centric approach, we have incubated several AI-based startups (including Pipecandy, Element5, IdeaRx, and Carefi.in) that have flourished into successful ventures backed by venture capital. With fourteen years of remarkable growth behind us, we're steadfast in pursuing ambitious objectives.
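The pipeline-building responsibilities above follow a classic extract-transform-load shape. As a hedged illustration, here is a pure-Python miniature of the flow a Spark or AWS Glue job would run at scale; all names and data are invented for the example:

```python
# Minimal extract-transform-load sketch: the same three stages a Spark or
# AWS Glue job would perform, shown with plain Python for illustration.
from collections import defaultdict

def extract():
    # In production this would read from Kafka, S3, or a source database.
    return [
        {"region": "south", "amount": 120.0},
        {"region": "north", "amount": 80.0},
        {"region": "south", "amount": 50.0},
    ]

def transform(rows):
    # Aggregate revenue per region: a typical warehouse-bound rollup.
    totals = defaultdict(float)
    for row in rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

def load(totals):
    # In production this would write to Redshift, Snowflake, or Postgres.
    return sorted(totals.items())

result = load(transform(extract()))
print(result)  # [('north', 80.0), ('south', 170.0)]
```

An orchestrator such as Airflow or Dagster would then schedule each stage as a task and manage retries and dependencies between them.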
Posted 1 day ago
15.0 years
0 Lacs
India
On-site
Hiring for a US-based Product Company
Position: Senior Cloud Infrastructure Engineer
Experience: 15+ Years

Roles & Responsibilities:
• Cloud experience: AWS, Azure
• Kubernetes expert: container orchestration and management; Harness (Kubernetes cost control with built-in intelligence); Densify (Kubernetes resource optimization)
• Strong experience in CI/CD processes and tools
• Strong knowledge and experience in Disaster Recovery / Business Continuity Plans
• Tools: Terraform and Ansible (cloud infrastructure automation), Pulumi (IaC platform for any programming language), Jenkins (CI/CD for complex workflows), GitHub Actions (native CI/CD built into GitHub), etc.
• Cloud testing tools: Apache JMeter (open-source load testing for web and APIs), BlazeMeter (scalable cloud-based load testing platform), LoadRunner (performance testing software), etc.
• Cloud provisioning tools: AWS CloudFormation, Azure Resource Manager (native provisioning for Azure), Google Cloud Infrastructure Manager (native Terraform provisioning for GCP), Cloudsfer (cloud migration tool), etc.
• Multi-cloud management solutions: Lacework FortiCNAPP (cloud security tool), Cloudify (open-source multi-cloud orchestration platform), CoreStack (next-gen cloud business accelerator), etc.
• Data integration and management platforms: Informatica (cloud data integration platform)
• AWS hosting, management, and deployment would be a big plus, including archiving, backup and restore, cloud migration, and DevOps

Interested candidates can share resumes at chandni@thepremierconsultants.com
#aws #azure #cloudinfrastructure #devops #kubernetes #terraform #cloudsecurity #docker #ansible
Posted 1 day ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Company Description
Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, and create marketing solutions, all using our unique combination of data, analytics and software. We also assist millions of people to realise their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com

Job Description
We are looking for a Senior Software Engineer to join our Ascend Cloud Foundation Platform team.

Background
We unlock the power of data to create opportunities for consumers, businesses and society. At life's big moments – from buying a home or car, to sending a child to university, to growing a business exponentially by connecting it with new customers – we empower consumers and our clients to manage their data with confidence so they can maximize every opportunity.

We require a senior software engineer in Hyderabad, India to work alongside our UK colleagues to deliver business outcomes for the UK&I region. You will join an established agile technical team, where you will work with the Lead Engineer and Product Owner to help develop the consumer data attributes and work with data analytics to validate the accuracy of the calculations, all while working to the highest technical standards.

Key Responsibilities
• Design, develop, and maintain scalable and efficient data pipelines and ETL processes to extract, transform, and load data from various sources into our data lake or warehouse.
• Collaborate with cross-functional teams including data scientists, analysts, and software engineers to understand data requirements, define data models, and implement solutions that meet business needs.
• Ensure the security, integrity, and quality of data throughout the data lifecycle, implementing best practices for data governance, encryption, and access control.
• Develop and maintain data infrastructure components such as data warehouses, data lakes, and data processing frameworks, leveraging cloud services (e.g., AWS, Azure, GCP) and containerization technologies (e.g., Docker, Kubernetes).
• Implement monitoring, logging, and alerting mechanisms to ensure the reliability and availability of data pipelines and systems, and to proactively identify and address issues.
• Work closely with stakeholders to understand business requirements, prioritize tasks, and deliver solutions in a timely manner within an Agile working environment.
• Collaborate with the risk, security and compliance teams to ensure adherence to regulatory requirements (e.g., GDPR, PCI DSS) and industry standards related to data privacy and security.
• Stay updated on emerging technologies, tools, and best practices in the field of data engineering, and propose innovative solutions to improve efficiency, performance, and scalability.
• Mentor and coach junior engineers, fostering a culture of continuous learning and professional development within the team.
• Participate in code reviews, design discussions, and other Agile ceremonies to promote collaboration, transparency, and continuous improvement.
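The data-quality responsibility described above is commonly implemented as a validation gate at the pipeline boundary. A minimal, hypothetical sketch in plain Python; the field names (`account_id`, `credit_score`) and thresholds are invented for illustration, not taken from the posting:

```python
# Illustrative data-quality gate: reject records that fail simple
# integrity checks before they enter the downstream pipeline.
def validate(record):
    """Return a list of integrity violations; an empty list means 'clean'."""
    errors = []
    if not record.get("account_id"):
        errors.append("missing account_id")
    score = record.get("credit_score")
    if score is None or not (300 <= score <= 850):
        errors.append("credit_score out of range")
    return errors

records = [
    {"account_id": "A1", "credit_score": 720},
    {"account_id": "", "credit_score": 9999},
]
clean = [r for r in records if not validate(r)]
rejected = [r for r in records if validate(r)]
print(len(clean), len(rejected))  # 1 1
```

In a production pipeline the rejected records would typically be routed to a quarantine table with their violation reasons, so data stewards can investigate rather than silently losing data.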
Qualifications
Qualified to degree, HND or HNC standard in a software engineering and/or data engineering discipline, or able to demonstrate equivalent commercial experience.

Required Skills / Experience
• Experience of the full development lifecycle
• Strong communication skills, with the ability to explain solutions to technical and non-technical audiences
• Write clean, scalable and re-usable code that implements SOLID principles and common design patterns where applicable, and adheres to published coding standards
• Excellent attention to detail, with the ability to analyse, investigate and compare large data sets when required
• 3 or more years of programming using Scala
• 2 or more years of programming using Python
• Some experience of using Terraform to provision and deploy cloud services and components
• Experience of developing on Apache Spark
• Experience of developing with AWS cloud services, including (but not limited to) AWS Glue, S3, Step Functions, Lambdas, EventBridge and SQS
• BDD / TDD experience
• Jenkins CI/CD experience
• Application lifecycle management tools: Bitbucket & Jira
• Performing pull request reviews
• Understanding of Agile methodologies
• Automated testing tools

Advantageous Experience
• Mentoring or coaching junior engineers
• Cloud solution architecture
• Document databases
• Relational databases
• Experience with container technologies (e.g. Kubernetes)

Would Consider Alternative Skills And Experience
• Java (rather than Scala)
• Google Cloud or Microsoft Azure (rather than AWS)
• Azure Pipelines or TeamCity (rather than Jenkins)
• GitHub (rather than Bitbucket)
• Azure DevOps (rather than Jira)
• CloudFormation (rather than Terraform)

Additional Information
Our uniqueness is that we celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what matters: DEI, work/life balance, development, authenticity, collaboration, wellness, reward & recognition, volunteering... the list goes on.
Experian's people-first approach is award-winning: World's Best Workplaces™ 2024 (Fortune Global Top 25), Great Place To Work™ in 24 countries, and Glassdoor Best Places to Work 2024, to name a few. Check out Experian Life on social media or our Careers Site and Glassdoor to understand why. Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, color, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity.

Benefits
Experian cares about employees' work-life balance, health, safety and wellbeing. In support of this endeavour, we offer best-in-class family wellbeing benefits, enhanced medical benefits and paid time off.

Experian Careers - Creating a better tomorrow together
Posted 1 day ago
0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Database Engineer – DevOps | Java/Python
📍 Mumbai | 🏢 4 days/week on-site | 🕒 Full-time | 📌 Permanent
Company: Vlink

Vlink is looking for a skilled Database + DevOps Engineer to join our team in Mumbai! You'll work on high-impact financial data systems used globally, blending software engineering, automation, and database expertise to drive scalability and performance. If you have strong experience in Java or Python, DevOps tools, and database development, we want to hear from you!

🔧 What You'll Work On
✅ Automate database provisioning, configuration & monitoring
✅ Build internal APIs and self-service tools for DB ops
✅ Drive schema migrations, backups, and disaster recovery planning
✅ Collaborate with DBAs to automate and scale operations
✅ Contribute to CI/CD, IaC, and platform reliability initiatives

✅ Must-Have Skills
• Strong coding in Java or Python (Go is a plus), with experience building production-grade automation or tooling
• Solid DevOps experience, including: source control (Git, Azure DevOps), CI/CD pipeline development, automation tools (Ansible / AWX), and operating systems (Linux and Windows)
• Hands-on experience with databases, such as: relational (PostgreSQL, SQL Server, Oracle, SAP Sybase ASE), NoSQL (Apache Cassandra, Cosmos DB), and data warehouse (Snowflake, Greenplum, SAP Sybase IQ)
• Experience with cloud platforms (AWS, Azure, or GCP), including managed database services
• Familiarity with data replication tools (Sybase Replication, MSSQL HA, HVR/Fivetran) and with monitoring, alerting, and performance tuning for infrastructure and databases

👉 Ready to build the future of database operations with us? Apply now or reach out via DM to learn more!
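The provisioning-and-backup automation described above can be sketched using the standard library's SQLite module as a stand-in for a production database such as PostgreSQL or SQL Server; the schema and helper names below are invented for illustration:

```python
# Sketch of self-service database automation: provision a schema, seed it,
# and take an online backup. SQLite (stdlib) stands in for a real RDBMS,
# where the backup step would shell out to pg_dump or a vendor tool.
import sqlite3

def provision(conn):
    # Idempotent schema creation, as a migration tool would apply it.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS trades (id INTEGER PRIMARY KEY, symbol TEXT)"
    )

def backup(source, dest):
    # sqlite3's online backup API copies the live database safely.
    source.backup(dest)

src = sqlite3.connect(":memory:")
provision(src)
src.execute("INSERT INTO trades (symbol) VALUES ('ACME')")

dst = sqlite3.connect(":memory:")
backup(src, dst)
rows = dst.execute("SELECT symbol FROM trades").fetchall()
print(rows)  # [('ACME',)]
```

A self-service tool for DB ops would wrap functions like these behind an internal API, so application teams can request databases and restores without filing tickets.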
Posted 1 day ago
6.0 years
0 Lacs
India
On-site
Job Description:

Responsibilities:
• Develop and implement data models and algorithms to solve complex business problems.
• Utilize Databricks to manage and analyse large datasets efficiently.
• Collaborate with cross-functional teams to understand business requirements and deliver data-driven insights.
• Design and build scalable data pipelines and ETL processes.
• Perform data exploration, preprocessing, and feature engineering.
• Conduct statistical analysis and machine learning model development.
• Communicate findings and insights to stakeholders through data visualization and reports.
• Stay current with industry trends and best practices in data science and big data technologies.

Requirements:
• Minimum 6 years of experience as a Data Scientist required.
• Proven experience as a Data Scientist or similar role.
• Proficiency with Databricks and its ecosystem.
• Strong programming skills in Python, R, or Scala.
• Experience with big data technologies such as Apache Spark and Databricks.
• Knowledge of SQL and experience with relational databases.
• Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud).
• Strong analytical and problem-solving skills.
• Excellent communication and teamwork abilities.
• Bachelor's degree in Data Science, Computer Science, Statistics, or a related field (or equivalent experience).

Preferred Qualifications:
• Advanced degree (Master's or Ph.D.) in a relevant field.
• Experience with machine learning frameworks (e.g., TensorFlow, PyTorch).
• Knowledge of data visualization tools (e.g., Tableau, Power BI).
• Familiarity with version control systems (e.g., Git).
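As a toy illustration of the statistical-model development mentioned above, here is a single-feature linear regression fit by ordinary least squares in plain Python; real work on Databricks would use Spark MLlib or scikit-learn, and the data here is made up:

```python
# Tiny one-feature linear regression via ordinary least squares:
# slope = cov(x, y) / var(x), intercept chosen so the line passes
# through the mean point (mean(x), mean(y)).
def fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (
        sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        / sum((x - mx) ** 2 for x in xs)
    )
    return slope, my - slope * mx

xs = [1, 2, 3, 4]
ys = [3, 5, 7, 9]          # generated from y = 2x + 1 exactly
slope, intercept = fit(xs, ys)
print(slope, intercept)    # 2.0 1.0
```

The same estimator generalizes to many features as the normal equations; at Databricks scale the fitting would be distributed across a Spark cluster rather than done in a single pass like this.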
Posted 1 day ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Explore your next opportunity at an organization that ranks among the world's 500 largest companies. Envision innovative opportunities, experience our rewarding culture, and work with talented teams that push you to grow every day. We know what it takes to lead UPS into the future: passionate people with a unique combination of skills. If you have the qualities, motivation, autonomy or leadership to lead teams, there are roles suited to your aspirations and skills, today and tomorrow.

Job Summary
This position participates in the design, build, test, and delivery of Machine Learning (ML) models and software components that solve challenging business problems for the organization, working in collaboration with the Business, Product, Architecture, Engineering, and Data Science teams. This position engages in assessment and analysis of structured and unstructured data sources (internal and external) to uncover opportunities for ML and Artificial Intelligence (AI) automation, predictive methods, and quantitative modeling across the organization. This position establishes and configures scalable and cost-effective end-to-end solution design pattern components to support prediction model transactions. This position designs trials and tests to measure the success of software and systems, and works with teams, or individually, to implement ML/AI models at production scale.

Responsibilities
The MLOps developer works on maintaining existing models that support applications such as the digital insurance application and the claims recommendation engine.
They will be responsible for setting up cloud monitoring jobs and for performing quality assurance and testing of edge cases to ensure the ML product works within the application. They will also need to be on call on weekends to bring the application back online in case of failure.
• Studies and transforms data science prototypes into ML systems using appropriate datasets and data representation models.
• Researches and implements appropriate ML algorithms and tools that create new systems and processes powered with ML and AI tools and techniques, according to business requirements.
• Collaborates with others to deliver ML products and systems for the organization.
• Designs workflows and analysis tools to streamline the development of new ML models at scale.
• Creates and evolves ML models and software that enable state-of-the-art intelligent systems, using best practices in all aspects of the engineering and modelling lifecycles.
• Extends existing ML libraries and frameworks with developments in the Data Science and Machine Learning field.
• Establishes, configures, and supports scalable Cloud components that serve prediction model transactions.
• Integrates data from authoritative internal and external sources to form the foundation of a new Data Product that delivers insights supporting business outcomes necessary for ML systems.

Qualifications / Requirements:
• Ability to code in Python/Spark, with enough knowledge of Apache Beam to build Beam jobs in Dataproc for data transfer.
• Experience designing and building data-intensive solutions using distributed computing within a multi-line business environment.
• Familiarity with Machine Learning and Artificial Intelligence frameworks (e.g., Keras, PyTorch), libraries (e.g., scikit-learn), and tools and Cloud-AI technologies that aid in streamlining the development of Machine Learning or AI systems.
• Experience in establishing and configuring scalable and cost-effective end-to-end solution design pattern components to support the serving of batch and live-streaming prediction model transactions.
• Possesses creative and critical thinking skills.
• Experience in developing Machine Learning models such as classification/regression models, NLP models, and deep learning models, with a focus on productionizing those models into product features.
• Experience with scalable data processing, feature development, and model optimization.
• Solid understanding of statistics, such as forecasting, time series, hypothesis testing, classification, clustering or regression analysis, and how to apply that knowledge in understanding and evaluating Machine Learning models.
• Knowledgeable in the software development lifecycle (SDLC), Agile development practices, and cloud technology infrastructures and patterns related to product development.
• Advanced math skills in Linear Algebra, Bayesian Statistics, and Group Theory.
• Works collaboratively, both in a technical and cross-functional context.
• Strong written and verbal communication.
• Bachelor's (BS/BA) degree in a quantitative field such as mathematics, computer science, physics, economics, engineering, or statistics (operations research, quantitative social science, etc.), an international equivalent, or equivalent job experience.

Contract type: Permanent (CDI)
At UPS, equal opportunity, fair treatment and an inclusive work environment are key values to which we are committed.
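Evaluating a deployed classification model, as the qualifications above call for, typically starts with precision and recall. A minimal sketch in plain Python; the labels are fabricated for the example:

```python
# Precision and recall for a binary classifier, the basic health metrics
# an MLOps monitoring job would track for a model serving in production.
def precision_recall(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0   # of predicted positives, how many were right
    recall = tp / (tp + fn) if tp + fn else 0.0      # of actual positives, how many were found
    return precision, recall

y_true = [1, 0, 1, 1, 0]   # ground-truth labels
y_pred = [1, 1, 1, 0, 0]   # model predictions
print(precision_recall(y_true, y_pred))  # (2/3, 2/3)
```

A cloud monitoring job would compute these over a sliding window of scored traffic and alert when either metric drifts below an agreed threshold.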
Posted 1 day ago
3.0 years
0 Lacs
Greater Kolkata Area
On-site
Job brief
• We are looking for a Software Developer specializing in .NET to build software using languages and technologies of the .NET framework.
• You should be a pro with third-party API integration and application programming.
• In this role, you should be able to write smooth, functional code with a sharp eye for spotting defects.
• You should be a team player and an excellent communicator.
• If you are also passionate about the .NET framework and software design/architecture, we'd like to meet you.
• Your goal will be to work with internal teams to design, develop and maintain software.

Responsibilities
• Participate in requirements analysis.
• Work in a development team to develop integrated ASP.NET applications.
• Write clean, scalable code using the ASP.NET Framework, C#, and REST APIs.
• Write SQL Server queries and normalize SQL table structures.
• Revise, update, refactor and debug code.
• Improve existing software.
• Develop documentation throughout the software development life cycle (SDLC).
• Serve as an expert on applications and provide technical support.

Requirements
• At least 3 years of software development using ASP.NET, MVC, C#, web application forms, and API integrations.
• Hands-on experience in SQL Server and design/architectural patterns for .NET Framework web applications.
• Experienced in Bootstrap, jQuery, HTML, CSS3 and XML.
• Experienced with architecture styles/APIs (REST, Web API, JSON).
• Excellent troubleshooting skills.
• Excellent English communication skills to be able to work with a global team.
(This is mandatory)
• BSc/B.Tech/BCA in Computer Science, Engineering, or a related field

Must-Have Skill Set:
• ASP.NET (Core)
• C#
• SQL/NoSQL (Microsoft SQL, PostgreSQL, SQLite, etc.)
• Modern frontend frameworks (Blazor, React, etc.)
• Third-party SOAP and REST API integrations
• HTML & CSS
• JavaScript
• jQuery
• Bootstrap
• Knowledge of standard testing and CI tools such as Jenkins

Good-to-Have Skill Set:
• .NET MVC
• .NET MAUI (Xamarin)
• Experience with CRM development
• Experience in the ISP, telephony and MSP industries
• Experience with Apache HTTP & Nginx
• Experience with Debian & Debian-based Linux server distributions (e.g. Ubuntu)

Other Details:
• Shift timings: 1:15 to 10:30 pm, Monday to Friday; 1:15 to 6:30 pm on alternate Saturdays
• Work mode: Full-time & on-site
• Drop facilities provided
• Medical insurance cover for you and your family
• Free café facilities

Our Brands:
• https://www.v4consumer.co.uk/
• https://www.v4one.co.uk/

TO APPLY: Please mail your updated resume to puja.ganguly@salescom.in
Posted 1 day ago
0.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Line of Service: Advisory | Industry/Sector: Not Applicable | Specialism: Risk | Management Level: Associate

Job Description & Summary
A career within Internal Audit services will provide you with an opportunity to gain an understanding of an organisation's objectives, regulatory and risk management environment, and the diverse needs of their critical stakeholders. We focus on helping organisations look deeper and see further, considering areas like culture and behaviours, to help improve and embed controls. In short, we seek to address the right risks and ultimately add value to their organisation.

At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:

Architecture Design:
· Design and implement scalable, secure, and high-performance architectures for Generative AI applications.
· Integrate Generative AI models into existing platforms, ensuring compatibility and performance optimization.

Model Development and Deployment:
· Fine-tune pre-trained generative models for domain-specific use cases.
· Define the data collection, sanitization and data preparation strategy for model fine-tuning.
· Well versed in machine learning approaches such as supervised, unsupervised and reinforcement learning, and deep learning.
· Well versed in ML models such as linear regression, decision trees, gradient boosting, random forest and k-means.
· Evaluate, select, and deploy appropriate Generative AI frameworks (e.g., PyTorch, TensorFlow, CrewAI, AutoGen, LangGraph, agentic code, agent flows).

Innovation and Strategy:
· Stay up to date with the latest advancements in Generative AI and recommend innovative applications to solve complex business problems.
· Define and execute the AI strategy roadmap, identifying key opportunities for AI transformation.
· Good exposure to agentic design patterns.

Collaboration and Leadership:
· Collaborate with cross-functional teams, including data scientists, engineers, and business stakeholders.
· Mentor and guide team members on AI/ML best practices and architectural decisions.
· Should be able to lead a team of data scientists, GenAI engineers and software developers.

Performance Optimization:
· Monitor the performance of deployed AI models and systems, ensuring robustness and accuracy.
· Optimize computational costs and infrastructure utilization for large-scale deployments.

Ethical and Responsible AI:
· Ensure compliance with ethical AI practices, data privacy regulations, and governance frameworks.
· Implement safeguards to mitigate bias, misuse, and unintended consequences of Generative AI.

Mandatory skill sets:
· Advanced programming skills in Python and fluency in data processing frameworks like Apache Spark.
· Experience with machine learning and artificial intelligence frameworks, models and libraries (TensorFlow, PyTorch, scikit-learn, etc.).
· Strong knowledge of foundation LLMs (OpenAI GPT-4o, o1, Claude, Gemini, etc.) as well as open-source models like Llama 3.2 and Phi.
· Proven track record with event-driven architectures and real-time data processing systems.
· Familiarity with Azure DevOps and other LLMOps tools for operationalizing AI workflows.
· Deep experience with Azure OpenAI Service and vector databases, including API integrations, prompt engineering, and model fine-tuning, or equivalent technologies in AWS/GCP.
· Knowledge of containerization technologies such as Kubernetes and Docker.
· Comprehensive understanding of data lakes and strategies for data management.
· Expertise in LLM frameworks including LangChain, LlamaIndex, and Semantic Kernel.
· Proficiency in cloud computing platforms such as Azure or AWS.
· Exceptional leadership, problem-solving, and analytical abilities.
· Superior communication and collaboration skills, with experience managing high-performing teams.
· Ability to operate effectively in a dynamic, fast-paced environment.

Preferred skill sets:
· Experience with additional technologies such as Datadog and Splunk.
· Programming languages like C#, R, and Scala.
· Possession of relevant solution architecture certificates and continuous professional development in data engineering and GenAI.
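The vector-database and retrieval work mentioned above reduces to nearest-neighbour search over embeddings. A deliberately tiny sketch; the three-dimensional vectors and the `search` helper are illustrative stand-ins for a real embedding model and vector store, not any actual API:

```python
# Toy vector retrieval: cosine similarity over small embeddings, the core
# operation a vector database performs during retrieval-augmented generation.
import math

def cosine(a, b):
    """Cosine similarity of two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# A miniature "index": document name -> pretend embedding.
index = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.9, 0.2],
}

def search(query_vec, k=1):
    """Return the k documents whose embeddings are closest to the query."""
    ranked = sorted(index, key=lambda doc: cosine(index[doc], query_vec), reverse=True)
    return ranked[:k]

print(search([0.8, 0.2, 0.1]))  # ['refund policy']
```

In a production RAG system, the embeddings come from a model such as those behind Azure OpenAI, the index holds millions of vectors in a dedicated store, and the top-k documents are stuffed into the LLM prompt as grounding context.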
Years of experience required: 0-1 Years Education qualification: · BE / B.Tech / MCA / M.Sc / M.E / M.Tech Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor in Business Administration, Master of Business Administration, Bachelor of Engineering Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills Java Optional Skills Accepting Feedback, Accepting Feedback, Accounting and Financial Reporting Standards, Active Listening, Artificial Intelligence (AI) Platform, Auditing, Auditing Methodologies, Business Process Improvement, Communication, Compliance Auditing, Corporate Governance, Data Analysis and Interpretation, Data Ingestion, Data Modeling, Data Quality, Data Security, Data Transformation, Data Visualization, Emotional Regulation, Empathy, Financial Accounting, Financial Audit, Financial Reporting, Financial Statement Analysis, Generally Accepted Accounting Principles (GAAP) {+ 19 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date
Posted 1 day ago
0.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Risk Management Level Associate Job Description & Summary A career within Internal Audit services, will provide you with an opportunity to gain an understanding of an organisation’s objectives, regulatory and risk management environment, and the diverse needs of their critical stakeholders. We focus on helping organisations look deeper and see further considering areas like culture and behaviours to help improve and embed controls. In short, we seek to address the right risks and ultimately add value to their organisation. *Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us . At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true saelves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. " Job Description & Summary: A career within…. Responsibilities: Architecture Design: · Design and implement scalable, secure, and high-performance architectures for Generative AI applications. 
· Integrate Generative AI models into existing platforms, ensuring compatibility and performance optimization. Model Development and Deployment: · Fine-tune pre-trained generative models for domain-specific use cases. · Data Collection, Sanitization and Data Preparation strategy for Model fine tuning. · Well versed with machine learning algorithms like Supervised, unsupervised and Reinforcement learnings, Deep learning. · Well versed with ML models like Linear regression, Decision trees, Gradient boosting, Random Forest and K-means etc. · Evaluate, select, and deploy appropriate Generative AI frameworks (e.g., PyTorch, TensorFlow, Crew AI, Autogen, Langraph, Agentic code, Agent flow). Innovation and Strategy: · Stay up to date with the latest advancements in Generative AI and recommend innovative applications to solve complex business problems. · Define and execute the AI strategy roadmap, identifying key opportunities for AI transformation. · Good exposure to Agentic Design patterns Collaboration and Leadership: · Collaborate with cross-functional teams, including data scientists, engineers, and business stakeholders. · Mentor and guide team members on AI/ML best practices and architectural decisions. · Should be able to lead a team of data scientists, GenAI engineers and Software Developers. Performance Optimization: · Monitor the performance of deployed AI models and systems, ensuring robustness and accuracy. · Optimize computational costs and infrastructure utilization for large-scale deployments. Ethical and Responsible AI: · Ensure compliance with ethical AI practices, data privacy regulations, and governance frameworks. · Implement safeguards to mitigate bias, misuse, and unintended consequences of Generative AI. Mandatory skill sets: · Advanced programming skills in Python and fluency in data processing frameworks like Apache Spark. 
· Experience with machine learning and artificial intelligence frameworks, models, and libraries (TensorFlow, PyTorch, scikit-learn, etc.).
· Strong knowledge of foundational LLMs (OpenAI GPT-4o, o1, Claude, Gemini, etc.) as well as open-source models such as Llama 3.2 and Phi.
· Proven track record with event-driven architectures and real-time data processing systems.
· Familiarity with Azure DevOps and other LLMOps tools for operationalizing AI workflows.
· Deep experience with Azure OpenAI Service and vector databases, including API integrations, prompt engineering, and model fine-tuning, or equivalent technologies in AWS/GCP.
· Knowledge of containerization technologies such as Kubernetes and Docker.
· Comprehensive understanding of data lakes and strategies for data management.
· Expertise in LLM frameworks including LangChain, LlamaIndex, and Semantic Kernel.
· Proficiency in cloud computing platforms such as Azure or AWS.
· Exceptional leadership, problem-solving, and analytical abilities.
· Superior communication and collaboration skills, with experience managing high-performing teams.
· Ability to operate effectively in a dynamic, fast-paced environment.

Preferred skill sets:
· Experience with additional technologies such as Datadog and Splunk.
· Programming languages such as C#, R, and Scala.
· Relevant solution architecture certifications and continuous professional development in data engineering and Gen AI.
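The classical models named among the mandatory skills (linear regression, decision trees, k-means, etc.) are usually used via libraries such as scikit-learn, but candidates for this kind of role are often asked to reason about them from first principles. As an illustration only, Lloyd's algorithm for k-means can be sketched in plain Python; the sample data and parameter names below are invented for this sketch and are not part of the role description:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Toy k-means (Lloyd's algorithm): alternate assignment and centroid update."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # pick k initial centroids from the data
    for _ in range(iters):
        # Assignment step: attach each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centroids[i])))
            clusters[nearest].append(p)
        # Update step: move each centroid to the mean of its cluster.
        for i, members in enumerate(clusters):
            if members:
                dims = len(members[0])
                centroids[i] = tuple(sum(m[d] for m in members) / len(members)
                                     for d in range(dims))
    return centroids, clusters

# Two well-separated blobs; k-means should recover one centroid per blob.
data = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
centroids, clusters = kmeans(data, k=2)
```

Production use would rely on a vectorised implementation (e.g. scikit-learn's KMeans) with multiple initialisations; this sketch only shows the two alternating steps interviewers tend to probe.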
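Similarly, the vector-database and prompt-engineering requirements rest on one core operation: store (id, embedding) pairs and return the entries whose embeddings are most similar to a query embedding. A minimal cosine-similarity lookup, with hypothetical document ids and toy three-dimensional "embeddings" standing in for real model outputs, might look like:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def top_k(query, index, k=1):
    """index: list of (doc_id, embedding) pairs. Return the k best-matching ids."""
    ranked = sorted(index, key=lambda item: cosine(query, item[1]), reverse=True)
    return [doc_id for doc_id, _ in ranked[:k]]

# Toy index; a real system would use model-generated embeddings and an
# approximate-nearest-neighbour index rather than a linear scan.
index = [
    ("invoice_policy", [0.9, 0.1, 0.0]),
    ("travel_policy",  [0.1, 0.9, 0.1]),
]
best = top_k([0.85, 0.2, 0.0], index, k=1)
```

Vector databases and frameworks like LangChain or LlamaIndex wrap exactly this retrieval step with persistence, indexing, and prompt assembly on top.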
Years of experience required: 0-1 years

Education qualification:
· BE / B.Tech / MCA / M.Sc / M.E / M.Tech

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Business Administration, Master of Business Administration, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)

Required Skills: Java

Optional Skills: Accepting Feedback, Accounting and Financial Reporting Standards, Active Listening, Artificial Intelligence (AI) Platform, Auditing, Auditing Methodologies, Business Process Improvement, Communication, Compliance Auditing, Corporate Governance, Data Analysis and Interpretation, Data Ingestion, Data Modeling, Data Quality, Data Security, Data Transformation, Data Visualization, Emotional Regulation, Empathy, Financial Accounting, Financial Audit, Financial Reporting, Financial Statement Analysis, Generally Accepted Accounting Principles (GAAP) {+ 19 more}

Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date