4 - 8 years
900 - 1000 Lacs
Chennai
Remote
Join us and be a part of this journey as we write customer success stories about these products.

WHAT YOU DO
Interface with business customers, gathering and understanding requirements.
Interface with customer and Genesys data science teams in discovery, extraction, loading, data transformation, and analysis of results.
Define and apply a data-intuition process to cleanse and verify the integrity of the customer and Genesys data used for analysis.
Implement, own, and improve data pipelines using best practices in data modeling and ETL/ELT processes (see the sketch below).
Build, improve, and continuously optimize high-quality models.
Work with PS & Engineering to deliver specific customer requirements, and report back customer feedback, issues, and feature requests.
Continuously improve reporting, analysis, and the overall process.
Visualize, present, and demonstrate findings as required.
Perform knowledge transfer to customer and internal teams.
Communicate within the global community, respecting cultural, language, and time zone variations.
Demonstrate flexibility to adjust working hours to match customer and team interactions.

ABOUT YOU
Bachelor's or Master's degree in a quantitative field (e.g., Computer Science, Statistics, Engineering)
5+ years of relevant experience in Data Science or Data Engineering
5+ years of hands-on experience in Elasticsearch, Kibana, and real-time analytics solution development
Hands-on application development experience in AWS/Azure and experience with Snowflake, Tableau, or Power BI
Expertise with major statistical and analytical software such as Python, R, or SAS
Good working knowledge of a programming language such as Java or Node.js
Application development background with contact center product suites such as Genesys, Avaya, or Cisco is an added advantage
Expertise with data modeling, data warehousing, and ETL/ELT development
Expertise with database solutions such as SQL, MongoDB, Redshift, Hadoop, and Hive
Proficiency with REST APIs, JSON, and AWS
Experience working on and delivering projects independently; ability to multi-task and context-switch between projects and tasks
Curiosity, passion, and drive for data queries, analysis, quality, and models
Excellent communication, initiative, and coordination skills with great attention to detail; ability to explain and discuss complex topics with both experts and business leaders
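As a rough illustration of the pipeline work described above, here is a minimal sketch (not taken from the posting) that cleanses a few records and bulk-loads them into Elasticsearch using the official Python client. The index name, field names, and connection details are assumptions.

```python
# Minimal ETL sketch: cleanse records and bulk-load them into Elasticsearch.
# Index name, fields, and connection details are illustrative assumptions.
from elasticsearch import Elasticsearch, helpers

es = Elasticsearch("http://localhost:9200")  # assumed local cluster

raw_records = [
    {"agent_id": " 42 ", "handle_time_sec": "310", "queue": "support"},
    {"agent_id": "17", "handle_time_sec": None, "queue": "sales"},
]

def cleanse(record):
    """Trim strings, coerce types, and drop rows missing required fields."""
    if record.get("handle_time_sec") is None:
        return None
    return {
        "agent_id": int(str(record["agent_id"]).strip()),
        "handle_time_sec": int(record["handle_time_sec"]),
        "queue": record["queue"],
    }

actions = (
    {"_index": "contact-center-metrics", "_source": doc}
    for doc in (cleanse(r) for r in raw_records)
    if doc is not None
)

helpers.bulk(es, actions)  # index the cleansed documents in one bulk request
```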
Posted 2 months ago
2 - 7 years
4 - 9 Lacs
Pune
Work from Office
Project Role: Software Development Lead
Project Role Description: Develop and configure software systems either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes, and tools to support a client, project, or entity.
Must-have skills: Splunk
Good-to-have skills: NA
Minimum experience: 2 year(s)
Educational Qualification: 15 years of full-time education
Position Name: SPLUNK/ELK Developer

Professional & Technical Skills:
Proficiency in Splunk and ELK administration and development.
Hands-on experience with ELK Stack components (Elasticsearch, Logstash, Kibana) and their seamless integration.
Log management: utilize the ELK stack to collect, process, and analyze log data, ensuring efficient log management and searchability.
Familiarity with Kibana dashboard creation, health checks, Linux system administration, shell scripting, and at least one cloud platform.
Develop field extractions, lookups, and data transformations to ensure accurate and meaningful data analysis.
Create dashboards, alerts, saved searches, lookups, macros, field extractions, field transformations, tags, and event types.
Experience architecting and administering Splunk distributed environments with components such as Universal/Heavy Forwarders, Indexers, Cluster Masters, Deployment Servers, Search Heads, License Masters, and Search Head Clusters.
Manage and edit configuration files such as indexes.conf, inputs.conf, outputs.conf, props.conf, transforms.conf, and server.conf.
Experience with log parsing, complex Splunk searches, and external table lookups (see the sketch below).
Create and manage KPIs, Glass Tables, and Service Health Scores to provide real-time visibility into IT operations.
Install, configure, and maintain Splunk, Splunk add-ons, and apps.
Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.

Additional Information:
The candidate should have a minimum of 2 years of experience in Splunk. This position is based at our Pune office. 15 years of full-time education is required.
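As a hedged illustration of running Splunk searches programmatically, the sketch below submits a one-shot search to the Splunk management REST API with the requests library. The host, credentials, index, and sourcetype are placeholder assumptions, not values from the posting.

```python
# Rough sketch: run a one-shot Splunk search over the management REST API.
# Host, credentials, index, and sourcetype are illustrative assumptions.
import requests

SPLUNK_HOST = "https://splunk.example.com:8089"  # assumed management port

response = requests.post(
    f"{SPLUNK_HOST}/services/search/jobs",
    auth=("admin", "changeme"),          # placeholder credentials
    verify=False,                        # self-signed certs are common in labs
    data={
        "search": "search index=main sourcetype=access_combined status>=500 "
                  "| stats count by host",
        "exec_mode": "oneshot",          # return results synchronously
        "output_mode": "json",
    },
)
response.raise_for_status()
for result in response.json().get("results", []):
    print(result["host"], result["count"])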
Posted 2 months ago
0 years
0 Lacs
Hyderabad, Telangana
Work from Office
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

We are seeking a talented Solutions Architect with a keen interest in Amazon Connect-based contact center solutions and telephony. The ideal candidate will have foundational knowledge in contact center technologies and a solid desire to learn and grow alongside industry experts. This role offers the opportunity to work on cutting-edge projects involving AI, GenAI, and cloud-based platforms.

Primary Responsibilities:
Solution Design Support: Assist in designing and developing Amazon Connect-based contact center solutions (see the sketch at the end of this posting). Contribute to focus areas such as Product Development, Data & Analytics, Routing, Desktop/CTI, WFM/WFO, and SBC/Telephony. Participate in integrating AI and GenAI technologies into contact center solutions.
Technical Contribution: Work hands-on to create accelerators and tools that enhance the productivity of engineering and delivery teams. Support the collection of requirements and assist in converting them into technical specifications in collaboration with engineering and delivery teams.
Problem Solving & Support: Help address production and non-production issues by providing timely support and solutions. Participate in the end-to-end process from feature grooming to Day 2 support.
Collaboration & Communication: Collaborate with cross-functional teams to understand project requirements and deliverables. Clearly articulate ideas and technical concepts in both written and verbal formats. Utilize design tools like Draw.io, PlantUML, Mermaid, PowerPoint, Miro, and Figma to create documentation and diagrams.
Learning & Development: Develop a deep understanding of application, technology, and data architecture principles. Acquire knowledge of protocol stacks and data entities relevant to contact center technologies. Stay updated with industry standards and technologies from vendors like Amazon, Google, Microsoft, Oracle, etc.

Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
Completion of a graduate degree
Hands-on experience with a cloud-native tech stack and diverse technologies: Java, public cloud (Azure), containers and microservices (Docker, Spring Boot), RDBMS (MySQL) and NoSQL (Cassandra, MongoDB, Elastic), APIs (REST, GraphQL), API gateways (Kong, etc.), data streaming (Kafka), visualization (Grafana, Kibana), the ELK stack (Elasticsearch, Logstash, and Kibana), API Gateway, GenAI, and AI/ML
Experience in solution architecture with a focus on contact center technologies and telephony systems; this could include experience as a full-stack engineer on the platforms mentioned above
Experience with design and documentation tools such as Draw.io, PlantUML, Mermaid, PowerPoint, Word, Excel, Miro, and Figma
Basic understanding of application development and architecture principles
Proven solid communication and interpersonal skills
Proven ability to articulate thoughts clearly and effectively in written and verbal communication
Proven eagerness to learn and adapt to new technologies and methodologies
Proven analytical mindset with problem-solving abilities

Preferred Qualifications:
Certifications such as AWS Certified Cloud Practitioner or AWS Certified Developer - Associate
Experience with agile development processes and collaboration tools
Exposure to AI and GenAI technologies
Knowledge of cloud platforms (AWS, Azure, Google Cloud) and basic AI concepts
Familiarity with agile methodologies and DevOps practices

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location, and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and to enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
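For the Amazon Connect work described in the posting above, here is a minimal, hedged sketch using boto3 to enumerate standard queues on a Connect instance. The region and instance ID are placeholders and credentials are assumed to come from the standard AWS configuration.

```python
# Minimal sketch: list standard queues on an Amazon Connect instance with boto3.
# The region and instance ID are placeholder assumptions.
import boto3

connect = boto3.client("connect", region_name="us-east-1")  # assumed region

INSTANCE_ID = "11111111-2222-3333-4444-555555555555"  # placeholder instance ID

response = connect.list_queues(InstanceId=INSTANCE_ID, QueueTypes=["STANDARD"])
for queue in response["QueueSummaryList"]:
    print(queue["Name"], queue["Id"])
```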
Posted 2 months ago
3 - 8 years
4 - 9 Lacs
Bangalore Rural, Bengaluru
Work from Office
Skills: Elasticsearch, Talend, Grafana
Responsibilities: Build dashboards, manage clusters, optimize performance
Tech: API, Python, cloud platforms (AWS, Azure, GCP)
Preference: Immediate joiners
Contact: 6383826448 || jensyofficial23@gmail.com
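As a small illustration of the cluster-management responsibility above, this hedged sketch checks Elasticsearch cluster health and per-index document counts; the connection details are assumptions.

```python
# Quick cluster-management sketch: check Elasticsearch cluster health and
# per-index document counts. Connection details are illustrative assumptions.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # assumed local cluster

health = es.cluster.health()
print("status:", health["status"], "| nodes:", health["number_of_nodes"])

# cat.indices returns one summary dict per index when format="json"
for index in es.cat.indices(format="json"):
    print(index["index"], index["docs.count"], index["store.size"])
```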
Posted 2 months ago
6 - 9 years
12 - 22 Lacs
Bengaluru
Hybrid
Skills: Go, Kafka, RestAssured, REST web services, NoSQL, ELK Stack (Elasticsearch, Logstash, Kibana), Git (GitHub, GitLab, Bitbucket, SVN), Postgres/PostgreSQL, Couchbase, Jenkins, Docker, Kubernetes

Key Responsibilities:
Responsible for designing system solutions, developing custom applications, and modifying existing applications to meet distinct and changing business requirements.
Handle coding, debugging, and documentation, as well as working closely with the SRE team. Provide post-implementation and ongoing production support.
Develop and design software applications, translating user needs into system architecture. Assess and validate application performance and the integration of component systems, and provide process flow diagrams. Test the engineering resilience of software and automation tools.
You will be challenged with identifying innovative ideas and proofs of concept to deliver against the existing and future needs of our customers.
Software Engineers who join our Loyalty Technology team will be assigned to one of several exciting teams that are developing a new, nimble, and modern loyalty platform which will support the key element of connecting with our customers where they are and how they choose to interact with American Express.
Be part of an enthusiastic, high-performing technology team developing solutions to drive engagement and loyalty within our existing cardmember base and attract new customers to the Amex brand.
The position will also play a critical role partnering with other development teams, testing and quality, and production support to meet implementation dates and allow a smooth transition throughout the development lifecycle.
The successful candidate will be focused on building and executing against a strategy and roadmap for moving from monolithic, tightly coupled, batch-based legacy platforms to a loosely coupled, event-driven, microservices-based architecture to meet our long-term business goals (see the sketch below).

Minimum Qualifications:
Bachelor's degree in Computer Science, Engineering, or a related field, followed by 6+ years of experience in a modern development stack: Golang, Kafka, REST APIs.
Experience in application design, software development, and testing in an Agile environment.
Experience with relational and NoSQL databases, including Elasticsearch, PostgreSQL, Couchbase, or Cassandra.
Experience designing and developing REST APIs for high-volume clients.
Experience with continuous integration tools (Jenkins, GitHub). Experience with automated build and test frameworks is a plus.
Experience in American Express Technologies is highly desired.
A proven hunger to learn new technologies and translate them into working software.
Experience with container and container orchestration technologies, such as Docker and Kubernetes.
Experience with Atlassian software development and collaboration tools (JIRA, Confluence, etc.).
Strong ability to develop unique, outside-the-box ideas.
Strong analytical, problem-solving, and quantitative skills.
Willingness to take risks, experiment, and share fresh perspectives.
Aptitude for learning and applying programming concepts.
Ability to effectively communicate with internal and external business partners.

Preferred Additional:
Knowledge of the Loyalty/Rewards and credit card industry
Coding skills across a variety of distributed technologies
Experience with open-source frameworks is a plus, especially maintaining or contributing to open-source projects
Experience with a broad range of software languages and payments technologies
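To make the event-driven, loosely coupled pattern mentioned above concrete, here is a generic, hedged consumer sketch reacting to events from a Kafka topic. The topic, broker, consumer group, and event shape are assumptions, and although the role itself is Go-centric, Python is used here purely for illustration.

```python
# Illustrative sketch of an event-driven consumer: react to loyalty events
# arriving on a Kafka topic. Topic name, broker, group, and event schema are
# assumptions; this is not the posting's actual stack or code.
import json
from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "loyalty-events",                          # assumed topic name
    bootstrap_servers="localhost:9092",        # assumed broker
    group_id="rewards-service",                # assumed consumer group
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # Each service reacts to events independently, keeping components loosely coupled.
    if event.get("type") == "POINTS_EARNED":
        print(f"credit {event['points']} points to member {event['member_id']}")
```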
Posted 2 months ago
8.0 years
0 Lacs
Hyderabad, Telangana
On-site
General Information
Country: India
State: Telangana
City: Hyderabad
Job ID: 42999
Department: Development
Experience Level: MID_SENIOR_LEVEL
Employment Status: FULL_TIME
Workplace Type: Hybrid

Description & Requirements
As a software lead, you will play a critical role in defining and driving the architectural vision of our RPA product. You will ensure technical excellence, mentor engineering teams, and collaborate across departments to deliver innovative automation solutions. This is a unique opportunity to influence the future of RPA technology and make a significant impact on the industry.

RESPONSIBILITIES:
Define and lead the architectural design and development of the RPA product, ensuring solutions are scalable, maintainable, and aligned with organizational strategic goals. Provide technical leadership and mentor team members on architectural best practices.
Analyze and resolve complex technical challenges, including performance bottlenecks, scalability issues, and integration challenges, to ensure high system reliability and performance.
Collaborate with cross-functional stakeholders, including product managers, QA, and engineering teams, to define system requirements, prioritize technical objectives, and design cohesive solutions. Provide architectural insights during sprint planning and agile processes.
Establish and enforce coding standards, best practices, and guidelines across the engineering team, conducting code reviews with a focus on architecture, maintainability, and future scalability.
Develop and maintain comprehensive documentation for system architecture, design decisions, and implementation details, ensuring knowledge transfer and facilitating team collaboration.
Architect and oversee robust testing strategies, including automated unit, integration, and regression tests, to ensure adherence to quality standards and efficient system validation.
Research and integrate emerging technologies, particularly advancements in RPA and automation, to continually enhance the product's capabilities and technical stack. Drive innovation and implement best practices within the team.
Serve as a technical mentor and advisor to engineering teams, fostering professional growth and ensuring alignment with the overall architectural vision.
Ensure that the RPA product adheres to security and compliance standards by incorporating secure design principles, conducting regular security reviews, and implementing necessary safeguards to protect data integrity, confidentiality, and availability.

EDUCATION & EXPERIENCE:
Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field. 8+ years of professional experience in software development.

REQUIRED SKILLS:
Expertise in object-oriented programming languages such as Java, C#, or similar, with a strong understanding of design patterns and principles.
Deep familiarity with software development best practices, version control systems (e.g., Git), and continuous integration/continuous delivery (CI/CD) workflows.
Proven experience deploying and managing infrastructure on cloud platforms such as AWS, Azure, or Google Cloud, including knowledge of containerization technologies like Docker and orchestration tools like Kubernetes.
Strong proficiency in architecting, building, and optimizing RESTful APIs and microservices, with familiarity with tools like Swagger/OpenAPI and Postman for design and testing.
Comprehensive knowledge of SQL databases (e.g., PostgreSQL, SQL Server), with expertise in designing scalable and reliable data models, including creating detailed Entity-Relationship Diagrams (ERDs) and optimizing database schemas for performance and maintainability.
Demonstrated experience in building and maintaining robust CI/CD pipelines using tools such as Jenkins or GitLab CI.
Demonstrated ability to lead teams in identifying and resolving complex software and infrastructure issues using advanced troubleshooting techniques and tools.
Exceptional communication and leadership skills, with the ability to guide and collaborate with cross-functional teams, bridging technical and non-technical stakeholders.
Excellent written and verbal communication skills, with a focus on documenting technical designs, code, and system processes clearly and concisely.
Comfortable and experienced in agile development environments, demonstrating adaptability to evolving requirements and timelines while maintaining high productivity and focus on deliverables.
Familiarity with security best practices in software development, such as OWASP guidelines, secure coding principles, and implementing authentication/authorization frameworks (e.g., OAuth, SAML, JWT); see the sketch below.
Experience with microservices architecture, message brokers (e.g., RabbitMQ, Kafka), and event-driven design.
Extensive experience in performance optimization and scalability, with a focus on designing high-performance systems and utilizing profiling tools and techniques to optimize both code and infrastructure for maximum efficiency.

PREFERRED SKILLS:
Experience with serverless architecture, including deploying and managing serverless applications using platforms such as AWS Lambda, Azure Functions, or Google Cloud Functions, to build scalable, cost-effective solutions.
Experience with RPA tools or frameworks (e.g., UiPath, Automation Anywhere, Blue Prism) is a plus.
Experience with Generative AI technologies, including working with frameworks like TensorFlow, PyTorch, or Hugging Face, and integrating AI/ML models into software applications.
Hands-on experience with data analytics or logging tools like the ELK Stack (Elasticsearch, Logstash, Kibana) or Splunk for monitoring and troubleshooting application performance.
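For the authentication/authorization item above, here is a minimal, hedged sketch of token-based auth using PyJWT with an HMAC-signed JWT. The secret, issuer name, and claim set are placeholder assumptions, not part of the posting.

```python
# Minimal token-based auth sketch (HS256-signed JWT) using PyJWT.
# The secret, issuer, and claim names are placeholder assumptions.
import datetime
import jwt  # pip install PyJWT

SECRET = "replace-with-a-real-secret"

def issue_token(user_id: str) -> str:
    """Issue a short-lived access token for a user."""
    claims = {
        "sub": user_id,
        "iss": "rpa-control-plane",  # hypothetical issuer name
        "exp": datetime.datetime.now(datetime.timezone.utc)
               + datetime.timedelta(minutes=15),
    }
    return jwt.encode(claims, SECRET, algorithm="HS256")

def verify_token(token: str) -> dict:
    """Verify signature, expiry, and issuer; raises jwt.InvalidTokenError on failure."""
    return jwt.decode(token, SECRET, algorithms=["HS256"], issuer="rpa-control-plane")

token = issue_token("user-42")
print(verify_token(token)["sub"])  # -> user-42
```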
About Infor
Infor is a global leader in business cloud software products for companies in industry-specific markets. Infor builds complete industry suites in the cloud and efficiently deploys technology that puts the user experience first, leverages data science, and integrates easily into existing systems. Over 60,000 organizations worldwide rely on Infor to help overcome market disruptions and achieve business-wide digital transformation. For more information visit www.infor.com

Our Values
At Infor, we strive for an environment that is founded on a business philosophy called Principle Based Management™ (PBM™) and eight Guiding Principles: integrity, stewardship & compliance, transformation, principled entrepreneurship, knowledge, humility, respect, and self-actualization. Increasing diversity is important to reflect our markets, customers, partners, and the communities we serve now and in the future. We have a relentless commitment to a culture based on PBM. Informed by the principles that allow a free and open society to flourish, PBM™ prepares individuals to innovate, improve, and transform while fostering a healthy, growing organization that creates long-term value for its clients and supporters and fulfillment for its employees.

Infor is an Equal Opportunity Employer. We are committed to creating a diverse and inclusive work environment. Infor does not discriminate against candidates or employees because of their sex, race, gender identity, disability, age, sexual orientation, religion, national origin, veteran status, or any other protected status under the law. If you require accommodation or assistance at any time during the application or selection processes, please submit a request by following the directions located in the FAQ section at the bottom of the infor.com/about/careers webpage.
Posted 2 months ago
10 - 15 years
22 - 37 Lacs
Mumbai
Work from Office
Who We Are
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward - always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.

The Role
As an ELK (Elasticsearch, Logstash, and Kibana) architect, you will be responsible for designing and implementing the data architecture and infrastructure for data analytics, log management, and visualization solutions using the ELK stack. You will collaborate with cross-functional teams, including data engineers, developers, system administrators, and stakeholders, to define data requirements, design data models, and ensure efficient data processing, storage, and retrieval. Your expertise in ELK and data architecture will be instrumental in building scalable and performant data solutions.

Responsibilities:
1. Data Architecture Design: Collaborate with stakeholders to understand business requirements and define the data architecture strategy for ELK-based solutions. Design scalable and robust data models, data flows, and data integration patterns.
2. ELK Stack Implementation: Lead the implementation and configuration of the ELK stack infrastructure to support data ingestion, processing, indexing, and visualization. Ensure high availability, fault tolerance, and optimal performance of the ELK environment.
3. Data Ingestion and Integration: Design and implement efficient data ingestion pipelines using Logstash or other relevant technologies. Integrate data from various sources, such as databases, APIs, logs, AppDynamics, storage, and streaming platforms, into ELK for real-time and batch processing.
4. Data Modeling and Indexing: Design and optimize Elasticsearch indices and mappings to enable fast and accurate search and analysis. Define index templates, shard configurations, and document structures to ensure efficient storage and retrieval of data (see the sketch below).
5. Data Visualization and Reporting: Collaborate with stakeholders to understand data visualization and reporting requirements. Utilize Kibana to design and develop visually appealing and interactive dashboards, reports, and visualizations that enable data-driven decision-making.
6. Performance Optimization: Analyze and optimize the performance of data processing and retrieval in ELK. Tune Elasticsearch settings, queries, and aggregations to improve search speed and response time. Optimize data storage, caching, and memory management.
7. Data Security and Compliance: Implement security measures and access controls to protect sensitive data stored in ELK. Ensure compliance with data privacy regulations and industry standards by implementing appropriate encryption, access controls, and auditing mechanisms.
8. Documentation and Collaboration: Create and maintain documentation of data models, data flows, system configurations, and best practices. Collaborate with cross-functional teams, providing guidance and support on data architecture and ELK-related topics.

Who You Are
The candidate should have a minimum of 8+ years of experience. Apply architectural methods. Design information system architecture. Lead systems engineering management. AD & AI leadership.
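As a rough illustration of responsibility 4 (index templates, shard configuration, and mappings), this hedged sketch registers a composable index template through the Elasticsearch REST API. The template name, index pattern, shard counts, and fields are assumptions, not values from the posting.

```python
# Minimal sketch: register an index template that fixes shard counts and
# field mappings for log indices. Names, shard counts, and fields are
# illustrative assumptions.
import requests

template = {
    "index_patterns": ["app-logs-*"],          # assumed index naming pattern
    "template": {
        "settings": {
            "number_of_shards": 3,             # sized for expected volume
            "number_of_replicas": 1,
        },
        "mappings": {
            "properties": {
                "@timestamp": {"type": "date"},
                "service":    {"type": "keyword"},
                "level":      {"type": "keyword"},
                "message":    {"type": "text"},
            }
        },
    },
    "priority": 100,
}

resp = requests.put(
    "http://localhost:9200/_index_template/app-logs",  # assumed local cluster
    json=template,
)
resp.raise_for_status()
print(resp.json())  # {'acknowledged': True} on success
```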
Being You
Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you - and everyone next to you - the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.

What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees, and support you and your family through the moments that matter - wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred!
If you know someone that works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.
Posted 2 months ago
0.0 - 5.0 years
0 Lacs
Indore, Madhya Pradesh
On-site
Job Information
Job Opening ID: ZR_672_JOB
Date Opened: 05/06/2025
Industry: IT Services
Work Experience: 3-5 years
Job Type: Full time
Salary: Confidential
City: Indore
State/Province: Madhya Pradesh
Country: India
Zip/Postal Code: 452001

Job Description
Company- is a rapidly growing, private equity backed SaaS product company that is powering some of the world's most important missions. We were founded by engineers and built to give our outstanding teams the best possible environment to write great code and build world-class products that our customers love. We are looking for a DevOps engineer who can help refine our development processes and infrastructure to bring the latest technology and methodologies to the team.

RESPONSIBILITIES:
Manage, maintain, and deliver tasks on the DevOps roadmap.
Help shape, maintain, and support large-scale distributed cloud infrastructure.
Manage our ECS and EKS infrastructure (see the sketch below).
Exposure to infrastructure-as-code tools such as Terraform and Ansible.
Implement and manage containerization solutions using Docker and orchestration tools such as Kubernetes.
Maintain continuous integration and deployment pipelines.
Maintain documentation of infrastructure.

REQUIREMENTS:
2+ years as an engineer on a software development team.
Prior experience working in cross-functional teams.
Systems architecture and design skills.
Proficiency in scripting languages such as Bash, Python, or PowerShell.
Experience with CI/CD tools such as Jenkins, GitLab CI/CD, or CircleCI.
Build and deployment automation experience, especially in a containerized world.
Proficiency with common ops tools (ECS, Logstash, Datadog + Kibana, EKS, etc.).
Experience with AWS or Azure.
Comfort maintaining live production systems.
Strong communication and collaboration skills, with the ability to work effectively in a fast-paced team environment.
Experience with microservices architectures and serverless computing.
Knowledge of security best practices for cloud environments, including identity and access management, network security, and encryption.

Benefits: As per industry.
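As a small operational illustration of the ECS management item above, here is a hedged boto3 sketch that lists ECS clusters and reports running versus desired task counts per service. The region is a placeholder and credentials are assumed to come from the standard AWS configuration.

```python
# Operational sketch: list ECS clusters and report running vs. desired task
# counts per service. Region and credentials are assumed, not from the posting.
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")  # assumed region

for cluster_arn in ecs.list_clusters()["clusterArns"]:
    service_arns = ecs.list_services(cluster=cluster_arn)["serviceArns"]
    if not service_arns:
        continue
    described = ecs.describe_services(cluster=cluster_arn, services=service_arns)
    for svc in described["services"]:
        print(
            f"{cluster_arn.split('/')[-1]} / {svc['serviceName']}: "
            f"{svc['runningCount']}/{svc['desiredCount']} tasks running"
        )
```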
Posted 2 months ago