Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
3.0 - 5.0 years
0 Lacs
Bangalore Urban, Karnataka, India
On-site
About Posium Posium is an AI startup developing state-of-the-art AI agents that automate end-to-end software testing. We are backed by top global investors, including Sequoia, and our team comprises talented individuals from Georgia Tech, IIT, Uber, and Microsoft. We operate as a fully distributed organization built around a high-performance culture and an inclusive, collaborative team. Location - Bangalore Experience - 3 to 5 years We are looking for a Senior Software Engineer - Backend. We are looking for engineers who work like owners as you will be tasked with leading the development of one of our 5 major components of the Posium platform. You will work with experienced engineers on a modern serverless stack built using Javascript (React and Node) and Golang with an in-house real-time data analytics engine on Postgres and Elasticsearch among others. Responsibilities · Develop new features and enhance existing features. · Design, code and manage automated test scripts, continuous builds and deployment. At Posium, you are responsible for testing the features you develop. · Self-document development processes. · Help your fellow DevOps and junior engineers working on our cloud infrastructure and other product components to ensure code quality and robust architecture. Requirements and Qualifications · BE/BTech in Computer Science and Engineering · Experience with backend development (at least 3 years) · Hands-on experience with Go/Java/Node · Understanding of GraphQL · Understanding of AWS and cloud-native technologies is preferred · Strong problem-solving, service architecture, and coding skills Show more Show less
Posted 1 week ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Technical Lead Order process (PHP Developer) Job Description: We are looking for an experienced and passionate Tech Lead to join our e-commerce organization for Order Processing Domain. The ideal candidate should have a strong background in microservices API integrations, software development, and e-commerce. The role requires excellent leadership skills, a deep understanding of agile working processes, and the ability to communicate effectively with both technical and non-technical stakeholders. Key Responsibilities: Lead the design, development, and deployment of e-commerce solutions. Work closely with Enterprise solution architect and collaborate with cross-functional teams to understand business needs and translate them into technical requirements. Act as a technical advisor, providing guidance on architectural decisions and technical challenges within the Order Processing domain. Foster a culture of continuous improvement and innovation, staying updated on industry trends. Define OpenAPI specifications and ensure they align with company standards. Develop software using technologies such as PHP, Simphony, laravel, , Kubernetes, Docker, Terraform, GCP, Bamboo, MySQL,No SQL database,,Elasticsearch, RabbitMQ, JavaScript, Playwrite . Support and mentor a team of developers to ensure successful project delivery. Ensure compliance with coding standards, security guidelines, and regulatory requirements. Qualifications: Bachelor's degree in Computer Science or a related field. Master's degree preferred. A minimum of 10 years of experience in software architecture, with at least 3 years in an e-commerce environment. Proficiency in development languages such as PHP,Simphony, laravel, Node.JS, kubernetes, Docker, Terraform, GCP, Bamboo, MySQL, ElasticSearch, RabbitMQ, JavaScript, Playwrite Strong understanding of software architecture principles and design patterns. Strong communication skills, ability to interact with technical and non-technical stakeholders. Strong leadership, work ethics, problem solving skills and commitment to excellence. Ability to prioritize and manage multiple projects simultaneously. Proven experience leading and managing development teams with Agile framework. High level of English proficiency. Familiarity with the DevOps lifecycle and CI/CD pipelines. Join our dynamic team and contribute to building cutting-edge solutions that drive business success. Apply now to be part of our growing organization! Show more Show less
Posted 1 week ago
2.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together. Join the EDG team as a Full Stack Software Engineer. The EDG team is responsible for Improve consumer experience by implementing an enterprise device gateway to manage device health signal acquisition, centralize consumer consent, facilitate efficient health signal distribution, and empower UHC with connected insights across the health and wellness ecosystem. The team has a strong and integrated relationship with the product team based on strong collaboration, trust, and partnership. Goals for the team are focused on creating meaningful positive impact for our customers through clear and measurable metrics analysis. Primary Responsibilities Write high-quality, fault tolerant code; normally 70% Backend and 30% Front-end (though the exact ratio will depend on your interest) Build high-scale systems, libraries, frameworks and create test plans Monitor production systems and provide on-call support Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so Required Qualifications BS in Computer Science, Engineering or a related technical role or equivalent experience 2+ years experience with JS libraries and frameworks, such as Angular, React or other 2+ years experience in Scala, Java, or other compiled language Preferred Qualifications Experience with web design Experience using RESTful APIs and asynchronous JS Experience in design and development Testing experience with Scala or Java Database and caching experience, SQL and NoSQL (Postgres, Elasticsearch, or MongoDB) Proven interest in learning Scala At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone-of every race, gender, sexuality, age, location and income-deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes — an enterprise priority reflected in our mission. Show more Show less
Posted 1 week ago
0 years
0 Lacs
India
Remote
Data Scientist Associate Responsibilities As a selected intern, your daily tasks will include: Engaging in data science projects and analytics. Developing and implementing data models, AI, ML, deep learning, NLP, GenAI, LangChain, LLM, LLAMA, OpenAI, and GPT-based solutions. Managing data pipelines, ETL/ELT processes, and data warehousing. Utilizing Python and its libraries for advanced programming tasks. Handling data collection, management, cleaning, and transformation. Creating data visualizations using BI tools such as Power BI, Kibana, and Google Data Studio. Working with databases like MongoDB, Neo4j, Dgraph, and SQL. Leveraging cloud platforms, including GCP, AWS, Azure, Linode, and Heroku. Required Skills Python Flask Django MongoDB API Development Elasticsearch Machine Learning Artificial Intelligence Job Details Work Mode: Remote (Work From Home) Start Date: Immediate Duration: 6 months Stipend: ₹10,000 – ₹12,000 per month Industry: Information Technology & Services Employment Type: Probation of 6 Months followed by Full-time Position based on performance Show more Show less
Posted 1 week ago
5.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Redefining Ads, E-Commerce and Creator Economy with AI, join the founding team of Renokon to make a positive impact on billions of lives. Role: Senior AI/ML Engineer Location: Kolkata (Onsite) CTC: ₹12 LPA - ₹22 LPA (Team Leader in 5 Years with ₹50 LPA+ CTC) CTC = 60% Cash + 40% Equity—structured for exponential upside as Renokon redefines AI-driven commerce. Equity vested over 4 years. It's a once in a lifetime opportunity to join Renokon's founding team. We are assembling a team of 7 world-class engineers to scale Renokon at lightning speed. Rupayan Das - Founder and CEO. Renokon is rewriting the rules of e-commerce and creator economy with AI. We're fixing the broken link between attention and action—seamlessly integrating AI-powered personalized voice-first sales agent, immersive shopping, and native monetization for creators. We’re looking for world-class Full Stack AI Engineer to join our team and help scale Renokon’s vision. If you're someone who thrives on building at scale, solving tough engineering challenges, and pushing the limits of AI-powered commerce, this is the place for you. Renokon's vision: To empower 200,000 Indian content creators to earn ₹10L/year by 2035. To generate $20B in annual products sale for our brand partners by 2035. Who are we looking for? 8+ years of hands-on experience. Comfortable working in Kolkata (onsite). Strong understanding of AI-driven recommendations, search intelligence, and real-time infrastructure. Passion for creator monetization, commerce, and revolutionizing digital ads. What You’ll Work On – Build the world's first voice-native sales agent for voice-based shopping. Optimize AI-powered search & recommendations with ElasticSearch + NLP models (BERT-based semantic search). Develop high-throughput ad bidding engines using XGBoost, DeepFM & RL-based bidding agents to maximize ad conversions. Build trust-driven commerce scoring models with Graph ML, Trust Rank & Behavioral Scoring (XGBoost) to ensure high-quality transactions. Develop creator scoring models using multi-factor ML models (virality, engagement, sales) to drive optimal brand-creator matching. Train NLP-powered content discovery systems leveraging GPT-4 Turbo & custom LLMs via Hugging Face for AI-driven recommendations. Ensure content moderation with CV + NLP models (HuggingFace + Perspective API) to maintain platform integrity. Develop LLM Co-Pilot fine-tuned on top creators, ad campaigns, and real-world commerce insights to assist brands in optimizing ads. Implement advanced A/B testing & experimentation using LaunchDarkly / Unleash to refine AI models based on user behavior. Optimize e-commerce personalization through Neural Collaborative Filtering (NCF) & DeepFM-based recommendation engines. Preferred Educational Qualification: Master’s degree or PhD in Computer Science, AI, Machine Learning, Data Science, or a related field (preferred but not mandatory) . Candidates with strong applied ML experience and contributions to open-source AI projects can be considered without a PhD. Why join us? Be part of building the trillion-dollar AI-driven commerce platform from scratch. Solve real-world challenges at scale—from creator monetization to ad conversion. Work alongside the founder, Rupayan Das and a highly skilled team, the backbone of Renokon. Shape the digital future in India. If you are a world-class engineer, you are extremely good at what you do and you’re ready to build something that will impact millions of people lives — join our founding team. 
Let’s build the future together. Show more Show less
Posted 1 week ago
3.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Description Who we are: At Kenvue, part of the Johnson & Johnson Family of Companies, we believe there is extraordinary power in everyday care. Built on over a century of heritage and propelled forward by science, our iconic brands—including NEUTROGENA®, AVEENO®, TYLENOL®, LISTERINE®, JOHNSON’S® and BAND-AID® —are category leaders trusted by millions of consumers who use our products to improve their daily lives. Our employees share a digital-first mindset, an approach to innovation grounded in deep human insights, and a commitment to continually earning a place for our products in consumers’ hearts and homes. What will you do: The Senior Engineer Kubernetes is a hands-on engineer responsible for designing, implementing, and managing Cloud Native Kubernetes-based platform ecosystem and solutions for organization. This includes developing and implementing containerization strategies, developer workflows, designing and deploying Kubernetes platform, and ensuring high availability and scalability of Kubernetes infrastructure aligned with modern GitOps practices. Key Responsibilities: Implement platform capabilities and containerization plan using Kubernetes, Docker, service mesh and other modern containerization tools and technologies. Design and collaborate with other engineering stakeholders in developing architecture patterns and templates for application runtime platform such as K8s Cluster topology, traffic shaping, API, CI CD, and observability aligned with DevSecOps principles. Automate Kubernetes infrastructure deployment and management using tools such as Terraform, Jenkins, Crossplane to develop self-service platform workflows. Serve as member of micro-services platform team to closely work with Security and Compliance organization to define controls. Develop self-service platform capabilities focused on developer workflows such as API, service mesh, external DNS, cert management and K8s life cycle management in general. Participate in a cross-functional IT Architecture group discussion that reviews design from an enterprise cloud platform perspective. Optimize Kubernetes platform infrastructure for high availability and scalability. What we are looking for Qualifications Bachelor’s Degree required, preferably in STEM field. 5+ years of progressive experience in a combination of development, design in areas of cloud computing. 3+ years of experience in developing cloud native platform capabilities based of Kubernetes (Preferred EKS and/or AKS). Strong Infrastructure as a Code (IaC) experience on public Cloud (AWS and/or Azure) Experience in working on a large scale, highly available, cloud native, multi-tenant, infrastructure platforms on public cloud, preferably in a consumer business Expertise in building platform using tools like Kubernetes, Istio, OpenShift, Linux, Helm, Terraform, CI/CD. Experience in working high scale, critically important products running across public clouds (AWS, Azure) and private data centers is a plus. Strong hand-on development experience with one or more of the following languages: Go, Scala, Java, Ruby, Python Prior experience on working in a team involved in re-architecting and migrating monolith applications to microservices will be a plus Prior experience of Observability, through tools such as Prometheus, Elasticsearch, Grafana, DataDog or Zipkin is a plus. Must have a solid understanding of Continuous Development and Deployment in AWS and/or Azure. 
Understanding of basic Linux kernel and window server operating system Experience in working with bash, PowerShell scripting. Must be results-driven, a quick learner, and a self-starter Cloud engineering experience is a plus. Qualifications Must be results-driven, a quick learner, and a self-starter Cloud engineering experience is a plus. Primary Location Asia Pacific-India-Karnataka-Bangalore Job Function Operations (IT)
Posted 1 week ago
0.0 - 5.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Comcast brings together the best in media and technology. We drive innovation to create the world's best entertainment and online experiences. As a Fortune 50 leader, we set the pace in a variety of innovative and fascinating businesses and create career opportunities across a wide range of locations and disciplines. We are at the forefront of change and move at an amazing pace, thanks to our remarkable people, who bring cutting-edge products and services to life for millions of customers every day. If you share in our passion for teamwork, our vision to revolutionize industries and our goal to lead the future in media and technology, we want you to fast-forward your career at Comcast. Job Summary Responsible for designing, building and overseeing the deployment and operation of technology architecture, solutions and software to capture, manage, store and utilize structured and unstructured data from internal and external sources. Establishes and builds processes and structures based on business and technical requirements to channel data from multiple inputs, route appropriately and store using any combination of distributed (cloud) structures, local databases, and other applicable storage forms as required. Develops technical tools and programming that leverage artificial intelligence, machine learning and big-data techniques to cleanse, organize and transform data and to maintain, defend and update data structures and integrity on an automated basis. Creates and establishes design standards and assurance processes for software, systems and applications development to ensure compatibility and operability of data connections, flows and storage requirements. Reviews internal and external business and product requirements for data operations and activity and suggests changes and upgrades to systems and storage to accommodate ongoing needs. Work with data modelers/analysts to understand the business problems they are trying to solve then create or augment data assets to feed their analysis. Integrates knowledge of business and functional priorities. Acts as a key contributor in a complex and crucial environment. May lead teams or projects and shares expertise. Job Description Position: Data Engineer 4 Experience: 8 years to 11.5 years Job Location: Chennai Tamil Nadu Job Description: Requirements Databases: Deep knowledge of relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra, Couchbase). ( MUST) Big Data Technologies: Experience with, Spark, Kafka, and other big data ecosystem tools. (NICE TO HAVE) Cloud Platforms: Experience with cloud services such as AWS, Azure, or Google Cloud Platform, with a particular focus on data engineering services . ( NICE TO HAVE ) Version Control: Experience with version control systems like Git. ( MUST) CI/CD: Knowledge of CI/CD pipelines for automating development and deployment processes. (MUST) Proficiency in Elasticsearch and experience managing large-scale clusters. ( MUST) Hands-on experience with containerization technologies like Docker and Kubernetes. (MUST docker) Strong programming skills in scripting languages such as Python, Bash, or similar. ( NICE to HAVE) Key Responsibilities Design, develop, and maintain scalable data pipelines and infrastructure. Ensure compliance with security regulations and implement advanced security measures to protect company data. Implement and manage CI/CD pipelines for data applications. 
Work with containerization technologies (Docker, Kubernetes) to deploy and manage data services. Optimize and manage Elasticsearch clusters for log ingestion , and tools such as Logstash, fluent.d , promtail, used to forward logs to Elastic Istance or other Log ingestion Tool . ( Loki+ Grafana). Collaborate with other departments (e.g., Data Science, IT, DevOps) to integrate data solutions with existing business systems. Optimize the performance of data pipelines and resolve data integrity and quality issues. Document data processes and architectures to ensure transparency and facilitate maintenance. Monitor industry trends and adopt best practices to continuously improve our data engineering solutions. Core Responsibilities Develops data structures and pipelines aligned to established standards and guidelines to organize, collect, standardize and transform data that helps generate insights and address reporting needs. Focuses on ensuring data quality during ingest, processing as well as final load to the target tables. Creates standard ingestion frameworks for structured and unstructured data as well as checking and reporting on the quality of the data being processed. Creates standard methods for end users / downstream applications to consume data including but not limited to database views, extracts and Application Programming Interfaces. Develops and maintains information systems (e.g., data warehouses, data lakes) including data access Application Programming Interfaces. Participates in the implementation of solutions via data architecture, data engineering, or data manipulation on both on-prem platforms like Kubernetes and Teradata as well as Cloud platforms like Databricks. Determines the appropriate storage platform across different on-prem (minIO and Teradata) and Cloud (AWS S3, Redshift) depending on the privacy, access and sensitivity requirements. Understands the data lineage from source to the final semantic layer along with the transformation rules applied to enable faster troubleshooting and impact analysis during changes. Collaborates with technology and platform management partners to optimize data sourcing and processing rules to ensure appropriate data quality as well as process optimization. Creates and establishes design standards and assurance processes for software, systems and applications development to ensure compatibility and operability of data connections, flows and storage requirements. Reviews internal and external business and product requirements for data operations and activity and suggests changes and upgrades to systems and storage to accommodate ongoing needs. Develops strategies for data acquisition, archive recovery, and database implementation. Manages data migrations/conversions and troubleshooting data processing issues. Understands the data sensitivity, customer data privacy rules and regulations and applies them consistently in all Information Lifecycle Management activities. Identifies and reacts to system notification and log to ensure quality standards for databases and applications. Solves abstract problems beyond single development language or situation by reusing data file and flags already set. Solves critical issues and shares knowledge such as trends, aggregate, quantity volume regarding specific data sources. Consistent exercise of independent judgment and discretion in matters of significance. Regular, consistent and punctual attendance. Must be able to work nights and weekends, variable schedule(s) as necessary. 
Other duties and responsibilities as assigned. Employees at all levels are expected to: Understand our Operating Principles; make them the guidelines for how you do your job. Own the customer experience - think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services. Know your stuff - be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences. Win as a team - make big things happen by working together and being open to new ideas. Be an active part of the Net Promoter System - a way of working that brings more employee and customer feedback into the company - by joining huddles, making call backs and helping us elevate opportunities to do better for our customers. Drive results and growth. Respect and promote inclusion & diversity. Do what's right for each other, our customers, investors and our communities. Disclaimer:This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications. Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law. Base pay is one part of the Total Rewards that Comcast provides to compensate and recognize employees for their work. Most sales positions are eligible for a Commission under the terms of an applicable plan, while most non-sales positions are eligible for a Bonus. Additionally, Comcast provides best-in-class Benefits to eligible employees. We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That’s why we provide an array of options, expert guidance and always-on tools, that are personalized to meet the needs of your reality – to help support you physically, financially and emotionally through the big milestones and in your everyday life. Please visit the compensation and benefits summary on our careers site for more details. Education Bachelor's Degree While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience. Relevant Work Experience 7-10 Years
Posted 1 week ago
0.0 years
0 Lacs
Chennai, Tamil Nadu
Remote
IT Full-Time Job ID: DGC00686 Chennai, Tamil Nadu 5-10 Yrs ₹5.25 - ₹12 Yearly Job description Software Engineering Java Engineer Senior India Remote We are looking for a Senior Backend Software Engineer to join Intellias. As our Senior Backend Software Engineer: You will participate in the team s technical/architectural discussions and decisions. You will participate in the complete software development life cycle from discovery through coding, testing, deployment, and maintenance. Daily, you will learn and grow your skills, striving for mastery using state-of-the-art technologies and practices such as AWS, Microservices, Docker and much more! What project we have for you We have a dream: to change industries through the power of digital technology. With a team of top-notch engineers by your side, you will develop groundbreaking solutions at Intellias. Let s code the future together! What you will do Participate in solution investigation, estimations, planning, and alignment with other teams. Design, implement, deliver, and support backend solutions (restful web services) using micro-services architecture in Apache Camel integration framework. Promote and implement test automation on the application level (e.g., unit tests, integration tests) and work closely with the Test Engineer. Work closely with the team in an agile and collaborative environment. This will involve code reviews, knowledge sharing, and incident coordination. Participate in the complete software development life cycle from discovery through coding, testing, deployment, and maintenance. Maintain created applications during the UAT phase after development. Deploy applications on the cloud using technologies such as Docker, Kubernetes, AWS and Terraform. What you need for this Qualifications: Must have 5+ years of experience with Java and Spring framework You can understand the architecture landscape and technically investigate and implement new features independently. Being responsible for the quality of the solution you deliver is natural for you. You have experience with unit testing and Test-Driven Development. Empathetic and able to quickly build relationships. Good English verbal and written communication skills. Experience working within Agile practices and knowledge of Agile values & principles. Experience working with Microservices. Ready to work with on calls duties approximately 1 week on call every 8 weeks. Nice to have: Experience with DevOps tools and practices (container orchestration, CI, monitoring and alerting, AWS & Kubernetes) Experience in the e-commerce domain. Technologies / frameworks / practices: Must have Java 21 (required 11+), Spring framework (Boot) Experience with NoSQL DB Essential experience with AWS Cloud GIT Microservices Testing (jUnit 5) Scrum, Code Review Nice to have Apache Camel ElasticSearch, MongoDB AWS (DocumentDB, SQS, SNS, Secret Manager, IAM, S3) Grafana, ELK stack, Prometheus Terraform Kubernetes, Docker Testing (TestContainers) CI/CD with Jenkins pipeline
Posted 1 week ago
0.0 - 5.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Overview Our analysts transform data into meaningful insights that drive strategic decision making. They analyze trends, interpret data, and discover opportunities. Working cross-functionally, they craft narratives from the numbers - directly contributing to our success. Their work influences key business decisions and shape the direction of Comcast. Success Profile What makes a successful Data Engineer 4 at Comcast? Check out these top traits and explore role-specific skills in the job description below. Good Listener Problem Solver Organized Collaborative Perceptive Analytical Benefits We’re proud to offer comprehensive benefits to help support you physically, financially and emotionally through the big milestones and in your everyday life. Paid Time off We know how important it can be to spend time away from work to relax, recover from illness, or take time to care for others needs. Physical Wellbeing We offer a range of benefits and support programs to ensure that you and your loved ones get the care you need. Financial Wellbeing These benefits give you personalized support designed entirely around your unique needs today and for the future. Emotional Wellbeing No matter how you’re feeling or what you’re dealing with, there are benefits to help when you need it, in the way that works for you. Life Events + Family Support Benefits that support you no matter where you are in life’s journey. Data Engineer 4 Location Chennai, India Req ID R412866 Job Type Full Time Category Analytics Date posted 06/10/2025 Comcast brings together the best in media and technology. We drive innovation to create the world's best entertainment and online experiences. As a Fortune 50 leader, we set the pace in a variety of innovative and fascinating businesses and create career opportunities across a wide range of locations and disciplines. We are at the forefront of change and move at an amazing pace, thanks to our remarkable people, who bring cutting-edge products and services to life for millions of customers every day. If you share in our passion for teamwork, our vision to revolutionize industries and our goal to lead the future in media and technology, we want you to fast-forward your career at Comcast. Job Summary Responsible for designing, building and overseeing the deployment and operation of technology architecture, solutions and software to capture, manage, store and utilize structured and unstructured data from internal and external sources. Establishes and builds processes and structures based on business and technical requirements to channel data from multiple inputs, route appropriately and store using any combination of distributed (cloud) structures, local databases, and other applicable storage forms as required. Develops technical tools and programming that leverage artificial intelligence, machine learning and big-data techniques to cleanse, organize and transform data and to maintain, defend and update data structures and integrity on an automated basis. Creates and establishes design standards and assurance processes for software, systems and applications development to ensure compatibility and operability of data connections, flows and storage requirements. Reviews internal and external business and product requirements for data operations and activity and suggests changes and upgrades to systems and storage to accommodate ongoing needs. Work with data modelers/analysts to understand the business problems they are trying to solve then create or augment data assets to feed their analysis. 
Integrates knowledge of business and functional priorities. Acts as a key contributor in a complex and crucial environment. May lead teams or projects and shares expertise. Job Description Position: Data Engineer 4 Experience: 8 years to 11.5 years Job Location: Chennai Tamil Nadu Job Description: Requirements Databases: Deep knowledge of relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra, Couchbase). ( MUST) Big Data Technologies: Experience with, Spark, Kafka, and other big data ecosystem tools. (NICE TO HAVE) Cloud Platforms: Experience with cloud services such as AWS, Azure, or Google Cloud Platform, with a particular focus on data engineering services . ( NICE TO HAVE ) Version Control: Experience with version control systems like Git. ( MUST) CI/CD: Knowledge of CI/CD pipelines for automating development and deployment processes. (MUST) Proficiency in Elasticsearch and experience managing large-scale clusters. ( MUST) Hands-on experience with containerization technologies like Docker and Kubernetes. (MUST docker) Strong programming skills in scripting languages such as Python, Bash, or similar. ( NICE to HAVE) Key Responsibilities Design, develop, and maintain scalable data pipelines and infrastructure. Ensure compliance with security regulations and implement advanced security measures to protect company data. Implement and manage CI/CD pipelines for data applications. Work with containerization technologies (Docker, Kubernetes) to deploy and manage data services. Optimize and manage Elasticsearch clusters for log ingestion , and tools such as Logstash, fluent.d , promtail, used to forward logs to Elastic Istance or other Log ingestion Tool . ( Loki+ Grafana). Collaborate with other departments (e.g., Data Science, IT, DevOps) to integrate data solutions with existing business systems. Optimize the performance of data pipelines and resolve data integrity and quality issues. Document data processes and architectures to ensure transparency and facilitate maintenance. Monitor industry trends and adopt best practices to continuously improve our data engineering solutions. Core Responsibilities Develops data structures and pipelines aligned to established standards and guidelines to organize, collect, standardize and transform data that helps generate insights and address reporting needs. Focuses on ensuring data quality during ingest, processing as well as final load to the target tables. Creates standard ingestion frameworks for structured and unstructured data as well as checking and reporting on the quality of the data being processed. Creates standard methods for end users / downstream applications to consume data including but not limited to database views, extracts and Application Programming Interfaces. Develops and maintains information systems (e.g., data warehouses, data lakes) including data access Application Programming Interfaces. Participates in the implementation of solutions via data architecture, data engineering, or data manipulation on both on-prem platforms like Kubernetes and Teradata as well as Cloud platforms like Databricks. Determines the appropriate storage platform across different on-prem (minIO and Teradata) and Cloud (AWS S3, Redshift) depending on the privacy, access and sensitivity requirements. Understands the data lineage from source to the final semantic layer along with the transformation rules applied to enable faster troubleshooting and impact analysis during changes. 
Collaborates with technology and platform management partners to optimize data sourcing and processing rules to ensure appropriate data quality as well as process optimization. Creates and establishes design standards and assurance processes for software, systems and applications development to ensure compatibility and operability of data connections, flows and storage requirements. Reviews internal and external business and product requirements for data operations and activity and suggests changes and upgrades to systems and storage to accommodate ongoing needs. Develops strategies for data acquisition, archive recovery, and database implementation. Manages data migrations/conversions and troubleshooting data processing issues. Understands the data sensitivity, customer data privacy rules and regulations and applies them consistently in all Information Lifecycle Management activities. Identifies and reacts to system notification and log to ensure quality standards for databases and applications. Solves abstract problems beyond single development language or situation by reusing data file and flags already set. Solves critical issues and shares knowledge such as trends, aggregate, quantity volume regarding specific data sources. Consistent exercise of independent judgment and discretion in matters of significance. Regular, consistent and punctual attendance. Must be able to work nights and weekends, variable schedule(s) as necessary. Other duties and responsibilities as assigned. Employees at all levels are expected to: Understand our Operating Principles; make them the guidelines for how you do your job. Own the customer experience - think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services. Know your stuff - be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences. Win as a team - make big things happen by working together and being open to new ideas. Be an active part of the Net Promoter System - a way of working that brings more employee and customer feedback into the company - by joining huddles, making call backs and helping us elevate opportunities to do better for our customers. Drive results and growth. Respect and promote inclusion & diversity. Do what's right for each other, our customers, investors and our communities. Disclaimer:This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications. Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law. Base pay is one part of the Total Rewards that Comcast provides to compensate and recognize employees for their work. Most sales positions are eligible for a Commission under the terms of an applicable plan, while most non-sales positions are eligible for a Bonus. Additionally, Comcast provides best-in-class Benefits to eligible employees. We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. 
That’s why we provide an array of options, expert guidance and always-on tools, that are personalized to meet the needs of your reality – to help support you physically, financially and emotionally through the big milestones and in your everyday life. Please visit the compensation and benefits summary on our careers site for more details. Education Bachelor's Degree While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience. Relevant Work Experience 7-10 Years
Posted 1 week ago
0.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
IT Full-Time Job ID: DGC00684 Chennai, Tamil Nadu 3-5 Yrs ₹3.5 - ₹6.25 Yearly Job description Job Skills: Must have skills : Experience in designing and developing Web Apps using Java Spring Boot development, Core Java concepts around dependent technologies Expertise in building microservices Good understanding of EMS/NMS OOPs Concepts Design Patterns Clean understanding of Classes and Interfaces Generics, JVM and Memory Management, Caching Data into memory, Service Oriented Architecture, Concurrency (multithreading) Messaging Techniques, Complex Event Processing, Storage and Database Technologies (MongoDB, MySQL, ElasticSearch) Exposure to compiler like IDE OR ECLIPSE Value Add: Working experience with Dockers K8s Working knowledge on OSGI Knowledge on kafka, python, and ELK Good communication skills Scrum Master Problem solving skills You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/ systems and if you think you fit right in to help our clients navigate their next in their digital transformation journey.
Posted 1 week ago
0.0 - 5.0 years
0 Lacs
Chennai, Tamil Nadu
Remote
IT Full-Time Job ID: DGC00689 Chennai, Tamil Nadu 3-6 Yrs ₹3.5 - ₹07 Yearly Job description Experience: 1.5 to 5 years Location: Remote Employment Type: Full-Time Job Summary: We are looking for talented Full Stack Python Developers (Junior and Senior levels) who are passionate about building scalable web applications. You will work closely with cross-functional teams to design, develop, and deliver robust enterprise solutions using modern technologies such as Python, ReactJS, AWS, and more. Responsibilities : Design, develop, test, deploy, and maintain scalable enterprise web applications. Build responsive front-end applications using ReactJS . Develop robust backend services and RESTful APIs using Python (Django / Flask / FastAPI). Work on Microservices architecture and cloud-based platforms such as AWS . Utilize Docker and Terraform for DevOps activities and infrastructure management. Participate in code reviews and Agile Scrum practices. (Senior Role) Architect solutions and ensure adherence to coding standards. (Senior Role) Mentor junior developers and contribute to technical leadership. Requirements : For Junior Developer (2 to 3 years): 2 to 3 years of experience with Python (Django / Flask / FastAPI). Experience contributing to Microservices architecture. Proficient in ReactJS , JavaScript/jQuery , CSS , HTML5 . Familiarity with Postgres , DynamoDB , SQL queries. Exposure to AWS , Docker , Terraform is a plus. Strong problem-solving and collaboration skills. Eagerness to learn and work in a fast-paced environment. For Senior Developer (3 to 5 years): 3 to 5 years of hands-on experience with Python (Django / Flask / FastAPI). Proven experience building and architecting Microservices . Proficient in ReactJS , JavaScript/jQuery , CSS , HTML5 . Strong experience with Postgres , DynamoDB , SQL queries. Hands-on experience with Terraform , Docker , AWS services. Familiarity with AWS S3 , ElasticSearch is a plus. Strong problem-solving, leadership, and communication skills. Ability to mentor junior team members and drive best practices.
Posted 1 week ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Skills: AWS, Docker, Jenkins, Python, REST APIs, NoSQL, Elasticsearch, Object-Oriented Programming, Job Overview We are seeking a Senior Python Developer to join our team on a full-time, hybrid basis. This role requires a minimum of 6 years of work experience in Python development. The successful candidate will play a crucial role in advancing our software solutions and consulting services, contributing to the growth and success of Chiselon Technologies. Qualifications And Skills Proficient in Python development with at least 6 years of hands-on experience in software programming and solutions development. Must possess expertise in Docker, Jenkins, and Python (Mandatory skill) for continuous integration and microservices deployment. Comprehensive understanding of AWS cloud services and their applications in software development and deployment. Excellent skills in creating and consuming RESTful APIs to ensure seamless integration and communication between services. Strong knowledge of NoSQL databases for handling large datasets and providing quick access to the stored data. Experience with Elasticsearch for implementing powerful search and analytical features in applications. Solid command of Object-Oriented Programming principles to develop scalable and optimized software solutions. Effective problem-solving skills with the ability to analyze complex data and provide innovative solutions. Roles And Responsibilities Develop and maintain high-quality Python applications that align with business goals and technical requirements. Collaborate with cross-functional teams to define, design, and ship new features and improvements. Ensure the performance, quality, and responsiveness of applications through regular testing and code reviews. Manage and oversee the complete software development lifecycle, from requirements gathering to deployment and maintenance. Utilize Docker, Jenkins, and other continuous integration tools to streamline development processes and optimize performance. Implement AWS solutions for hosting and scaling applications effectively and cost-efficiently. Build and enhance RESTful APIs to facilitate seamless interaction between internal and external systems. Troubleshoot complex issues related to software functionality, performance, and reliability, ensuring timely resolutions. Show more Show less
Posted 1 week ago
6.0 years
0 Lacs
Anantapur, Andhra Pradesh, India
On-site
Skills: AWS, Docker, Jenkins, Python, REST APIs, NoSQL, Elasticsearch, Object-Oriented Programming, Job Overview We are seeking a Senior Python Developer to join our team on a full-time, hybrid basis. This role requires a minimum of 6 years of work experience in Python development. The successful candidate will play a crucial role in advancing our software solutions and consulting services, contributing to the growth and success of Chiselon Technologies. Qualifications And Skills Proficient in Python development with at least 6 years of hands-on experience in software programming and solutions development. Must possess expertise in Docker, Jenkins, and Python (Mandatory skill) for continuous integration and microservices deployment. Comprehensive understanding of AWS cloud services and their applications in software development and deployment. Excellent skills in creating and consuming RESTful APIs to ensure seamless integration and communication between services. Strong knowledge of NoSQL databases for handling large datasets and providing quick access to the stored data. Experience with Elasticsearch for implementing powerful search and analytical features in applications. Solid command of Object-Oriented Programming principles to develop scalable and optimized software solutions. Effective problem-solving skills with the ability to analyze complex data and provide innovative solutions. Roles And Responsibilities Develop and maintain high-quality Python applications that align with business goals and technical requirements. Collaborate with cross-functional teams to define, design, and ship new features and improvements. Ensure the performance, quality, and responsiveness of applications through regular testing and code reviews. Manage and oversee the complete software development lifecycle, from requirements gathering to deployment and maintenance. Utilize Docker, Jenkins, and other continuous integration tools to streamline development processes and optimize performance. Implement AWS solutions for hosting and scaling applications effectively and cost-efficiently. Build and enhance RESTful APIs to facilitate seamless interaction between internal and external systems. Troubleshoot complex issues related to software functionality, performance, and reliability, ensuring timely resolutions. Show more Show less
Posted 1 week ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
WHAT MAKES US A GREAT PLACE TO WORK We are proud to be consistently recognized as one of the world’s best places to work. We are currently the #1 ranked consulting firm on Glassdoor’s Best Places to Work list and have maintained a spot in the top four on Glassdoor’s list since its founding in 2009. Extraordinary teams are at the heart of our business strategy, but these don’t happen by chance. They require intentional focus on bringing together a broad set of backgrounds, cultures, experiences, perspectives, and skills in a supportive and inclusive work environment. We hire people with exceptional talent and create an environment in which every individual can thrive professionally and personally. WHO YOU’LL WORK WITH You’ll join our Engineering experts within the AI, Insights & Solutions team. This team is part of Bain’s digital capabilities practice, which includes experts in analytics, engineering, product management, and design. In this multidisciplinary environment, you'll leverage deep technical expertise with business acumen to help clients tackle their most transformative challenges. You’ll work on integrated teams alongside our general consultants and clients to develop data-driven strategies and innovative solutions. Together, we create human-centric solutions that harness the power of data and artificial intelligence to drive competitive advantage for our clients. Our collaborative and supportive work environment fosters creativity and continuous learning, enabling us to consistently deliver exceptional results. We are committed to building a diverse and inclusive team and encourage candidates of all backgrounds to apply. Bain offers comprehensive benefits and flexible policies that are designed to support you, so you can thrive personally and professionally. WHAT YOU’LL DO Design, develop, and maintain cloud-based AI applications, leveraging a full-stack technology stack to deliver high-quality, scalable, and secure solutions. Collaborate with cross-functional teams, including product managers, data scientists, and other engineers, to define and implement analytics features and functionality that meet business requirements and user needs. Utilize Kubernetes and containerization technologies to deploy, manage, and scale analytics applications in cloud environments, ensuring optimal performance and availability. Develop and maintain APIs and microservices to expose analytics functionality to internal and external consumers, adhering to best practices for API design and documentation. Implement robust security measures to protect sensitive data and ensure compliance with data privacy regulations and organizational policies. Continuously monitor and troubleshoot application performance, identifying and resolving issues that impact system reliability, latency, and user experience. Participate in code reviews and contribute to the establishment and enforcement of coding standards and best practices to ensure high-quality, maintainable code. Champion best demonstrated practices in software engineering, and share learnings with team members in AIS about theoretical and technical developments in software engineering Stay current with emerging trends and technologies in cloud computing, data analytics, and software engineering, and proactively identify opportunities to enhance the capabilities of the analytics platform. 
Collaborate with DevOps and infrastructure teams to automate deployment and release processes, implement CI/CD pipelines, and optimize the development workflow for the analytics engineering team. Collaborate closely with and influence business consulting staff and leaders as part of multi-disciplinary teams to assess opportunities and develop analytics solutions for Bain clients across a variety of sectors. Influence, educate and directly support the analytics application engineering capabilities of our clients There may be significant travel requirements (up to 30%) due to the international nature of our business. ABOUT YOU Required Bachelors/Master’s degree in Computer Science, Engineering, or a related technical field is a plus. 4+ years of professional hands-on experience in web development, programming languages, version control, software design pattern, infrastructure and deployment, integration and unit testing implementation Experience with server-side technologies such as, Django, Flask, Fast API Experience with client-side technologies such as React, Angular, Vue.js, HTML and CSS Experience with cloud platforms and services (AWS, Azure, GCP) via Terraform Automation (good to have) Working knowledge (3+ years) of Python Demonstrated interest with LLMs, Prompt engineering, Langchain Exposure to software architecture, DB design, scalability and SQL Experience with RDBMS (e.g. MySQL, PostgreSQL, SQLite, SQL Server, Oracle) and NoSQLs databases (e.g. MongoDB, Cassandra, Elasticsearch) Exposure to working in accordance with DevSecOps principles, and familiarity with industry deployment best practices using CI/CD tools, MLOps, LLMOps and infrastructure as code (Jenkins, Docker, Kubernetes, and Terraform) Strong knowledge in designing API interfaces Knowledge of data architecture, database schema design and database scalability Strong interpersonal and communication skills, including the ability to explain and discuss complex engineering technicalities with colleagues and clients from other disciplines at their level of cognition Curiosity, proactivity and critical thinking Strong computer science fundaments in data structures, algorithms, automated testing, object-oriented programming, performance complexity, and implications of computer architecture on software performance. Experience working according to agile principles Location: Bengaluru, New Delhi, Mumbai (Hybrid) Show more Show less
Posted 1 week ago
55.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Capgemini is actively seeking a skilled Elasticsearch Developer to join our team! Responsibilities: Maintain and optimize existing applications built with Spring Boot and Elasticsearch Monitor and troubleshoot application performance issues, ensuring high availability and reliability Implement and manage Elasticsearch clusters, including indexing, querying, and data management Develop and maintain RESTful APIs using Spring Boot to interact with Elasticsearch Collaborate with development teams to integrate new features and enhancements Perform regular updates and patches to ensure security and compliance Document processes, configurations, and troubleshooting steps Excellent problem-solving skills and attention to detail Provide support and guidance to other team members on Elasticsearch and Spring Boot best practices Requirements Skills required: Elasticsearch - in Depth knowledge Logstash - Pipeline creation Spring boot (Java - How to integrate with Elastic search and do query) AWS (Hosting application in EC2, Lambda, Cloud watch) Able to understand different application Below skill are Good to have AWS - Step functions Python IBM ACE Unix script CI/CD pipelines and DevOps Benefits Competitive compensation and benefits package: Competitive salary and performance-based bonuses Comprehensive benefits package Career development and training opportunities Flexible work arrangements (remote and/or office-based) Dynamic and inclusive work culture within a globally renowned group Private Health Insurance Pension Plan Paid Time Off Training & Development Performance Bonus Note: Benefits differ based on employee level. About Capgemini Capgemini is a global leader in partnering with companies to transform and manage their business by harnessing the power of technology. The Group is guided everyday by its purpose of unleashing human energy through technology for an inclusive and sustainable future. It is a responsible and diverse organization of over 340,000 team members in more than 50 countries. With its strong 55-year heritage and deep industry expertise, Capgemini is trusted by its clients to address the entire breadth of their business needs, from strategy and design to operations, fueled by the fast evolving and innovative world of cloud, data, AI, connectivity, software, digital engineering and platforms. The Group €22.5 billion in revenues in 2023. https://www.capgemini.com/us-en/about-us/who-we-are/ Show more Show less
Posted 1 week ago
5.0 years
0 Lacs
Anupgarh, Rajasthan, India
Remote
Capital Markets Gateway (CMG) is a fintech firm that streamlines equity capital markets (ECM), connecting investors and underwriters. Launched in 2017, CMG provides integrated ECM data, analytics, and workflow efficiencies, enabling better decision-making for nearly 150 buy-side firms ($40T AUM) and 20 global investment banks (Goldman Sachs, J.P. Morgan, Barclays, etc.). As an SDET L2 you will play a pivotal role in enhancing our test infrastructure and tooling, creating new frameworks and systems that facilitate automated testing processes, leading to more efficient issue identification and robust solution delivery. You will work closely with product managers, designers, and software engineers to ensure our products are highly testable, contributing to the development and maintenance of testing frameworks and strategies. You will bring to the team industry best practices around functional and regression testing and ensure high-quality and high-velocity product launches. The Role Key Responsibilities Design, build, and maintain scalable test infrastructure, automation frameworks, and developer-facing tools to accelerate testing and improve product reliability. Collaborate closely with Product Managers, Developers, and Designers to ensure testability is built into features and system architecture from the start. Develop strategies for integrating automated tests deeply into CI/CD pipelines, enabling fast and reliable feedback loops across all stages of development. Create reusable tooling and libraries that allow engineering teams to easily write, run, and maintain automated tests for APIs, services, and full-stack applications. Proactively identify gaps in existing testing strategies and design new systems to improve test coverage, efficiency, and scalability across the platform. Analyze and debug complex failures in automated test systems, differentiating between test infrastructure issues and product defects. Champion a quality engineering culture across the organization, mentoring developers on writing effective tests, improving code testability, and leveraging automation tooling. Drive continuous improvement of QA automation tooling, test data management solutions, and environment reliability to support rapid product development at scale. Leverage Large Language Models (LLMs) to design and implement intelligent, AI-driven workflows into core processes, enabling smarter, faster, and more effective automation. Our Tech Stack Playwright, Jest, xUnit; Docker, Kubernetes, Helm; GitHub Actions, Harness, Terraform, GitOps; Microsoft Azure Cloud; Datadog, Grafana, OpenTelemetry; Postgres, Elasticsearch, Redis; ASP.NET REST and GraphQL back-ends; React with TypeScript front-ends; React Native mobile app. What We're Looking For English level - C1 or C2. Proven experience as a Software Development Engineer in Test (SDET 2) or in a similar role. 5+ years of experience in JavaScript/TypeScript development or test automation. Expert in automated testing and modern testing frameworks (e.g., Playwright, Jest, xUnit). Deep understanding of full-stack web application architecture (frontend to production). Structured, detail-oriented thinker with awareness of broader system goals. Quick learner, self-motivated, and comfortable working with minimal supervision. Strong analytical, problem-solving, and communication skills (written and verbal). Enjoys team collaboration, continuous learning, and building scalable solutions. Data-driven mindset with a preference for experimentation and iterative improvement. Interest or experience with integrating Large Language Models (LLMs) into development or testing workflows. What We Offer 2+ year contract. 15 business days of vacation. Tech courses and conferences. Top-of-the-line MacBook. Fully remote working environment. Flexible working hours. GETONBRD Job ID: 54120. Flexible hours: flexible schedule and freedom for attending family needs or personal errands. Computer provided: Capital Markets Gateway provides a computer for your work. Informal dress code: no dress code is enforced. Remote work policy: locally remote only; the position is 100% remote, but candidates must reside in Chile, Uruguay, Mexico, Peru, Colombia, El Salvador, Argentina, Bolivia, Honduras, Paraguay, Panama, Brazil or Ecuador.
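The stack centres on Playwright-driven end-to-end tests wired into CI/CD. The posting lists Playwright with TypeScript and Jest; the sketch below uses Playwright's Python API purely for illustration, and the URL, selectors and credentials are placeholders rather than anything from CMG's product:

```python
# Minimal sketch of a Playwright end-to-end check of the kind this role automates.
# The posting's stack is Playwright with TypeScript; the Python API is used here
# only for illustration. The URL and selectors are placeholders.
from playwright.sync_api import sync_playwright

def test_login_flow():
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto("https://example.com/login")
        # Fail fast if the login form never renders
        assert "Login" in page.title()
        page.fill("input[name='email']", "qa@example.com")
        page.fill("input[name='password']", "not-a-real-password")
        page.click("button[type='submit']")
        page.wait_for_url("**/dashboard")
        browser.close()

if __name__ == "__main__":
    test_login_flow()
```

In a pipeline such a test would run headlessly on every pull request, which is the kind of fast, reliable feedback loop the responsibilities describe.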
Posted 1 week ago
0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role And Responsibilities As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results. Preferred Education Master's Degree Required Technical And Professional Expertise Proof of Concept (POC) Development: Develop POCs to validate and showcase the feasibility and effectiveness of the proposed AI solutions. Help in showcasing the ability of Gen AI code assistants to refactor/rewrite and document code from one language to another. Document solution architectures, design decisions, implementation details, and lessons learned. Stay up to date with the latest trends and advancements in AI, foundation models, and large language models. Evaluate emerging technologies, tools, and frameworks to assess their potential impact on solution design and implementation. Preferred Technical And Professional Experience Experience and working knowledge in COBOL & Java would be preferred. Having experience in code generation, code matching & code translation leveraging LLM capabilities would be a big plus. Demonstrate a growth mindset to understand clients' business processes and challenges.
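The responsibilities include implementing and validating predictive models. As a hedged, minimal sketch of that train-and-validate loop (scikit-learn on a synthetic dataset is used as a stand-in; nothing here comes from IBM's tooling or client data):

```python
# Minimal sketch of the "implement and validate a predictive model" loop the
# posting describes, using scikit-learn on a synthetic dataset as a placeholder.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for cleansed, integrated client data
X, y = make_classification(n_samples=5_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = LogisticRegression(max_iter=1_000)
model.fit(X_train, y_train)

# Validate on held-out data before any batch or real-time deployment
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Hold-out ROC AUC: {auc:.3f}")
```

The same pattern (split, fit, score on held-out data) underlies the prescriptive and behaviour-prediction models mentioned above, regardless of which algorithm is swapped in.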
Posted 1 week ago
5.0 - 8.0 years
7 - 10 Lacs
Pune
Work from Office
Role Purpose The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations and developing technical capability within the Production Specialists. At least 3 years of experience in Java/J2EE, REST APIs, RDBMS. At least 2 years of work experience in ReactJS. Ready for an individual contributor role and should have done something similar in the last 6 months. Fair understanding of microservices and cloud (Azure preferred). Practical knowledge of object-oriented programming concepts and design patterns. Experience in implementation of microservices, service-oriented architecture and multi-tier application platforms. Good knowledge of JPA and SQL (preferably Oracle SQL). Experience working with RESTful web services. Hands-on experience in tracing applications in distributed/microservices environments with usage of modern tools (Grafana, Prometheus, Splunk, Zipkin, Elasticsearch, Kibana, Logstash or similar). Build people capability to ensure operational excellence and maintain superior customer service levels of the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target. Inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes and updates. Enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks. Deliver: performance parameters and measures: (1) Process - no. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT; (2) Team Management - productivity, efficiency, absenteeism; (3) Capability Development - triages completed, technical test performance. Mandatory Skills: .NET.
Posted 1 week ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role: Python, AWS, Terraform Required Technical Skill Set: Python Full Stack Developer Desired Experience Range: 06 - 08 yrs Notice Period: Immediate to 90 days only Location of Requirement: Hyderabad We are currently planning to hold a walk-in interview on 14th June 2025 (Saturday). Date: 14th June 2025 (Saturday) Venue: Tata Consultancy Services Synergy Park (Non-SEZ) Campus, C9X3+FH8, TCS Synergy Park, Indian Immunologicals Colony, Gachibowli, Hyderabad, Telangana 500032 Job Description: Primary Skill Frontend o 6+ years of overall experience with proficiency in React (2+ years), TypeScript (1+ year), React hooks (1+ year) o Experience with ESLint, CSS-in-JS styling (preferably Emotion), state management (preferably Redux), and JavaScript bundlers such as Webpack o Experience with integrating with RESTful APIs or other web services Backend o Expertise with Python (3+ years, preferably Python 3) o Proficiency with a Python web framework (2+ years, preferably Flask and FastAPI) o Experience with a Python linter (preferably flake8), graph databases (preferably Neo4j), a package manager (preferably pip), Elasticsearch, and Airflow o Experience with developing microservices, RESTful APIs or other web services o Experience with database design and management, including NoSQL/RDBMS tradeoffs
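The backend requirements combine Python with Elasticsearch. As a hedged sketch of one routine piece of that plumbing, bulk-indexing documents with the official Python client's helpers (cluster URL, index name and document shape are placeholders, not from the posting):

```python
# Sketch: bulk-indexing documents into Elasticsearch with the official Python
# client's helpers, the kind of backend plumbing this stack implies.
# Cluster URL, index name and document shape are placeholders.
from elasticsearch import Elasticsearch
from elasticsearch.helpers import bulk

es = Elasticsearch("http://localhost:9200")

docs = [
    {"sku": "A-100", "name": "Widget", "price": 19.99},
    {"sku": "B-200", "name": "Gadget", "price": 49.50},
]

# Each action names the target index, a stable document id, and the source body
actions = (
    {"_index": "products", "_id": d["sku"], "_source": d}
    for d in docs
)

# bulk() returns (number of successful actions, list of errors)
ok, errors = bulk(es, actions)
print(f"indexed {ok} docs, {len(errors)} errors")
```

Using a stable _id (the SKU here) keeps re-runs idempotent, which matters when the same feed is replayed from a pipeline such as Airflow.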
Posted 1 week ago
3.0 - 4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job purpose: Need to work as a Technology or Functional Consultant in FinCrime solutions modernisation and transformation projects. Should exhibit understanding of financial services during client discussions and be able to articulate client requirements into tech specs. Contribute as a team player in a team of consultants to deliver large technology programs. Work Experience Requirements Understand high-level business requirements and relate them to appropriate AML / FinCrime product capabilities. Define and validate customisation needs for AML products as per client requirements. Review client processes and workflows and make recommendations to the client to maximise benefits from the AML product. Show in-depth knowledge of best banking practices and AML product modules. Prior experience in one or more COTS products such as NetReveal, Norkom, Actimize, SAS AML VI/VIA, Fircosoft or Quantexa. Your client responsibilities: Need to work as a Technical Business Systems Analyst in one or more FinCrime projects. Interface and communicate with the onsite coordinators. Completion of assigned tasks on time and regular status reporting to the lead. Regular status reporting to the Manager and onsite coordinators. Interface with the customer representatives as and when needed. Willing to travel to customer locations on a need basis. Mandatory skills: Technical: Expert in the following NetReveal modules: Scenario Manager Configuration, Application Builder, Base Platform, Workflow Configurator, Services Manager, Batch Bridge, Scheduling Configuration, Command and Control, AML module; expert in Velocity templates. NetReveal Optimization module, multi-entity and multi-currency platform, cloud platform, REST API development using Java. CI/CD technologies (Bitbucket, Jenkins, Nexus, Serena). Container technologies such as Docker, Kubernetes. NetReveal v7.4 or above. Proficient in Oracle SQL, PL/SQL, WebSphere Application Server. Experience in Agile methodology. SQL and an understanding of big data technologies such as Spark, Hadoop, or Elasticsearch. Scripting/Programming: At least one programming/scripting language among Python, Java or Unix shell script. Experience in product migration and implementation, preferably having been part of at least one AML implementation. Act as the Subject Matter Expert (SME) and possess an excellent functional/operational knowledge of the activities performed by the various teams. Should possess a high-level understanding of infrastructure designs, data models and application/business architecture. Functional: Thorough knowledge of the AML/CTF transaction monitoring, KYC and sanctions processes. Thorough knowledge of transaction monitoring and scenarios. Should have developed or worked on one or more modules among KYC (know your customer), CDD (customer due diligence), EDD (enhanced due diligence), sanctions screening, PEP (politically exposed person) screening, adverse media screening, TM (transaction monitoring) and CM (case management). Thorough knowledge of case management workflows. Preferred Work Location: This position offers flexibility to work from any EY GDS office in India. Education And Experience - Mandatory: MBA/MCA/BE/BTech or equivalent with banking industry experience of 3 to 4 years. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
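The functional skills centre on transaction-monitoring scenarios, which at their core are rules over customer activity within a time window. As a purely conceptual, hedged sketch of such a rule outside any specific product (this is not NetReveal scenario configuration; field names, the threshold and the window are invented for illustration):

```python
# Conceptual sketch of a simple AML transaction-monitoring scenario: flag
# customers whose cash deposits within a 7-day window exceed a threshold.
# This is NOT NetReveal configuration; field names and thresholds are invented.
from collections import defaultdict
from datetime import datetime, timedelta

THRESHOLD = 10_000.00
WINDOW = timedelta(days=7)

transactions = [
    {"customer": "C001", "amount": 4_000.0, "type": "CASH", "ts": datetime(2025, 6, 1)},
    {"customer": "C001", "amount": 7_500.0, "type": "CASH", "ts": datetime(2025, 6, 4)},
    {"customer": "C002", "amount": 2_000.0, "type": "WIRE", "ts": datetime(2025, 6, 2)},
]

def flag_structuring(txns):
    """Return customers whose cash deposits breach THRESHOLD within WINDOW."""
    alerts = []
    by_customer = defaultdict(list)
    for t in txns:
        if t["type"] == "CASH":
            by_customer[t["customer"]].append(t)
    for cust, items in by_customer.items():
        items.sort(key=lambda t: t["ts"])
        for i, start in enumerate(items):
            window_total = sum(
                t["amount"] for t in items[i:] if t["ts"] - start["ts"] <= WINDOW
            )
            if window_total > THRESHOLD:
                alerts.append({"customer": cust, "total": window_total})
                break
    return alerts

print(flag_structuring(transactions))  # -> [{'customer': 'C001', 'total': 11500.0}]
```

In a COTS platform the same idea is expressed as a configured scenario with tunable thresholds and segments; the sketch only shows the underlying logic a business systems analyst would be validating.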
Posted 1 week ago
5.0 years
0 Lacs
Bengaluru East, Karnataka, India
On-site
Carmatec is looking for passionate DevOps Engineers to be a part of our InstaCarma team. Not only will you have the chance to make your mark as an established DevOps Engineer, but you will also get to work and interact with seasoned professionals deeply committed to revolutionizing the Cloud scenario. Job Responsibilities Work on infrastructure provisioning/configuration management tools. We use Packer, Terraform and Chef. Develop automation tools/scripts. We use Bash/Python/Ruby. Responsible for continuous integration and artifact management. We use Jenkins and Artifactory. Set up automated deployment pipelines for microservices running as Docker containers. Set up monitoring, alerting and metrics scraping for Java/Scala/Play applications using Prometheus and Graylog2 integrated with PagerDuty and HipChat for alerting, reporting and monitoring. Will be doing on-call production support and related incident management, reporting and postmortems. Create runbooks and wikis for incidents, troubleshooting performed etc. Be a proactive member of your team by sharing knowledge. Resource scheduling and orchestration using Mesos/Marathon. Work closely with development teams to ensure that platforms are designed with operability in mind. Function well in a fast-paced, rapidly changing environment. Required Skills A basic understanding of DevOps tools and automation frameworks. Outstanding organization, documentation, and communication skills. Must be skilled in Linux system administration (Ubuntu/CentOS). Knowledge of AWS is a must (EC2, EBS, S3, Route 53, CloudFront, SG, IAM, RDS etc.). Strong foundation in Docker internals and troubleshooting. Should know at least one configuration management tool - Chef/Ansible/Puppet. Good to have experience in at least one scripting language - Bash/Python/Ruby. Experience in at least one NoSQL database system is a plus - Elasticsearch/MongoDB/Redis/Cassandra. Experience in a CI tool like Jenkins is preferred. Good understanding of how a 3-tier architecture works. Basic knowledge of revision control tools like Git/Subversion etc. Should have experience working with monitoring tools like Nagios, New Relic etc. Should be proficient in log management using tools like rsyslog, Logstash etc. Working knowledge of the following items - cron, HAProxy/Nginx, LVM, MySQL, BIND (DNS), iptables. Experience in Atlassian tools - Jira, HipChat, Confluence will be a plus. Experience: 5+ years Location: Bangalore If the above description is of interest to you, please send your updated resume to teamhr@carmatec.com
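The role calls for automation scripts in Bash, Python or Ruby against AWS. As a hedged example of the sort of housekeeping script this involves (the region, the "Owner" tag policy and the use of boto3 are assumptions; credentials come from the usual AWS environment or config chain):

```python
# Sketch of the kind of AWS housekeeping script this role writes: report EC2
# instances that are missing a required "Owner" tag. Region and the tag policy
# are assumptions; credentials come from the standard AWS credential chain.
import boto3

REQUIRED_TAG = "Owner"

def untagged_instances(region="us-east-1"):
    ec2 = boto3.client("ec2", region_name=region)
    missing = []
    # Paginate so accounts with many instances are handled correctly
    paginator = ec2.get_paginator("describe_instances")
    for page in paginator.paginate():
        for reservation in page["Reservations"]:
            for inst in reservation["Instances"]:
                tags = {t["Key"] for t in inst.get("Tags", [])}
                if REQUIRED_TAG not in tags:
                    missing.append(inst["InstanceId"])
    return missing

if __name__ == "__main__":
    for instance_id in untagged_instances():
        print(f"{instance_id} is missing the {REQUIRED_TAG} tag")
```

A script like this would typically run from cron or a Jenkins job and feed its output into the alerting channels the posting mentions.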
Posted 1 week ago
3.0 - 4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job purpose: Need to work as a Technology or Functional Consultant in FinCrime solutions modernisation and transformation projects. Should exhibit understanding of financial services during client discussions and be able to articulate client requirements into tech specs. Contribute as a team player in a team of consultants to deliver large technology programs. Work Experience Requirements Understand high-level business requirements and relate them to appropriate AML / FinCrime product capabilities. Define and validate customisation needs for AML products as per client requirements. Review client processes and workflows and make recommendations to the client to maximise benefits from the AML product. Show in-depth knowledge of best banking practices and AML product modules. Prior experience in one or more COTS products such as NetReveal, Norkom, Actimize, SAS AML VI/VIA, Fircosoft or Quantexa. Your client responsibilities: Need to work as a Technical Business Systems Analyst in one or more FinCrime projects. Interface and communicate with the onsite coordinators. Completion of assigned tasks on time and regular status reporting to the lead. Regular status reporting to the Manager and onsite coordinators. Interface with the customer representatives as and when needed. Willing to travel to customer locations on a need basis. Mandatory skills: Technical: Expert in the following NetReveal modules: Scenario Manager Configuration, Application Builder, Base Platform, Workflow Configurator, Services Manager, Batch Bridge, Scheduling Configuration, Command and Control, AML module; expert in Velocity templates. NetReveal Optimization module, multi-entity and multi-currency platform, cloud platform, REST API development using Java. CI/CD technologies (Bitbucket, Jenkins, Nexus, Serena). Container technologies such as Docker, Kubernetes. NetReveal v7.4 or above. Proficient in Oracle SQL, PL/SQL, WebSphere Application Server. Experience in Agile methodology. SQL and an understanding of big data technologies such as Spark, Hadoop, or Elasticsearch. Scripting/Programming: At least one programming/scripting language among Python, Java or Unix shell script. Experience in product migration and implementation, preferably having been part of at least one AML implementation. Act as the Subject Matter Expert (SME) and possess an excellent functional/operational knowledge of the activities performed by the various teams. Should possess a high-level understanding of infrastructure designs, data models and application/business architecture. Functional: Thorough knowledge of the AML/CTF transaction monitoring, KYC and sanctions processes. Thorough knowledge of transaction monitoring and scenarios. Should have developed or worked on one or more modules among KYC (know your customer), CDD (customer due diligence), EDD (enhanced due diligence), sanctions screening, PEP (politically exposed person) screening, adverse media screening, TM (transaction monitoring) and CM (case management). Thorough knowledge of case management workflows. Preferred Work Location: This position offers flexibility to work from any EY GDS office in India. Education And Experience - Mandatory: MBA/MCA/BE/BTech or equivalent with banking industry experience of 3 to 4 years. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
3.0 - 4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job purpose: Need to work as a Technology or Functional Consultant in FinCrime solutions modernisation and transformation projects. Should exhibit understanding of financial services during client discussions and be able to articulate client requirements into tech specs. Contribute as a team player in a team of consultants to deliver large technology programs. Work Experience Requirements Understand high-level business requirements and relate them to appropriate AML / FinCrime product capabilities. Define and validate customisation needs for AML products as per client requirements. Review client processes and workflows and make recommendations to the client to maximise benefits from the AML product. Show in-depth knowledge of best banking practices and AML product modules. Prior experience in one or more COTS products such as NetReveal, Norkom, Actimize, SAS AML VI/VIA, Fircosoft or Quantexa. Your client responsibilities: Need to work as a Technical Business Systems Analyst in one or more FinCrime projects. Interface and communicate with the onsite coordinators. Completion of assigned tasks on time and regular status reporting to the lead. Regular status reporting to the Manager and onsite coordinators. Interface with the customer representatives as and when needed. Willing to travel to customer locations on a need basis. Mandatory skills: Technical: Expert in the following NetReveal modules: Scenario Manager Configuration, Application Builder, Base Platform, Workflow Configurator, Services Manager, Batch Bridge, Scheduling Configuration, Command and Control, AML module; expert in Velocity templates. NetReveal Optimization module, multi-entity and multi-currency platform, cloud platform, REST API development using Java. CI/CD technologies (Bitbucket, Jenkins, Nexus, Serena). Container technologies such as Docker, Kubernetes. NetReveal v7.4 or above. Proficient in Oracle SQL, PL/SQL, WebSphere Application Server. Experience in Agile methodology. SQL and an understanding of big data technologies such as Spark, Hadoop, or Elasticsearch. Scripting/Programming: At least one programming/scripting language among Python, Java or Unix shell script. Experience in product migration and implementation, preferably having been part of at least one AML implementation. Act as the Subject Matter Expert (SME) and possess an excellent functional/operational knowledge of the activities performed by the various teams. Should possess a high-level understanding of infrastructure designs, data models and application/business architecture. Functional: Thorough knowledge of the AML/CTF transaction monitoring, KYC and sanctions processes. Thorough knowledge of transaction monitoring and scenarios. Should have developed or worked on one or more modules among KYC (know your customer), CDD (customer due diligence), EDD (enhanced due diligence), sanctions screening, PEP (politically exposed person) screening, adverse media screening, TM (transaction monitoring) and CM (case management). Thorough knowledge of case management workflows. Preferred Work Location: This position offers flexibility to work from any EY GDS office in India. Education And Experience - Mandatory: MBA/MCA/BE/BTech or equivalent with banking industry experience of 3 to 4 years. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
Elasticsearch is a powerful search and analytics engine used by businesses worldwide to manage and analyze their data efficiently. In India, the demand for Elasticsearch professionals is on the rise, with many companies seeking skilled individuals to work on various projects involving data management, search capabilities, and more.
Cities such as Bengaluru, Hyderabad, Pune, Chennai and Gurugram, which feature prominently in the listings above, are known for their thriving tech industries and have a high demand for Elasticsearch professionals.
The salary range for Elasticsearch professionals in India varies based on experience and skill level. Entry-level positions can expect to earn around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15 lakhs per annum.
A typical career path in Elasticsearch may involve starting as a Junior Developer, moving on to become a Senior Developer, and eventually progressing to a Tech Lead position. With experience and expertise, one can also explore roles such as Solution Architect or Data Engineer.
Apart from Elasticsearch, professionals in this field are often expected to have knowledge of the following skills:
- Apache Lucene
- Java programming
- Data modeling
- RESTful APIs
- Database management systems
As you explore job opportunities in Elasticsearch in India, remember to continuously enhance your skills and knowledge in this field. Prepare thoroughly for interviews and showcase your expertise confidently. With the right mindset and preparation, you can excel in your Elasticsearch career and contribute significantly to the tech industry in India. Good luck!