
1767 Querying Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

10.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Chennai, India

The Opportunity

Anthology delivers education and technology solutions so that students can reach their full potential and learning institutions can thrive. Our mission is to empower educators and institutions with meaningful innovation that’s simple and intelligent, inspiring student success and institutional growth. The Power of Together is built on having a diverse and inclusive workforce. We are committed to making diversity, inclusion, and belonging a foundational part of our hiring practices and who we are as a company. For more information about Anthology and our career opportunities, please visit www.anthology.com.

This role focuses on Anthology Ally, a revolutionary product that makes digital course content more accessible. As the accessibility of digital course content becomes increasingly important worldwide, institutions must address long-standing and often overbearing challenges. Anthology’s Ally engineering team is responsible for developing industry-leading tools to improve accessibility through inclusivity, sustainability, and automation for all students.

As a Staff Software Engineer on our team, you will design, develop, and maintain features of the Ally product. You’ll also communicate and partner cross-functionally with teams in product and software development. In this role, you will work on an ethical product, using Scala for the backend and JavaScript for the frontend. We run our applications in the AWS cloud and use Git for version control. You’ll work on a distributed team, collaborating with colleagues around the globe.

The Candidate

Required skills/qualifications:
- 10-12 years of relevant experience
- Good abstract and critical thinking skills
- Familiarity with the full-cycle development process
- Experience developing, building, testing, deploying, and operating applications
- Experience working with cloud technologies
- Awareness of how distributed systems work
- Strong command of backend programming languages (Java, JavaScript, Python, etc.)
- Familiarity with relational database design and querying concepts
- Willingness to break things and make them work again
- Knowledge of and experience with CI/CD principles and tools (Jenkins or Azure Pipelines)
- Fluency in written and spoken English

Preferred skills/qualifications:
- Experience leading a team
- Command-line scripting knowledge in a Linux-like environment
- Knowledge of cloud computing (AWS)
- Experience with IntelliJ IDEA (or another IDE)
- Experience with a version control system (Git)
- Experience with a bug-tracking system (JIRA)
- Experience with a continuous integration system and continuous delivery practices
- Functional programming experience, such as Haskell or Scala
- Experience with front-end development or interest in learning (Angular)

This job description is not designed to contain a comprehensive listing of activities, duties, or responsibilities that are required. Nothing in this job description restricts management's right to assign or reassign duties and responsibilities at any time. Anthology is an equal employment opportunity/affirmative action employer and considers qualified applicants for employment without regard to race, gender, age, color, religion, national origin, marital status, disability, sexual orientation, gender identity/expression, protected military/veteran status, or any other legally protected factor.

Posted 17 hours ago

Apply

3.0 years

8 - 9 Lacs

Hyderābād

On-site

You’re ready to gain the skills and experience needed to grow within your role and advance your career — and we have the perfect software engineering opportunity for you.

As a Data Engineer III at JPMorgan Chase within the Consumer & Community Banking Technology Team, you are part of an agile team that works to enhance, design, and deliver the software components of the firm’s state-of-the-art technology products in a secure, stable, and scalable way. As an emerging member of a software engineering team, you execute software solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role.

Job responsibilities
- Executes standard software solutions, design, development, and technical troubleshooting
- Writes secure and high-quality code using the syntax of at least one programming language with limited guidance
- Designs, develops, codes, and troubleshoots with consideration of upstream and downstream systems and technical implications
- Applies knowledge of tools within the Software Development Life Cycle toolchain to improve the value realized by automation
- Applies technical troubleshooting to break down solutions and solve technical problems of basic complexity
- Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems
- Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
- Designs and develops data pipelines end to end using PySpark, Java, Python, and AWS services (a minimal sketch follows this listing)
- Utilizes container orchestration services, including Kubernetes, and a variety of AWS tools and services
- Learns and applies system processes, methodologies, and skills for the development of secure, stable code and systems
- Adds to team culture of diversity, equity, inclusion, and respect

Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 3+ years of applied experience
- Hands-on practical experience in system design, application development, testing, and operational stability
- Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages
- Hands-on practical experience in developing Spark-based frameworks for end-to-end ETL, ELT, and reporting solutions using key components like Spark and Spark Streaming
- Proficiency in coding in one or more languages: Core Java, Python, and PySpark
- Experience with relational and data warehouse databases, and cloud implementation experience with AWS, including:
  - AWS data services: proficiency in Lake Formation, Glue ETL (or EMR), S3, Glue Catalog, Athena, Airflow (or Lambda + Step Functions + EventBridge), ECS clusters and ECS apps
  - Data de/serialization: expertise in at least two of the formats Parquet, Iceberg, Avro, JSON
  - AWS data security: good understanding of security concepts such as Lake Formation, IAM, service roles, encryption, KMS, and Secrets Manager
- Proficiency in automation and continuous delivery methods

Preferred qualifications, capabilities, and skills
- Experience with Snowflake
- Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security
- In-depth knowledge of the financial services industry and its IT systems
- Practical cloud-native experience, preferably with AWS
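To make the end-to-end pipeline work above concrete, here is a minimal PySpark sketch of an S3-to-S3 ETL job of the kind the listing describes. The bucket paths and column names are hypothetical placeholders, not anything from the posting.

```python
# Minimal PySpark ETL sketch: S3 JSON in, partitioned Parquet out.
# Bucket names, paths, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw JSON events from S3.
raw = spark.read.json("s3://example-raw-bucket/orders/2025/06/")

# Transform: keep valid rows, normalize types, derive a partition column.
orders = (
    raw.filter(F.col("order_id").isNotNull())
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("created_at"))
)

# Load: write Parquet partitioned by date for Athena/Glue-style querying.
(orders.write.mode("overwrite")
       .partitionBy("order_date")
       .parquet("s3://example-curated-bucket/orders/"))

spark.stop()
```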

Posted 18 hours ago

Apply

2.0 years

0 Lacs

Hyderābād

On-site

You’re ready to gain the skills and experience needed to grow within your role and advance your career — and we have the perfect software engineering opportunity for you.

As a Software Engineer II at JPMorgan Chase within Corporate Technology, you are part of an agile team that works to enhance, design, and deliver the software components of the firm’s state-of-the-art technology products in a secure, stable, and scalable way. As an emerging member of a software engineering team, you execute software solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role.

Job responsibilities
- Executes standard software solutions, design, development, and technical troubleshooting
- Builds pipelines in Spark and tunes Spark queries (a tuning sketch follows this listing)
- Writes secure and high-quality code using the syntax of at least one programming language with limited guidance
- Designs, develops, codes, and troubleshoots with consideration of upstream and downstream systems and technical implications
- Applies knowledge of tools within the Software Development Life Cycle toolchain to improve the value realized by automation
- Applies technical troubleshooting to break down solutions and solve technical problems of basic complexity
- Gathers, analyzes, and draws conclusions from large, diverse data sets to identify problems and contribute to decision-making in service of secure, stable application development
- Learns and applies system processes, methodologies, and skills for the development of secure, stable code and systems
- Stays up to date with the latest advancements in GenAI and LLM technologies and incorporates them into our data engineering practices

Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 2+ years of applied experience
- Hands-on practical experience in system design, application development, testing, and operational stability
- Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages
- Background with machine learning frameworks and big data technologies such as Hadoop
- Strong experience in programming languages such as Java or Python
- Experience with the Python machine learning library ecosystem (Pandas, NumPy, etc.)
- Experience with cloud technologies such as AWS or Azure
- Experience working with databases such as Cassandra, MongoDB, or Teradata
- Experience across the whole Software Development Life Cycle
- Exposure to agile methodologies such as CI/CD, Application Resiliency, and Security
- Experience with Generative AI and Large Language Models, and experience integrating these technologies into data workflows

Preferred qualifications, capabilities, and skills
- Familiarity with modern front-end technologies
- Exposure to cloud technologies
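As a rough illustration of the "building pipelines in Spark, tuning Spark queries" responsibility, this sketch shows two common tuning moves: a broadcast join and a keyed repartition. The table paths and column names are invented.

```python
# Illustrative Spark tuning sketch: broadcast a small dimension table to
# avoid a shuffle join, and repartition before a wide aggregation.
# Table paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("query-tuning-demo").getOrCreate()

facts = spark.read.parquet("s3://example-bucket/transactions/")
dims = spark.read.parquet("s3://example-bucket/merchants/")  # small table

# Broadcast hint: ships the small table to every executor, replacing a
# shuffle-based sort-merge join with a map-side hash join.
joined = facts.join(F.broadcast(dims), on="merchant_id", how="left")

# Repartition on the grouping key so the aggregation shuffles evenly.
daily = (joined.repartition("merchant_id")
               .groupBy("merchant_id", "txn_date")
               .agg(F.sum("amount").alias("total_amount")))

daily.explain()  # inspect the physical plan to confirm the broadcast join
```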

Posted 18 hours ago

Apply

3.0 years

1 - 9 Lacs

Hyderābād

On-site

We have an exciting and rewarding opportunity for you to take your software engineering career to the next level.

As a Software Engineer III at JPMorgan Chase within the Employee Platforms team, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives.

Job responsibilities
- Executes software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems
- Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development
- Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems
- Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
- Contributes to software engineering communities of practice and events that explore new and emerging technologies
- Adds to team culture of diversity, equity, inclusion, and respect

Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 3+ years of applied experience with ServiceNow applications
- Hands-on practical experience in system design, application development, testing, and operational stability
- Proficiency in coding in one or more languages
- Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages
- Overall knowledge of the Software Development Life Cycle
- Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security
- Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.)

Preferred qualifications, capabilities, and skills
- Familiarity with modern front-end technologies
- Exposure to cloud technologies

Posted 18 hours ago

Apply

8.0 years

28 - 30 Lacs

Hyderābād

On-site

Experience: 8+ years
Budget: 30 LPA (including variable pay)
Location: Bangalore, Hyderabad, Chennai (hybrid)
Shift timing: 2 PM to 11 PM

ETL Development Lead (8+ years)

Responsibilities and required experience:
- Leading and mentoring a team of Talend ETL developers
- Providing technical direction and guidance on ETL/data integration development to the team
- Designing complex data integration solutions using Talend and AWS
- Collaborating with stakeholders to define project scope, timelines, and deliverables
- Contributing to project planning, risk assessment, and mitigation strategies
- Ensuring adherence to project timelines and quality standards
- Strong understanding of ETL/ELT concepts, data warehousing principles, and database technologies
- Design, develop, and implement ETL (Extract, Transform, Load) processes using Talend Studio and other Talend components
- Build and maintain robust and scalable data integration solutions to move and transform data between various source and target systems (e.g., databases, data warehouses, cloud applications, APIs, flat files)
- Develop and optimize Talend jobs, workflows, and data mappings to ensure high performance and data quality
- Troubleshoot and resolve issues related to Talend jobs, data pipelines, and integration processes
- Collaborate with data analysts, data engineers, and other stakeholders to understand data requirements and translate them into technical solutions
- Perform unit testing and participate in system integration testing of ETL processes
- Monitor and maintain Talend environments, including job scheduling and performance tuning
- Document technical specifications, data flow diagrams, and ETL processes
- Stay up to date with the latest Talend features, best practices, and industry trends
- Participate in code reviews and contribute to the establishment of development standards
- Proficiency in using Talend Studio, Talend Administration Center/TMC, and other Talend components
- Experience working with various data sources and targets, including relational databases (e.g., Oracle, SQL Server, MySQL, PostgreSQL), NoSQL databases, the AWS cloud platform, APIs (REST, SOAP), and flat files (CSV, TXT)
- Strong SQL skills for data querying and manipulation
- Experience with data profiling, data quality checks, and error handling within ETL processes (a small sketch follows this listing)
- Familiarity with job scheduling tools and monitoring frameworks
- Excellent problem-solving, analytical, and communication skills
- Ability to work independently and collaboratively within a team environment
- Basic understanding of AWS services, e.g., EC2, S3, EFS, EBS, IAM, AWS roles, CloudWatch Logs, VPC, security groups, Route 53, network ACLs, Amazon Redshift, Amazon RDS, Amazon Aurora, Amazon DynamoDB
- Understanding of AWS data integration services, e.g., Glue, Data Pipeline, Amazon Athena, AWS Lake Formation, AppFlow, Step Functions

Preferred qualifications:
- Experience leading and mentoring a team of 8+ Talend ETL developers
- Experience working with US healthcare customers
- Bachelor's degree in Computer Science, Information Technology, or a related field
- Talend certifications (e.g., Talend Certified Developer), AWS Certified Cloud Practitioner / Data Engineer Associate
- Experience with AWS data and infrastructure services
- A basic understanding of Terraform and GitLab is required
- Experience with scripting languages such as Python or shell scripting
- Experience with agile development methodologies
- Understanding of big data technologies (e.g., Hadoop, Spark) and the Talend Big Data platform

Job Type: Full-time
Pay: ₹2,800,000.00 - ₹3,000,000.00 per year
Schedule: Day shift
Work Location: In person
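For illustration, here is a small Python sketch of the post-load data quality checks mentioned above (null, duplicate, and range checks). The connection string, table, and column names are hypothetical, and a real Talend deployment would typically express these checks in its own components rather than a script.

```python
# Sketch of a post-load data quality check of the kind an ETL pipeline
# might run after a Talend job completes. Connection string, table, and
# column names are hypothetical placeholders.
import sqlalchemy as sa

engine = sa.create_engine("postgresql://user:pass@example-host/warehouse")

checks = {
    "null_order_ids": "SELECT COUNT(*) FROM stg_orders WHERE order_id IS NULL",
    "duplicate_order_ids": """
        SELECT COALESCE(SUM(cnt - 1), 0)
        FROM (SELECT order_id, COUNT(*) AS cnt
              FROM stg_orders GROUP BY order_id) d
        WHERE cnt > 1
    """,
    "negative_amounts": "SELECT COUNT(*) FROM stg_orders WHERE amount < 0",
}

with engine.connect() as conn:
    failures = {name: conn.execute(sa.text(sql)).scalar()
                for name, sql in checks.items()}

bad = {k: v for k, v in failures.items() if v}
if bad:
    # In a real pipeline this would fail the job or raise an alert.
    raise ValueError(f"Data quality checks failed: {bad}")
```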

Posted 18 hours ago

Apply

0 years

2 - 9 Lacs

Hyderābād

On-site

Job description

Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Consultant Specialist. In this role, you will:
- Design, build, and maintain scalable, reliable, and secure infrastructure using tools like Terraform and Ansible
- Automate deployment pipelines and infrastructure provisioning
- Develop and maintain Continuous Integration/Continuous Deployment (CI/CD) pipelines using tools like Jenkins
- Ensure smooth and efficient code deployment processes
- Manage and optimize Google Cloud infrastructure
- Implement cost optimization strategies and monitor cloud resource usage
- Work closely with development teams to ensure smooth integration of new features and services
- Troubleshoot and resolve infrastructure and application issues
- Implement and maintain secrets management tools like HashiCorp Vault
- Apply hands-on experience with version control systems like Git

Requirements

To be successful in this role, you should meet the following requirements:
- Proficiency in scripting languages like Python, PySpark, Shell/Bash, and YAML
- Experience with containerization tools like Docker and orchestration platforms like Kubernetes
- Proven experience with GCP, specifically with BigQuery
- Strong working knowledge of Google Cloud, Python, BigQuery, and Kubeflow
- Strong SQL skills and experience querying large datasets in BigQuery (a sketch follows this listing)
- Good working knowledge of GitHub and DevOps principles and automation tools (Jenkins, Ansible, Nexus, CI/CD)
- Working experience with Terraform to set up infrastructure
- Agile development principles (Scrum, Jira, Confluence)

Essential skills (non-technical):
- Excellent communication skills and the ability to explain complex ideas
- Ability to work as part of a team located across multiple regions/time zones
- Willingness to adapt, learn new things, and take ownership of tasks
- Strong collaboration skills and experience working in diverse, global teams
- Excellent problem-solving skills and the ability to work independently and as part of a team

You’ll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by HSBC Software Development India
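A minimal sketch of querying a large BigQuery dataset from Python, one of the core skills this role lists. The project, dataset, table, and column names are invented.

```python
# Minimal sketch: querying BigQuery from Python with the official client.
# Project, dataset, table, and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

query = """
    SELECT account_id, SUM(amount) AS total_amount
    FROM `example-project.analytics.transactions`
    WHERE txn_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY account_id
    ORDER BY total_amount DESC
    LIMIT 100
"""

# Synchronous query; result() blocks until the job finishes.
for row in client.query(query).result():
    print(row.account_id, row.total_amount)
```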

Posted 18 hours ago

Apply

1.0 - 2.0 years

0 Lacs

Hyderābād

On-site

General information
- Country: India
- State: Telangana
- City: Hyderabad
- Job ID: 44779
- Department: Development
- Experience Level: Executive
- Employment Status: Full-time
- Workplace Type: On-site

Description & Requirements

As an Associate Machine Learning Engineer / Data Scientist, you will contribute to the advancement of research projects in artificial intelligence and machine learning. Your responsibilities will encompass areas such as large language models, image processing, and sentiment analysis. You will work collaboratively with development partners to incorporate AI research into products such as Digital Assistant and Document Capture.

Essential duties:
- Model development: Assist in designing and implementing AI/ML models. Contribute to building innovative models and integrating them into existing systems.
- Fine-tuning models: Support the fine-tuning of pre-trained models for specific tasks and domains. Ensure models are optimized for accuracy and efficiency.
- Data clean-up: Conduct data analysis and pre-processing to ensure the quality and relevance of training datasets. Implement data cleaning techniques.
- Natural language processing (NLP): Assist in the development of NLP tasks like sentiment analysis, text classification, and language understanding (a toy sketch follows this listing).
- Large language models (LLMs): Work with state-of-the-art LLMs and explore their applications in various domains. Support continuous improvement and adaptation of LLMs.
- Research and innovation: Stay updated with advancements in AI/ML, NLP, and LLMs. Experiment with new approaches to solve complex problems and improve methodologies.
- Deployment and monitoring: Collaborate with DevOps teams to deploy AI/ML models. Implement monitoring mechanisms to track model performance.
- Documentation: Maintain clear documentation of AI/ML processes, models, and improvements to ensure knowledge sharing and collaboration.

Basic qualifications:
- Experience: 1-2 years of total industry experience, with a minimum of 6 months in ML and data science.
- Skills: Problem-solving and analytical skills; good oral and written communication skills.
- Educational background: Bachelor’s or Master’s degree in Computer Science, Electrical Engineering, Mathematics, Statistics, or a related field. Specialization or coursework in AI, ML, Statistics & Probability, DL, Computer Vision, Signal Processing, or NLP/NLU is a plus.
- Programming and tools: Proficiency in programming languages commonly used in AI and ML, such as Python or R, and querying languages like SQL. Experience with cloud computing infrastructures like AWS SageMaker or Azure ML for implementing ML solutions is highly preferred. Experience with relevant libraries and frameworks, such as scikit-learn, Keras, TensorFlow, PyTorch, or NLTK, is a plus.

This role offers a great opportunity to work with cutting-edge AI/ML technologies and contribute to innovative projects in a collaborative environment.

About Infor

Infor is a global leader in business cloud software products for companies in industry-specific markets. Infor builds complete industry suites in the cloud and efficiently deploys technology that puts the user experience first, leverages data science, and integrates easily into existing systems. Over 60,000 organizations worldwide rely on Infor to help overcome market disruptions and achieve business-wide digital transformation. For more information, visit www.infor.com.

Our Values

At Infor, we strive for an environment that is founded on a business philosophy called Principle Based Management™ (PBM™) and eight Guiding Principles: integrity, stewardship & compliance, transformation, principled entrepreneurship, knowledge, humility, respect, self-actualization. Increasing diversity is important to reflect our markets, customers, partners, and the communities we serve now and in the future. We have a relentless commitment to a culture based on PBM. Informed by the principles that allow a free and open society to flourish, PBM™ prepares individuals to innovate, improve, and transform while fostering a healthy, growing organization that creates long-term value for its clients and supporters and fulfillment for its employees.

Infor is an Equal Opportunity Employer. We are committed to creating a diverse and inclusive work environment. Infor does not discriminate against candidates or employees because of their sex, race, gender identity, disability, age, sexual orientation, religion, national origin, veteran status, or any other protected status under the law. If you require accommodation or assistance at any time during the application or selection processes, please submit a request by following the directions located in the FAQ section at the bottom of the infor.com/about/careers webpage.
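As a toy illustration of the sentiment-analysis work described above, here is a self-contained scikit-learn sketch. The four-example dataset is invented purely to show the shape of the pipeline.

```python
# Toy sentiment-classification sketch with scikit-learn (TF-IDF + logistic
# regression). The tiny dataset is invented purely for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "great product, works perfectly",
    "terrible support, very disappointed",
    "love the new interface",
    "worst purchase I have made",
]
labels = [1, 0, 1, 0]  # 1 = positive, 0 = negative

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["the interface is great"]))          # expect [1]
print(model.predict_proba(["very disappointed again"]))   # class probabilities
```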

Posted 18 hours ago

Apply

8.0 years

28 - 30 Lacs

Pune

On-site

This Pune listing repeats the ETL Development Lead posting shown above for Hyderabad, with an identical description, pay (₹2,800,000.00 - ₹3,000,000.00 per year), schedule, and work location; see that listing for the full details.

Posted 18 hours ago

Apply

2.0 years

4 - 8 Lacs

Mumbai

On-site

Job Information
- Industry: IT Services
- Salary: None
- Date Opened: 06/16/2025
- Job Type: Software Engineering
- Work Experience: 2-4 years
- City: Mumbai
- State/Province: Maharashtra
- Country: India
- Zip/Postal Code: 400080

Job Description

What we want: The candidate should have hands-on experience in Angular and Java/Core Java, and should also be familiar with HTML5, CSS3, Bootstrap, JavaScript, jQuery, TypeScript, Spring Boot, MySQL, and Oracle.

Who we are: Vertoz (NSEI: VERTOZ), an AI-powered MadTech and CloudTech platform offering digital advertising, marketing and monetization (MadTech) and digital identity and cloud infrastructure (CloudTech), caters to businesses, digital marketers, advertising agencies, digital publishers, cloud providers, and technology companies. For more details, please visit our website.

What you will do:
- Apply a strong understanding of the Java programming language, including object-oriented design principles, data structures, and algorithms.
- Use deep knowledge and hands-on experience with the Angular framework, including Angular 4+ versions, to develop robust and scalable frontend applications.
- Work on both frontend and backend development tasks, developing RESTful APIs and implementing frontend-backend communication.
- Write clean, maintainable, and efficient code in JavaScript and TypeScript.
- Build responsive and visually appealing user interfaces with HTML5 and CSS3.
- Work with relational databases like MySQL and Oracle, including database design, querying, and optimization.
- Use Git for version control, including branching, merging, and pull-request workflows.
- Debug issues, optimize performance, and implement efficient solutions across the full stack.

Requirements
- Strong hands-on experience in Angular 4+, Spring Boot, MySQL, Core Java, and Hibernate.
- Good understanding of HTML5, CSS3, Bootstrap, JavaScript, jQuery, and TypeScript.
- Excellent problem-solving and design skills.

Benefits
- No dress codes
- Flexible working hours
- 5-day work week
- 24 annual leaves
- International presence
- Celebrations and team outings

Posted 18 hours ago

Apply

2.0 - 4.0 years

0 - 0 Lacs

India

On-site

Job Description: MIS Executive, ShreeRam Textiles

Job Title: MIS Executive
Location: Dadar East (Mumbai)
Experience: 2-4 years
Job Type: Full-time

Job Summary: We are looking for a detail-oriented and analytical MIS Executive to manage our data systems, generate reports, and support data-driven decision-making processes. The ideal candidate will be proficient in Excel, databases, and reporting tools, and able to interpret large volumes of data effectively.

Key Responsibilities:
- Design and maintain daily, weekly, and monthly MIS reports.
- Automate repetitive tasks and reports using VBA (macros) in Excel to improve efficiency (a pandas analogue is sketched after this listing).
- Develop custom Excel tools and dashboards for data analysis and visualization.
- Extract, clean, and transform data from multiple sources (databases, spreadsheets, etc.).
- Support cross-functional teams by generating insights and analytical summaries.
- Ensure data accuracy, consistency, and timeliness in reporting processes.
- Maintain and troubleshoot existing VBA scripts and provide solutions to enhance them.
- Prepare management reports and KPI dashboards to assist decision-making.
- Assist in audits and compliance by maintaining proper data records and backups.

Skills:
- Advanced MS Excel skills, including pivot tables, VLOOKUP/XLOOKUP, Power Query, and VBA (macro programming).
- Strong experience with VBA scripting for automating Excel tasks and developing custom forms.
- Good understanding of SQL and data querying from relational databases.
- Experience with Power BI, Tableau, or similar data visualization tools is a plus.
- Strong analytical mindset and attention to detail.
- Ability to work independently and in a team with cross-functional stakeholders.
- Effective communication skills, both written and verbal.

Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, Statistics, Business Administration, or a related field.
- 2-4 years of experience in an MIS or Data Analyst role, with hands-on VBA programming experience.

Job Type: Full-time
Pay: ₹20,000.00 - ₹25,000.00 per month
Schedule: Day shift
Work Location: In person
Expected Start Date: 20/06/2025
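For illustration only, here is a pandas analogue of the pivot-table MIS report described above; the role itself centers on Excel and VBA, and the column names and toy data here are invented.

```python
# Python/pandas analogue of an Excel pivot-table MIS report. Column names
# and the toy data are invented; the role itself uses Excel and VBA.
import pandas as pd

sales = pd.DataFrame({
    "date": pd.to_datetime(["2025-06-01", "2025-06-01", "2025-06-02"]),
    "branch": ["Dadar", "Thane", "Dadar"],
    "product": ["Cotton", "Silk", "Cotton"],
    "amount": [12000, 18500, 9400],
})

# Branch-by-product summary, the shape a pivot-table report takes.
report = sales.pivot_table(
    index="branch",
    columns="product",
    values="amount",
    aggfunc="sum",
    fill_value=0,
    margins=True,  # adds row/column totals like an Excel grand total
)

report.to_excel("mis_report.xlsx")  # hand off to Excel consumers
print(report)
```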

Posted 18 hours ago

Apply

5.0 years

0 Lacs

Mumbai

On-site

Job Description for a Credit Card Transaction Monitoring Manager

Reports to: Credit Card Policy Head
Location: Mumbai

Job Summary: We are seeking an experienced Credit Card Transaction Monitoring individual (M5-AVP) to join our Credit Risk Management team. The successful candidate will be responsible for monitoring credit card transaction data, identifying potential fraud or high-risk transactions with higher bad rates, and implementing rules to prevent such transactions and reduce future financial losses from them. This role requires a blend of analytical skills, attention to detail, and knowledge of fraud detection technologies and methodologies. The ideal candidate will have experience in data analysis, risk analytics, and fraud prevention. They should be proficient in SAS, SQL, and Python for data querying and analysis, and have a strong understanding of fraud risk management in the financial services industry.

Key Responsibilities:
- Monitoring transactions: Oversee credit card transactions to detect and prevent fraudulent activities and high-risk transactions on various MCCs such as rental, wallets, and fuel. This includes analysing transaction patterns and identifying any suspicious behaviour.
- Rule creation: Create and implement rules within the FALCON system to help identify and stop such transactions; test and refine rules to ensure accuracy and effectiveness. This requires a deep understanding of fraud trends and the ability to translate this knowledge into effective fraud prevention strategies (a simplified sketch follows this listing).
- Identify customers who are making non-retail transactions and using credit card funds for other than specified retail purposes, and build rules to stop or reduce transactions on them by blocking those MIDs, initiating blocking of credit card accounts, and reducing credit limits.
- Use FICO® Falcon® Fraud Manager, a leading solution for real-time transaction protection across various channels.
- Collaborate with business and financial users to define, test, and deploy these rules based on the organization’s requirements and strategies.
- Create and maintain reports for tracking fraud trends and the performance of fraud prevention strategies.
- Query operational databases and data warehouses using SQL and other tools to extract data for analysis.
- Stay up to date on current fraud best practices and emerging threats.

Requirements:
- 5+ years of experience in credit card transaction monitoring or a related field
- 2+ years of experience in a mid-leadership role
- Strong knowledge of Falcon or similar fraud detection systems
- Excellent analytical and problem-solving skills
- Effective communication and leadership skills
- Ability to work in a fast-paced environment
- Experience with machine learning or data science
- Certification in fraud prevention or a related field is an added advantage
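A simplified sketch of the rule-based flagging this role performs. The MCC codes, thresholds, and column names are invented, and production rules would be authored inside a system like FICO Falcon rather than in a standalone script.

```python
# Illustrative rule-based transaction flagging in pandas. MCC codes,
# thresholds, and column names are invented; production rules would live
# inside a system like FICO Falcon, not in a script like this.
import pandas as pd

txns = pd.DataFrame({
    "card_id": ["C1", "C2", "C1", "C3"],
    "mcc": ["6540", "5541", "6540", "5411"],   # e.g., wallet loads, fuel
    "amount": [49000, 3200, 52000, 800],
})

HIGH_RISK_MCCS = {"6540", "6538"}   # hypothetical wallet-funding MCCs
AMOUNT_THRESHOLD = 50000            # hypothetical per-transaction limit

txns["flag_high_risk_mcc"] = txns["mcc"].isin(HIGH_RISK_MCCS)
txns["flag_large_amount"] = txns["amount"] > AMOUNT_THRESHOLD

# A card repeatedly hitting high-risk MCCs is a candidate for review.
repeat_counts = txns.groupby("card_id")["flag_high_risk_mcc"].transform("sum")
txns["flag_repeat_high_risk"] = repeat_counts >= 2

flagged = txns[txns.filter(like="flag_").any(axis=1)]
print(flagged)
```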

Posted 18 hours ago

Apply

8.0 years

28 - 30 Lacs

Chennai

On-site

This Chennai listing repeats the ETL Development Lead posting shown above for Hyderabad, with an identical description, pay (₹2,800,000.00 - ₹3,000,000.00 per year), schedule, and work location; see that listing for the full details.

Posted 18 hours ago

Apply

1.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Roles and Responsibilities:
- Coding and abstracting information from provider patient medical records and hospital ancillary records per facility and/or state requirements.
- Assigning appropriate billing codes based on medical documentation using CPT-4 and/or ICD-10 coding guidelines.
- Querying physicians when code assignments are not straightforward or documentation in the record is inadequate, ambiguous, or unclear for coding purposes.
- Monitoring the unbilled accounts report for outstanding and/or un-coded encounters to reduce accounts receivable days.
- Following strict coding guidelines within established productivity standards.
- Addressing billing/coding-related inquiries for providers as needed (U.S. only).
- Attending meetings and in-service training to enhance coding knowledge, compliance skills, and maintenance of credentials.
- Maintaining patient confidentiality.

Eligibility:
- Candidate should have a Life Science/BPT/Pharmacy/Nursing background.
- Candidate should have knowledge of anatomy and physiology.
- A medical transcription background is preferred.

Requirements of the role include:
- 1+ years of experience working with CPT and ICD-10 coding principles, governmental regulations, protocols, and third-party requirements regarding medical billing.
- 1+ years of experience using a computer with Windows PC applications that required you to use a keyboard, navigate screens, and learn new software tools.
- Ability to work regularly scheduled shifts from Monday-Friday, 7:30 am to 5:30 pm IST.
- Should be specialized in E/M or surgery coding.
- Permanent work from office for the Chennai location.

Posted 18 hours ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Dear Job Seekers,

Greetings from Voice Bay! We are currently hiring for a Machine Learning Engineer. If you are interested, please submit your application. Please find the JD below for your consideration:

Work Location: Hyderabad
Experience: 4-10 years
Work Mode: 5 days work from office (mandatory)

Key Responsibilities
- Design, develop, and implement end-to-end machine learning models, from initial data exploration and feature engineering to model deployment and monitoring in production environments.
- Build and optimize data pipelines for both structured and unstructured datasets, focusing on advanced data blending, transformation, and cleansing techniques to ensure data quality and readiness for modeling.
- Create, manage, and query complex databases, leveraging various data storage solutions to efficiently extract, transform, and load data for machine learning workflows.
- Collaborate closely with data scientists, software engineers, and product managers to translate business requirements into effective, scalable, and maintainable ML solutions.
- Implement and maintain robust MLOps practices, including version control, model monitoring, logging, and performance evaluation to ensure model reliability and drive continuous improvement.
- Research and experiment with new machine learning techniques, tools, and technologies to enhance our predictive capabilities and operational efficiency.

Required Skills & Experience
- 5+ years of hands-on experience in building, training, and deploying machine learning models in a professional, production-oriented setting.
- Demonstrable experience with database creation and advanced querying (e.g., SQL, NoSQL), with a strong understanding of data warehousing concepts.
- Proven expertise in data blending, transformation, and feature engineering, adept at integrating and harmonizing both structured (e.g., relational databases, CSVs) and unstructured (e.g., text, logs, images) data.
- Strong practical experience with cloud platforms for machine learning development and deployment; significant experience with Google Cloud Platform (GCP) services (e.g., Vertex AI, BigQuery, Dataflow) is highly desirable.
- Proficiency in programming languages commonly used in data science (Python preferred; R also valued).
- Solid understanding of various machine learning algorithms (e.g., regression, classification, clustering, dimensionality reduction) and experience with advanced techniques like deep learning, natural language processing (NLP), or computer vision.
- Experience with machine learning libraries and frameworks (e.g., scikit-learn, TensorFlow, PyTorch).
- Familiarity with MLOps tools and practices, including model versioning, monitoring, A/B testing, and continuous integration/continuous deployment (CI/CD) pipelines.
- Experience with containerization technologies like Docker and orchestration tools like Kubernetes for deploying ML models as REST APIs (a minimal serving sketch follows this listing).
- Proficiency with version control systems (e.g., Git, GitHub/GitLab) for collaborative development.

Educational Background
- Bachelor's or Master's degree in Statistics, Mathematics, Computer Science, Engineering, Data Science, or a closely related quantitative field.
- Alternatively, a significant certification in Data Science, Machine Learning, or Cloud AI combined with relevant practical experience will be considered.
- A compelling combination of relevant education and professional experience will also be valued.

Interested candidates can share their resume to the email IDs below:
tarunrai@voicebaysolutions.in
hr@voicebaysolutions.in
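As a sketch of the "deploying ML models as REST APIs" requirement above, here is a minimal FastAPI service; the model file and feature layout are hypothetical.

```python
# Minimal sketch of serving a trained model as a REST API with FastAPI,
# the pattern behind "deploying ML models as REST APIs" above. The model
# file and feature layout are hypothetical.
import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")  # e.g., a fitted scikit-learn pipeline

class Features(BaseModel):
    values: list[float]  # flat feature vector for a single example

@app.post("/predict")
def predict(features: Features):
    # scikit-learn expects a 2-D array: one row per example.
    prediction = model.predict([features.values])
    return {"prediction": prediction.tolist()}

# Run locally with: uvicorn main:app --reload
# Containerizing this app with Docker and scheduling it on Kubernetes is
# the usual production step, as the listing describes.
```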

Posted 18 hours ago

Apply

3.0 years

0 Lacs

Bengaluru South, Karnataka, India

On-site


At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you’ll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

The Financial Data Sourcing Analyst is a critical role in the Financial Reporting Quality Assurance (FRQA) organization within Corporate Controllership, in support of the Regulatory Reporting Automation program. This role is responsible for driving the definition, gathering, exploration, and analysis of finance data and its data sources to deliver end-to-end automation for our regulatory reporting platforms. The Data Sourcing Architect team oversees the data mapping, profiling, and source-to-target (S2T) analysis for new data requirements of our regulatory reporting automation and systems. It also leads the coordination of the close partnership between Product Owners, Report/Business Owners, Technology, and Data Testing teams to ensure data and product features of the data solution are put into production with the highest degree of confidence in the data flow and data system requirements, or to manage the deactivation/decommission of data sets of financial data systems. The Data Sourcing Analyst must be a highly analytical, well-organized, data-driven individual with strong time management skills and a high degree of technical skill, confident in presenting highly complex data concepts and technicalities in simple terms and pragmatically to the team and associated stakeholders.

How will you make an impact in this role?

Key Responsibilities:
- Collaborate with business stakeholders to understand the data needs for regulatory reporting; translate the business requirements into technical specifications for data solutions.
- Develop and implement data management strategies for the Regulatory Reporting Data Domain (RRDD); design and maintain RRDD data models to support regulatory reporting and ensure they are scalable and flexible.
- Partner with business, upstream, and Technology teams to implement data solutions for regulatory reporting; monitor and optimize data systems for performance and efficiency.
- Collaborate with the data governance team to define standards and ensure data quality and consistency in RRDD.
- Perform data sourcing gap analysis and profiling of attributes, and complete source-to-target mapping documents for regulatory report automation.
- Conduct data analysis on existing processes and datasets to understand and support Point of Arrival (POA) process designs, including migration of RRDD tables from on-premises to BigQuery.
- Determine portfolios, data elements, and grain of data required for designing processes to review data scenarios, providing clarification on how to report on these scenarios in alignment with regulatory guidance.
- Support development of executable data querying algorithms using tools such as SQL that enable validation of data conformity and expected data system functionality, including replication of deterministic logic and filtering criteria for master and reference data to be used across operational and regulatory reporting processes (a sketch follows this listing).
- Identify business requirements and develop functional requirement documentation for new data sources and attributes, including design, prototyping, testing, and implementation of report owner and regulatory requirements.
- Document and understand core components of solution architecture, including data patterns, data-related capabilities, and standardization and conformance of disparate datasets.

Minimum Qualifications
- 3+ years of work experience in data sourcing and analysis.
- 3+ years of work experience in banking/regulatory/financial/technology services.
- Product management, data migration, and data analytics working experience is a plus.
- Experienced in Agile delivery concepts or other project management methodologies.
- Experience in data analytics, data profiling, source-to-target (S2T) data mapping, and analyzing System of Record (SOR) data and its data quality (DQ) rules to identify data issues in SORs.
- Strong SQL/NoSQL and data analysis experience; able to write and understand complex SQL and HiveQL, with hands-on Oracle SQL and Hive experience.
- Experience with MS Excel, Power Query, and other analytical tools, e.g., Tableau.
- Experience with Python, Google BigQuery, and PL/SQL.
- Critical thinking and complex problem-solving skills (data application).
- Excellent written and verbal communication, with the ability to explain highly complex concepts and processes in simple terms and pragmatically.
- A self-starter and proactive team player with excellent relationship-building and collaboration skills, facilitating a network of strong relationships across the organization.

Preferred Qualifications
- Knowledge of US regulatory reports (Y9C, Y14, Y15, 2052a, among others) and a general understanding of banking products.
- Working exposure to data analysis in financial data domains to support regulatory and analytical requirements for large-scale banking/financial organizations.
- Experience with Google Cloud capabilities.

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
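A small sketch of the kind of executable source-to-target (S2T) conformity check the responsibilities describe, comparing row counts and a column checksum between a source and a target table. The dialect, connection string, tables, and columns are hypothetical.

```python
# Sketch of a source-to-target (S2T) conformity check: compare row counts
# and a column checksum between a source table and its target. Connection
# string, table, and column names are hypothetical.
import sqlalchemy as sa

engine = sa.create_engine("oracle+oracledb://user:pass@example-host/findb")

RECON_SQL = """
    SELECT 'source' AS side, COUNT(*) AS row_cnt, SUM(balance) AS balance_sum
    FROM sor_accounts
    UNION ALL
    SELECT 'target', COUNT(*), SUM(balance)
    FROM rrdd_accounts
"""

with engine.connect() as conn:
    rows = {r.side: r for r in conn.execute(sa.text(RECON_SQL))}

src, tgt = rows["source"], rows["target"]
if (src.row_cnt, src.balance_sum) != (tgt.row_cnt, tgt.balance_sum):
    raise ValueError(
        f"S2T mismatch: source={src.row_cnt}/{src.balance_sum}, "
        f"target={tgt.row_cnt}/{tgt.balance_sum}"
    )
print("Source and target agree on row count and balance checksum.")
```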

Posted 19 hours ago

Apply

0.0 - 2.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site


About Oration AI

Oration AI builds fully AI-powered contact centers that transform customer service. Our AI agents handle voice, email, and chat interactions with natural conversations that solve real problems. We're a team of 7 innovators with a track record of successful AI ventures in Europe and India. We're working with cutting-edge technology that's already serving major enterprises.

This is an incredibly exciting time to join Oration AI. You'll be:
- Working directly with the latest AI innovations and technologies
- Learning from a team of experienced founders with successful exits
- Contributing to a product that's already serving major enterprises
- Growing your career in a fast-paced, innovative environment
- Making a real impact in shaping the future of customer service
- Getting hands-on experience with technologies that will define the next decade

Role Overview

We are looking for a technically skilled Support Engineer who is passionate about customer success and has a strong foundation in modern web technologies. This role is perfect for someone who enjoys solving complex technical problems while maintaining excellent communication with customers.

Key Responsibilities
- Provide technical support to enterprise customers using our AI contact center platform.
- Help customers make the best use of the product and solve their unique problems.
- Work closely with customers and engineers to understand their requirements, provide appropriate solutions, and improve features.

Required Skills
- Strong understanding of prompt engineering, with hands-on experience writing prompts for various LLMs and use cases (an example structure follows this listing).
- Basic knowledge of AI/ML concepts.
- Good understanding of web technologies (Next.js, React) and how most software systems are built.

Bonus Skills
- Knowledge of API integrations and web services
- Basic knowledge of Git and version control
- Proficiency in database querying (PostgreSQL, SQL)

Qualifications
- 0-2 years of experience in technical support or software development
- No degree requirements
- Experience with customer-facing roles is a plus

Perks
- Above-market pay
- Company-provided top-end MacBook
- Responsibility-driven work culture
- No strict office timings or leave policy

Location
- Indore, India
- No work-from-home policy
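As an illustration of the prompt-writing skill this role asks for, here is one common way to structure a support prompt as a chat-style message list. The template text, product name, and field names are invented; real prompts would be tuned per LLM and per customer workflow.

```python
# Illustrative structure for a customer-support prompt of the kind this
# role writes. The template, fields, and policy text are invented.
SYSTEM_PROMPT = """\
You are a support agent for {product_name}.
Follow these rules:
1. Answer only from the provided account context; if it is missing, say so.
2. Keep replies under 120 words and end with a concrete next step.
3. Never reveal internal tooling or other customers' data.
"""

def build_messages(account_context: str, customer_message: str) -> list[dict]:
    """Assemble a chat-style message list accepted by most LLM APIs."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT.format(product_name="ExampleCX")},
        {"role": "user", "content": f"Account context:\n{account_context}"},
        {"role": "user", "content": customer_message},
    ]

messages = build_messages(
    account_context="Plan: Pro. Open ticket: voice agent mispronounces names.",
    customer_message="Why does the voice bot keep saying my name wrong?",
)
print(messages[0]["content"])
```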

Posted 19 hours ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Responsibilities:

We are seeking an experienced Data Scientist to lead the development of a data science program. You will work closely with various stakeholders to derive deep industry knowledge across the paper, water, leather, and performance chemical industries. You will help develop a data strategy for the company, including collection of the right data, creation of the data science project portfolio, partnering with external providers, and augmenting capabilities with additional internal hires.

A large part of the job is communicating and developing relationships with key stakeholders and subject matter experts to tee up proof-of-concept (PoC) projects that demonstrate how data science can solve old problems in unique and novel ways. You will not have a large internal team to rely on, at least initially, so individual expertise, breadth of data science knowledge, and the ability to partner with external companies will be essential for success.

In addition to the pure data science problems, you will be working closely with a multi-disciplinary team consisting of sensor scientists, software engineers, network engineers, mechanical/electrical engineers, and chemical engineers in the development and deployment of IoT solutions.

Basic Qualifications:
- Bachelor’s degree in a quantitative field such as Data Science, Statistics, Applied Mathematics, Physics, Engineering, or Computer Science
- 5+ years of relevant working experience in an analytical role involving data extraction, analysis, and visualization, and expertise in the following areas:
  - One or more programming languages: R, Python, MATLAB, JMP, Minitab, Java, C++, Scala
  - Key libraries such as scikit-learn, XGBoost, glmnet, dplyr, ggplot, RShiny
  - Data mining algorithms, including supervised and unsupervised machine learning techniques in areas such as gradient boosting, decision trees, multivariate regression, logistic regression, neural networks, random forests, SVM, naive Bayes, time series, and optimization
  - The Microsoft IoT/data science toolkit: Azure Machine Learning, Data Lake, Data Lake Analytics, Workbench, IoT Hub, Stream Analytics, Cosmos DB, Time Series Insights, Power BI
  - Data querying languages: SQL, Hadoop/Hive
- A demonstrated record of success with a verifiable portfolio of problems tackled

Preferred Qualifications:
- Master’s or PhD degree in a quantitative field such as Data Science, Statistics, Applied Mathematics, Physics, Engineering, or Computer Science
- Experience in the specialty chemicals sector or a similar industry
- Background in engineering, especially chemical engineering
- Experience starting up a data science program
- Experience working with global stakeholders
- Experience working in a start-up environment, preferably in an IoT company
- Knowledge of quantitative modeling tools and statistical analysis

Personality Traits:
- A strong business focus, ownership, and inner self-drive to develop data science solutions for real-world customers with tangible impact
- Ability to collaborate effectively with multi-disciplinary and passionate team members
- Ability to communicate with a diverse set of stakeholders
- Strong planning and organization skills, with the ability to manage multiple complex projects
- A life-long learner who constantly updates skills

Posted 19 hours ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


About Zoca

Zoca is a fast-growing local business marketing platform that helps salons, spas, and wellness businesses attract, convert, and retain more clients through AI-powered tools. Backed by real-time data and automation, we simplify growth for beauty professionals.

Responsibilities

Payment Monitoring: Actively track and flag paid accounts with missed payments, identifying churn-risk users early. Create cadences for follow-ups and preventive actions to reduce churn.
Funnel & Account Behavior Analysis: Analyze behavior patterns of users at risk of churn (reduced activity, support interactions) and surface insights for account managers.
Operationalize Follow-ups: Build lightweight, automated trackers in Google Sheets/Excel that require minimal input but drive high visibility and accountability across the CX and AM teams.
Ticketing System Management: Use Linear to manage customer tickets, tag and prioritize issues, assign responsibilities, and maintain visibility on client-side blockers.
Client Problem Solving: Work closely with Account Managers to resolve client escalations, track SLA compliance, and unblock issues that impact renewals or satisfaction.
Automation & Process Ownership: Identify repetitive manual tasks and automate them using Google Sheets functions, conditional logic, or third-party tools. Design SOPs for recurring workflows.
Weekly CX Ops Reporting: Maintain and update weekly CX dashboards (missed payments, account health, resolution metrics) and share key insights with leadership.

Requirements

Strong skills in Google Sheets / Excel (VLOOKUPs, conditional formatting, filters, dashboards).
Working knowledge of SQL for querying client/product/ticket-related data (a minimal sketch follows this listing).
Experience or familiarity with ticketing tools, ideally Linear (or equivalent tools like Zendesk, Intercom, Jira).
Analytical mindset with a proactive problem-solving attitude.
Excellent organizational and communication skills, especially when working across teams.
High attention to detail and ownership over internal tooling/processes.

Nice to Have

Prior experience in churn analysis, NPS follow-ups, or CX playbook implementation.
Knowledge of basic automation tools (Zapier, Make, etc.).
Experience working with SaaS or B2B accounts and handling post-sales operations.
Exposure to CRM tools and customer lifecycle tracking.

Why Join Zoca?

Build something meaningful: Help shape a platform that's transforming how local service businesses grow online.
Own the narrative: You won't just execute — you'll help define the brand, category, and customer journey from the ground up.
Join early, grow fast: Be part of a high-impact core team in a fast-moving, early-stage startup.
Collaborate closely: Work side by side with product, growth, and leadership in our Bangalore office.
Led by vision: Founded by Ashish Verma — a growth-focused SaaS entrepreneur with a strong track record in building marketing-led products for small businesses.

Skills: ticketing tools, CRM tools, organizational skills, SQL, attention to detail, Excel, MS Excel, problem-solving, communication, analytical mindset, data analysis, Google Sheets, task automation tools, communication skills
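To ground the SQL requirement, here is a minimal sketch of the kind of churn-risk query this role describes, run against an in-memory SQLite database. The table and column names are hypothetical, not Zoca's actual schema.

```python
# Minimal churn-risk query sketch using Python's built-in sqlite3.
# Table and column names are hypothetical, not a real production schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts (id INTEGER PRIMARY KEY, name TEXT, plan TEXT);
    CREATE TABLE payments (account_id INTEGER, due_date TEXT, paid INTEGER);
    INSERT INTO accounts VALUES (1, 'Glow Salon', 'pro'), (2, 'Zen Spa', 'basic');
    INSERT INTO payments VALUES
        (1, '2024-05-01', 0), (1, '2024-06-01', 0), (2, '2024-06-01', 1);
""")

-- = None  # (placeholder removed)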

Posted 20 hours ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Overview:

We are looking for a motivated and detail-oriented Business Analyst Intern to join our team. As a Business Analyst Intern, you will work closely with cross-functional teams, dive deep into data, uncover insights, and help drive business decisions. If you are analytical, love problem-solving, and have a strong command of SQL, we'd love to meet you!

Key Responsibilities:

Analyze large datasets to extract actionable insights and support business growth initiatives.
Write complex SQL queries to retrieve and manipulate data.
Work closely with different teams to identify opportunities and optimize processes.
Assist in building dashboards, reports, and performance trackers.
Conduct competitor analysis, market research, and user behavior studies.
Support ongoing projects and ad-hoc data requests.

Requirements:

Available for an in-office internship.
Tier 1 college (preferably IIT, BITS, IIM, or NIT).
Proficiency in SQL (must-have) for querying and handling large datasets.
Ability to synthesize complex data into actionable business insights (a minimal sketch follows this listing).
Excellent problem-solving and critical-thinking skills.
Strong communication skills to present data-driven recommendations.
Prior internship experience in business analytics or data roles is a plus.
Note: the FRND team operates six days a week, with the 1st and 3rd Saturdays of each month as working days.

About FRND

FRND is redefining the way people connect by building a social platform that's not just engaging but also safe, inclusive, and fun. We're a rapidly growing startup with a bold mission: to transform online interactions into meaningful relationships.

Why FRND?

Impact at Scale: Be part of the next big wave in social connection, shaping experiences for millions across India, LATAM, and MENA.
Rewarding Journey: Competitive compensation, equity options, and growth that parallels FRND's explosive success.
Learn with the Best: Collaborate directly with founders and industry pioneers, supported by stellar investors like Krafton, India Quotient, and Elevation Capital.
Freedom to Thrive: Enjoy an unlimited leave policy and unparalleled ownership of your work.
Product-Centric Mindset: Work in a company where products take center stage, solving unique challenges with innovative solutions.
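For a flavor of turning raw numbers into an insight, here is a minimal pandas funnel-summary sketch. The stage names and counts are invented purely for illustration.

```python
# Minimal funnel-analysis sketch with pandas: step-by-step conversion rate
# per funnel stage. Stage names and counts are invented for illustration.
import pandas as pd

funnel = pd.DataFrame({
    "stage": ["visited", "signed_up", "activated", "retained_d7"],
    "users": [10000, 3200, 1400, 600],
})

# Conversion relative to the previous stage, as a percentage.
funnel["step_conversion_pct"] = (
    funnel["users"] / funnel["users"].shift(1) * 100
).round(1)

print(funnel.to_string(index=False))
```

A table like this immediately surfaces where the funnel leaks (here, the visited-to-signed-up step), which is the kind of actionable insight the role asks interns to produce.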

Posted 20 hours ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Company Description

Zenardy is a technology consulting company specializing in implementing NetSuite industry solutions for various verticals, including Hi-tech, Food & Beverage, Manufacturing & Distribution, and General Business.

Role Description

This is a full-time on-site role for a NetSuite Analytics and Warehouse Engineer located in Chennai. The engineer will be responsible for tasks related to data engineering, data modeling, Extract Transform Load (ETL), data warehousing, and data analytics.

Qualifications

Data engineering and data modeling skills
Experience in Extract Transform Load (ETL) processes
Data warehousing and data analytics skills
Strong problem-solving and analytical skills
Proficiency in SQL and other data querying languages
Experience with NetSuite or similar ERP systems is a plus
Bachelor's degree in Computer Science, Information Technology, or a related field

Posted 20 hours ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


About Zoca

Zoca is a fast-growing local business marketing platform that helps salons, spas, and wellness businesses attract, convert, and retain more clients through AI-powered tools. Backed by real-time data and automation, we simplify growth for beauty professionals.

Responsibilities

Own the Sales Journey Cadence: Manage and streamline the journey from free trial initiation to paid conversion. Ensure timely follow-ups, track lifecycle stages, and minimize drop-offs.
Analyze Conversion Patterns: Examine trial and paid user data to identify key trends, behavioral patterns, and conversion blockers. Suggest optimizations to improve conversion rates.
Lead Funnel & Revenue Insights: Monitor and report on trial-to-paid funnel performance. Create actionable dashboards and summaries for sales and leadership teams.
Automation & Process Improvements: Develop lightweight automations in Sheets or CRM to reduce manual follow-ups and increase visibility. Utilize formulas, filters, conditional alerts, and integrations whenever possible.
Profile Audits: Regularly analyze accounts and provide feedback for the sales team based on user behavior, segment, geography, or other filters.
Collaboration with Intern Teams: Drive accountability and coordination among intern sales teams. Set cadences, provide tracking tools, and follow up on actions for high-priority leads.
Documentation & Attention to Detail: Maintain clear documentation of SOPs, cadences, and changes. Ensure clean data hygiene in Sheets or CRM systems for accurate reporting.
Revenue-Focused Operations: Gear every process and initiative towards increasing paid conversions and improving revenue efficiency. Think like a growth operator.

Requirements

Strong proficiency with Google Sheets/Excel and SQL (e.g., formulas, pivots, filters, dashboards).
Analytical mindset with the ability to turn data into actionable insights.
Excellent verbal and written communication skills.
Self-starter, proactive, and comfortable working with minimal supervision.
Proven ability to coordinate with internal stakeholders (sales, interns, etc.).
Attention to detail and ownership over process quality.
Experience or interest in sales operations, funnel analysis, or SaaS sales is a strong plus.

Nice to Have

Experience with CRM tools (e.g., HubSpot, Salesforce, Zoho) or task automation tools (e.g., Zapier).
Previous exposure to startup sales processes or SDR teams.
Familiarity with SQL or no-code tools for basic data querying or workflow automation.

Why Join Zoca?

Build something meaningful: Help shape a platform that's transforming how local service businesses grow online.
Own the narrative: You won't just execute — you'll help define the brand, category, and customer journey from the ground up.
Join early, grow fast: Be part of a high-impact core team in a fast-moving, early-stage startup.
Collaborate closely: Work side by side with product, growth, and leadership in our Bangalore office.
Led by vision: Founded by Ashish Verma — a growth-focused SaaS entrepreneur with a strong track record in building marketing-led products for small businesses.

Skills: dashboards, documentation, automation, CRM tools, SQL, Excel, MS Excel, communication, data analysis, Google Sheets, task automation tools

Posted 21 hours ago

Apply

5.0 years

0 Lacs

Vijayawada, Andhra Pradesh, India

On-site


Company Profile:

BONbLOC is a 5-year-old, fast-growing, "Great Place to Work"-certified software and services company with a growing team of 200+ professionals working across various cities in India and the US. Our software product group builds SaaS solutions to solve large-scale supply chain data collection and analysis problems using Blockchain, Data Science, and IoT technologies. Our services group provides dedicated offshore/onsite support to select large customers in their IT modernization efforts, working on technologies such as Mainframe, AS400, Cognos, Oracle, .NET, Angular, Java, Tableau, Xamarin, Android, etc.

On the software side, we go to market with our SaaS products built on blockchain, IoT, and AI; we help customers monitor and track their supply chain flow with our software. On the services side, we go to market with our 'Digital and Modern' platform, where we use a range of technologies from timeless traditional to JOOG (just out of git) to help customers with their modernization initiatives. We implement and support standard ERP and WMS packages, build custom web and mobile applications, help customers modernize their mainframe and AS400 systems, and deliver large-scale data warehousing, generative-AI-based applications, cyber-security, cloud adoption, and similar projects.

Our Mission: We will build simple, scalable solutions using Blockchain, IoT, and AI technologies that enable our customers to realize unprecedented business value year after year.

Our Vision: We will become an advanced information technology company powered by happy, intellectual, and extraordinarily capable people.

Integrity: We will be honest and transparent in our conduct as professional individuals, groups, and teams.
Collaboration: We will respect and value our teammates and will always place team success over individual success.
Innovation: We will act in the knowledge that only our continuous innovation can drive superior execution.
Excellence: We believe that our delivery quality drives customer success, which in turn drives our company's success.

Roles and Responsibilities:

Capacity planning, creating databases, and modifying the database structure.
Creating, managing, and monitoring high-availability (HA) systems.
Designing schema, access patterns, and locking strategy; SQL development and tuning.
Setting up, operating, and scaling a relational database in the cloud.
Monitoring the database, performance metrics, response times, and request rates.
Securing database privileged credentials and controlling user access to databases.
Planning backup and recovery strategies, data migration, and patching.
Generating needed ad hoc reports by querying the database (a minimal sketch follows this listing).
Auditing the database log files, troubleshooting DB errors, and contacting vendors for technical support.

Academic Qualifications and Experience:

Basic Qualification: B.E/B.Tech in IT/Computers/Computer Science or a Master's in Computer Applications from a recognized university or institution.
Experience: Minimum 3 years of experience in the relevant database administration domain.
Thorough understanding of Microsoft SQL Server and other database systems.
Expert knowledge of database modeling and design.
Knowledge of web-related technologies such as XML, Java, TCP/IP, web servers, firewalls, and so on.
Experience in DB backup & recovery strategies and DR planning.
Strong documentation/reporting skills.
Capable of resolving critical issues in a time-sensitive manner.

Work Location: Vijayawada
Employment Type: Full-Time
Experience: 3-5 years
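To illustrate the ad hoc reporting duty above, here is a minimal sketch of a T-SQL report query run against Microsoft SQL Server via pyodbc. The connection string, table, and column names are placeholders, not an actual BONbLOC schema.

```python
# Minimal ad hoc reporting sketch against Microsoft SQL Server via pyodbc
# ("pip install pyodbc"). Connection string, table, and column names are
# placeholders; substitute your server's actual details.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=your-server;DATABASE=your-db;Trusted_Connection=yes;"
)

cursor = conn.cursor()
# Top customers by revenue over the last 30 days (hypothetical report).
cursor.execute("""
    SELECT TOP 10 customer_id, COUNT(*) AS orders, SUM(amount) AS total
    FROM dbo.orders
    WHERE order_date >= DATEADD(day, -30, GETDATE())
    GROUP BY customer_id
    ORDER BY total DESC
""")

for row in cursor.fetchall():
    print(row.customer_id, row.orders, row.total)

conn.close()
```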

Posted 21 hours ago

Apply

6.0 - 9.0 years

0 Lacs

Kolkata, West Bengal, India

On-site


At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Position Name: Data Engineer
Position Level: Senior

Position Details

EY's GDS Assurance Digital team's mission is to develop, implement and integrate technology solutions that better serve our audit clients and engagement teams. As a member of EY's core Assurance practice, you'll develop deep Audit-related technical knowledge and outstanding database, data analytics and programming skills.

Ever-increasing regulations require audit departments to gather, organize and analyse more data than ever before. Often the data necessary to satisfy these increasingly complex regulations must be collected from a variety of systems and departments throughout an organization. Effectively and efficiently handling the variety and volume of data is often extremely challenging and time consuming for a company.

EY's GDS Assurance Digital team members work side-by-side with the firm's partners, clients and audit technical subject matter experts to develop and incorporate technology solutions that enhance value-add, improve efficiencies and enable our clients with disruptive and market-leading tools supporting Assurance. GDS Assurance Digital provides solution architecture, application development, testing and maintenance support to the global Assurance service line, both on a proactive basis and in response to specific requests.

EY is currently seeking a Big Data Developer to join the GDS Assurance Digital practice in Bangalore, India, to work on various Microsoft technology-based projects for customers across the globe.

Qualifications and Requirements (including experience, skills and additional qualifications)

A Bachelor's degree (BE/BTech/MCA & MBA) in Computer Science, Engineering, Information Systems Management, Accounting, Finance or a related field, with adequate industry experience.
BE/BTech/MCA with sound industry experience of 6 to 9 years.

Technical skills requirements:

Experience with SQL and NoSQL databases such as HBase, Cassandra, or MongoDB
Good knowledge of Big Data querying tools, such as Pig and Hive (a minimal sketch follows this listing)
ETL implementation experience with a tool such as Alteryx or Azure Data Factory
Experience with NiFi is good to have
Experience with at least one reporting tool such as Power BI, Tableau, or Spotfire is a must

Analytical/Decision-Making Responsibilities:

An ability to quickly understand complex concepts and use technology to support data modeling, analysis, visualization or process automation
Selects appropriately from applicable standards, methods, tools and applications and uses them accordingly
Ability to work within a multi-disciplinary team structure, but also independently
Demonstrates an analytical and systematic approach to problem solving
Communicates fluently, orally and in writing, and can present complex technical information to both technical and non-technical audiences
Able to plan, schedule and monitor work activities to meet time and quality targets
Able to rapidly absorb new technical information and business acumen and apply them effectively
Ability to work in a team environment with strong customer focus and good listening, negotiation and problem-resolution skills

Additional skills requirements:

A Senior is expected to maintain long-term client relationships, build networks, and cultivate business development opportunities
Should have an understanding of and experience with software development best practices
Must be a team player

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
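Since Big Data querying with Hive is called out above, here is a minimal PySpark sketch of running a Hive-style SQL query. It assumes a Spark build with Hive support enabled; the database, table, and column names are hypothetical.

```python
# Minimal Big Data querying sketch: run a Hive-style SQL query from PySpark
# ("pip install pyspark"). Assumes a Spark build with Hive support; the
# table and column names are hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("audit-data-profile")
    .enableHiveSupport()
    .getOrCreate()
)

# Aggregate journal entries per posting period from a hypothetical Hive table.
df = spark.sql("""
    SELECT posting_period, COUNT(*) AS entries, SUM(amount) AS total_amount
    FROM audit.journal_entries
    GROUP BY posting_period
    ORDER BY posting_period
""")

df.show()
spark.stop()
```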

Posted 21 hours ago

Apply

3.0 years

0 Lacs

India

Remote

Linkedin logo

Location: Remote (India); Chennai preferred
Company: Neodonya Recruit

About Neodonya Recruit: Neodonya is transforming technical hiring. Our platform uses semantic CV matching, AI avatars for interviews, and competency-based evaluation to help companies hire better, faster, and smarter. We combine a React-based frontend, AI-driven logic, and video workflows into a seamless experience for candidates and recruiters.

Role Overview: We're looking for a Software Engineer who can own our codebase and take it to the next level. This means ensuring a rock-solid production environment, managing Git branches and deployments, and contributing directly to feature development. You'll work across our React-based frontend, AI components (LLMs, RAG), and backend services to create a fast, reliable, and intelligent hiring platform.

What You'll Do:

Own codebase reliability: Ensure that the production branch is stable, scalable, and well-documented.
Write code that ships: Design and implement new features in React and backend components.
Architect for scale: Proactively improve system design and modularity to handle growing usage.
Integrate AI tools: Work with large language models (LLMs), RAG pipelines, and other AI tools.
Manage releases: Oversee Git workflows, pull requests, commits, and CI/CD pipelines.
Solve problems quickly: Debug and resolve live issues with urgency and discipline.
Collaborate with the founding team to prioritize technical debt vs. new features.

You'll Succeed If You Have:

3+ years of full-stack or frontend engineering experience.
Strong experience with React (Hooks, component architecture, performance tuning).
Experience deploying production systems with reliability and Git best practices.
Comfort working with APIs, databases, and Node.js or similar backend environments.
Understanding of LLMs (e.g., OpenAI, LangChain) and how to build with RAG models (a toy sketch follows this listing).
A product mindset — you care about the end-user experience.

Bonus Points For:

Unity or video integration experience.
Experience with Firebase or cloud functions.
Familiarity with prompt engineering or vector database querying.
Startup experience or having shipped products from scratch.

Why Join Neodonya? This is a chance to shape an ambitious product at the intersection of AI and recruitment. We're already helping hiring teams cut time-to-hire by 70% — and with your help, we'll go even further. If you care about clean code, real impact, and building what matters — we'd love to meet you.
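To ground the RAG requirement, here is a toy retrieval sketch: rank CV snippets against a job query with a crude bag-of-words "embedding" and cosine similarity, then assemble a grounded prompt for an LLM. The embedding is a deliberately simple stand-in for a real embedding model, and the snippets and query are invented.

```python
# Toy RAG sketch: retrieve the most relevant snippet for a query, then build
# a grounded prompt for an LLM. The bag-of-words "embedding" is a crude
# stand-in for a real embedding model; snippets and query are invented.
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy embedding: term-frequency vector over lowercase tokens.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(
        sum(v * v for v in b.values())
    )
    return dot / norm if norm else 0.0

snippets = [
    "Built React dashboards with hooks and performance tuning",
    "Five years of embedded C firmware for medical devices",
    "Deployed Node.js services with CI/CD on AWS",
]

query = "frontend engineer with React experience"
best = max(snippets, key=lambda s: cosine(embed(s), embed(query)))

# In a real pipeline this retrieved context would feed an LLM call
# (retrieval-augmented generation); here we just print the assembled prompt.
prompt = f"Using only this CV evidence:\n{best}\n\nAssess fit for: {query}"
print(prompt)
```

In production the Counter-based embedding would be replaced by a real embedding model and a vector database, but the retrieve-then-prompt shape is the core of RAG.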

Posted 22 hours ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Linkedin logo

Job Title: Business Analyst (IT Domain – ERP & Complex Systems)
Experience: 5 to 7 Years
Location: Mumbai / Pune (Onsite)
Employment Type: Full-Time

Job Summary:

We are looking for a dynamic and detail-oriented IT Business Analyst with 5-7 years of experience to join our team onsite in Mumbai or Pune. The ideal candidate must have hands-on experience working with ERP or other complex systems (excluding the Insurance, Banking, Healthcare, and LMS domains) and demonstrate excellent skills in bridging business requirements with technical solutions in an Agile/Scrum environment.

Key Responsibilities:

Understand business needs and translate them into detailed use cases, user stories, and tasks.
Liaise between clients, development teams, UX, QA, and support teams to ensure seamless communication and solution delivery.
Prepare BRDs, functional specifications, flow diagrams, and support documents for development and QA teams.
Collaborate closely with the UX/UI team to review and improve user experience and interface designs.
Conduct and manage sprint planning and daily scrums, and proactively identify blockers.
Manage and control scope changes, and maintain consistent progress reporting to stakeholders.
Conduct UAT and product demos, and provide post-go-live support for issue resolution.
Monitor testing tasks, audit deliverables, and ensure quality and compliance throughout the SDLC.
Respond to routine client queries and follow up for resolution.

Required Skills and Experience:

5+ years of experience as a Business Analyst in IT projects.
Strong understanding of the SDLC and Agile/Scrum methodologies.
Proficient in using tools like Jira for backlog and project tracking.
Experience with web and Windows-based applications.
Familiarity with SQL or similar querying tools for data analysis.
Strong analytical, verbal, and written communication skills.

Posted 22 hours ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies