
10691 Apache Jobs - Page 9

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Risk
Management Level: Associate

Job Description & Summary: A career within Internal Audit services will provide you with an opportunity to gain an understanding of an organisation’s objectives, regulatory and risk management environment, and the diverse needs of its critical stakeholders. We focus on helping organisations look deeper and see further, considering areas like culture and behaviours, to help improve and embed controls. In short, we seek to address the right risks and ultimately add value to the organisation.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:

Architecture Design:
· Design and implement scalable, secure, and high-performance architectures for Generative AI applications.
· Integrate Generative AI models into existing platforms, ensuring compatibility and performance optimization.

Model Development and Deployment:
· Fine-tune pre-trained generative models for domain-specific use cases.
· Define the data collection, sanitization, and data preparation strategy for model fine-tuning.
· Well versed in machine learning paradigms: supervised, unsupervised, and reinforcement learning, as well as deep learning.
· Well versed in ML models such as linear regression, decision trees, gradient boosting, random forests, and k-means.
· Evaluate, select, and deploy appropriate Generative AI frameworks (e.g., PyTorch, TensorFlow, CrewAI, AutoGen, LangGraph, agentic code and agent-flow tooling).

Innovation and Strategy:
· Stay up to date with the latest advancements in Generative AI and recommend innovative applications to solve complex business problems.
· Define and execute the AI strategy roadmap, identifying key opportunities for AI transformation.
· Good exposure to agentic design patterns.

Collaboration and Leadership:
· Collaborate with cross-functional teams, including data scientists, engineers, and business stakeholders.
· Mentor and guide team members on AI/ML best practices and architectural decisions.
· Able to lead a team of data scientists, GenAI engineers, and software developers.

Performance Optimization:
· Monitor the performance of deployed AI models and systems, ensuring robustness and accuracy.
· Optimize computational costs and infrastructure utilization for large-scale deployments.

Ethical and Responsible AI:
· Ensure compliance with ethical AI practices, data privacy regulations, and governance frameworks.
· Implement safeguards to mitigate bias, misuse, and unintended consequences of Generative AI.

Mandatory skill sets:
· Advanced programming skills in Python and fluency in data processing frameworks like Apache Spark.
· Experience with machine learning and artificial intelligence frameworks, models, and libraries (TensorFlow, PyTorch, scikit-learn, etc.).
· Strong knowledge of LLM foundation models (OpenAI GPT-4o, o1, Claude, Gemini, etc.) as well as open-source models such as Llama 3.2 and Phi.
· Proven track record with event-driven architectures and real-time data processing systems.
· Familiarity with Azure DevOps and other LLMOps tools for operationalizing AI workflows.
· Deep experience with Azure OpenAI Service and vector databases, including API integrations, prompt engineering, and model fine-tuning, or the equivalent technologies in AWS/GCP.
· Knowledge of containerization technologies such as Kubernetes and Docker.
· Comprehensive understanding of data lakes and strategies for data management.
· Expertise in LLM frameworks including LangChain, LlamaIndex, and Semantic Kernel.
· Proficiency in cloud computing platforms such as Azure or AWS.
· Exceptional leadership, problem-solving, and analytical abilities.
· Superior communication and collaboration skills, with experience managing high-performing teams.
· Ability to operate effectively in a dynamic, fast-paced environment.

Preferred skill sets:
· Experience with additional technologies such as Datadog and Splunk.
· Programming languages such as C#, R, and Scala.
· Relevant solution architecture certificates and continuous professional development in data engineering and Gen AI.
Years of experience required: 0-1 years
Education qualification: BE / B.Tech / MCA / M.Sc / M.E / M.Tech
Degrees/Field of Study required: Bachelor in Business Administration, Master of Business Administration, Bachelor of Engineering
Degrees/Field of Study preferred: (not specified)
Certifications: (not specified)
Required Skills: Java
Optional Skills: Accepting Feedback, Accounting and Financial Reporting Standards, Active Listening, Artificial Intelligence (AI) Platform, Auditing, Auditing Methodologies, Business Process Improvement, Communication, Compliance Auditing, Corporate Governance, Data Analysis and Interpretation, Data Ingestion, Data Modeling, Data Quality, Data Security, Data Transformation, Data Visualization, Emotional Regulation, Empathy, Financial Accounting, Financial Audit, Financial Reporting, Financial Statement Analysis, Generally Accepted Accounting Principles (GAAP) {+ 19 more}
Desired Languages: (not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
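The posting above lists classical supervised models such as linear regression alongside the deep learning stack. As a minimal illustration of what "supervised learning" means in practice, here is a pure-Python sketch that fits a one-feature linear model by gradient descent; real work would use scikit-learn or PyTorch as the posting notes, and the data here is made up.

```python
# Fit y = w*x + b by batch gradient descent on mean squared error.
def fit_linear(xs, ys, lr=0.01, epochs=2000):
    """Fit a one-feature linear model with gradient descent."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradients of MSE = (1/n) * sum((w*x + b - y)^2)
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Noise-free data generated from y = 3x + 1, so the fit should recover it.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [3 * x + 1 for x in xs]
w, b = fit_linear(xs, ys)
print(round(w, 2), round(b, 2))
```

The same loss-gradient-update loop underlies the deep learning frameworks the role requires; libraries simply automate the gradient computation.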

Posted 2 days ago

Apply

0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Deliver performance-focused backend system solutions, mostly in Java. Build and maintain new and existing applications using Java.

- Object-oriented software analysis and design
- Solid understanding of object-oriented programming and data modelling
- Experience with networking and distributed systems
- Experience with, and appreciation for, automated testing
- Experience with cloud compute, virtualisation, and automation, using Kubernetes and AWS
- Preferably, exposure to open-source applications such as Cassandra and Apache Flink
- BS/MS/PhD in Computer Science or a related field, or equivalent experience
- Proven experience solving problems in complex domains
- Proactively identifies and manages risks, assessing and controlling risks of various kinds and applying this appropriately to diverse situations
- Displays courage and is willing to contribute constructive feedback: not afraid to highlight issues and challenges and to bring alternative solutions to the table
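The distributed-systems experience this role asks for usually starts with handling transient failures. A common, language-agnostic pattern is retry with exponential backoff; the sketch below is in Python for brevity (the role itself is Java-focused), and the flaky remote call is simulated.

```python
import time

def retry(call, attempts=4, base_delay=0.1, sleep=time.sleep):
    """Invoke `call`; on exception, wait base_delay * 2^n and retry."""
    for n in range(attempts):
        try:
            return call()
        except Exception:
            if n == attempts - 1:
                raise          # out of retries: surface the error
            sleep(base_delay * (2 ** n))

# Simulated flaky remote call: fails twice, then succeeds.
state = {"calls": 0}
def flaky():
    state["calls"] += 1
    if state["calls"] < 3:
        raise ConnectionError("transient")
    return "ok"

result = retry(flaky, sleep=lambda _: None)  # no real waiting in the demo
print(result, state["calls"])  # → ok 3
```

Production systems add jitter to the delay and retry only on errors known to be transient, so that simultaneous clients do not retry in lockstep.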

Posted 2 days ago

Apply

7.0 - 10.0 years

0 Lacs

Chandigarh

On-site

bebo Technologies is a leading complete software solution provider. bebo stands for 'be extension be offshore'. We are a business partner of QASource, Inc., USA [www.QASource.com]. We offer outstanding services in the areas of software development, sustenance engineering, quality assurance, and product support. bebo is dedicated to providing high-caliber offshore software services and solutions. Our goal is to 'Deliver in time, every time'. For more details, visit our website: www.bebotechnologies.com. Take a 360° tour of our bebo premises via this link: https://www.youtube.com/watch?v=S1Bgm07dPmM

Key Required Skills:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- 7-10 years of industry experience, with at least 5 years in machine learning roles.
- Advanced proficiency in Python and common ML libraries: TensorFlow, PyTorch, scikit-learn.
- Experience with distributed training, model optimization (quantization, pruning), and inference at scale.
- Hands-on experience with cloud ML platforms: AWS (SageMaker), GCP (Vertex AI), or Azure ML.
- Familiarity with MLOps tooling: MLflow, TFX, Airflow, or Kubeflow; and data engineering frameworks like Spark, dbt, or Apache Beam.
- Strong grasp of CI/CD for ML, model governance, and post-deployment monitoring (e.g., data drift, model decay).
- Excellent problem-solving, communication, and documentation skills.
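The model-optimization skills listed above include quantization. As a minimal sketch of the idea, the pure-Python example below performs symmetric int8 post-training quantization of a weight vector: pick one scale, map floats to integers, and accept a bounded rounding error. Production work would use the PyTorch or TensorFlow quantization tooling instead; the weights here are arbitrary.

```python
def quantize_int8(weights):
    """Map floats to int8 with a single scale (symmetric quantization)."""
    scale = max(abs(w) for w in weights) / 127 or 1.0  # avoid 0 for all-zero input
    q = [max(-127, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 representation."""
    return [v * scale for v in q]

w = [0.5, -1.27, 0.02, 1.0]
q, scale = quantize_int8(w)
restored = dequantize(q, scale)
# Each restored weight is within half a quantization step of the original.
assert all(abs(a - b) <= scale / 2 + 1e-9 for a, b in zip(w, restored))
print(q, scale)
```

This shrinks storage 4x versus float32 at the cost of that bounded error, which is the trade-off behind quantized inference at scale.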

Posted 2 days ago

Apply

0 years

3 - 6 Lacs

Hyderābād

On-site

Company Description: Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, and create marketing solutions, all using our unique combination of data, analytics, and software. We also assist millions of people to realise their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com

Job Description: We are looking for a Senior Software Engineer to join our Ascend Cloud Foundation Platform team.

Background: We unlock the power of data to create opportunities for consumers, businesses and society. At life’s big moments – from buying a home or car, to sending a child to university, to growing a business exponentially by connecting it with new customers – we empower consumers and our clients to manage their data with confidence so they can maximize every opportunity. We require a senior software engineer in Hyderabad, India to work alongside our UK colleagues to deliver business outcomes for the UK&I region. You will join an established agile technical team, where you will work with the Lead Engineer and Product Owner to help develop the consumer data attributes, and work with data analytics to validate the accuracy of the calculations while ensuring that you work to the highest technical standards.

Key responsibilities:
- Design, develop, and maintain scalable and efficient data pipelines and ETL processes to extract, transform, and load data from various sources into our data lake or warehouse.
- Collaborate with cross-functional teams including data scientists, analysts, and software engineers to understand data requirements, define data models, and implement solutions that meet business needs.
- Ensure the security, integrity, and quality of data throughout the data lifecycle, implementing best practices for data governance, encryption, and access control.
- Develop and maintain data infrastructure components such as data warehouses, data lakes, and data processing frameworks, leveraging cloud services (e.g., AWS, Azure, GCP) and containerization technologies (e.g., Docker, Kubernetes).
- Implement monitoring, logging, and alerting mechanisms to ensure the reliability and availability of data pipelines and systems, and to proactively identify and address issues.
- Work closely with stakeholders to understand business requirements, prioritize tasks, and deliver solutions in a timely manner within an Agile working environment.
- Collaborate with the risk, security, and compliance teams to ensure adherence to regulatory requirements (e.g., GDPR, PCI DSS) and industry standards related to data privacy and security.
- Stay updated on emerging technologies, tools, and best practices in the field of data engineering, and propose innovative solutions to improve efficiency, performance, and scalability.
- Mentor and coach junior engineers, fostering a culture of continuous learning and professional development within the team.
- Participate in code reviews, design discussions, and other Agile ceremonies to promote collaboration, transparency, and continuous improvement.
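The responsibilities above centre on ETL pipelines. As a minimal sketch of the transform step, the snippet below normalises raw source records into a target schema and rejects rows that fail validation; in the role itself this logic would run on Apache Spark or AWS Glue, and the field names here are hypothetical.

```python
def transform(raw_rows):
    """Map raw source dicts to the warehouse schema; skip invalid rows."""
    out = []
    for row in raw_rows:
        try:
            out.append({
                "customer_id": int(row["id"]),
                "name": row["name"].strip().title(),
                # Store money as integer pence to avoid float accumulation.
                "balance_pence": round(float(row["balance"]) * 100),
            })
        except (KeyError, ValueError):
            continue  # a real pipeline would quarantine and log the row
    return out

raw = [
    {"id": "42", "name": "  ada lovelace ", "balance": "10.50"},
    {"id": "oops", "name": "bad row", "balance": "1"},   # rejected: bad id
]
clean = transform(raw)
print(clean)
```

The same extract/transform/load shape scales up directly: Spark applies an equivalent row-wise function across partitions instead of a Python loop.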
Qualifications: Qualified to Degree, HND, or HNC standard in a software engineering and/or data engineering discipline, or able to demonstrate commercial experience.

Required skills/experience:
- Experience of the full development lifecycle
- Strong communication skills with the ability to explain solutions to technical and non-technical audiences
- Writes clean, scalable, and re-usable code that implements SOLID principles and common design patterns where applicable, and adheres to published coding standards
- Excellent attention to detail; ability to analyse, investigate, and compare large data sets when required
- 3 or more years of programming using Scala
- 2 or more years of programming using Python
- Some experience of using Terraform to provision and deploy cloud services and components
- Experience of developing on Apache Spark
- Experience of developing with AWS cloud services including (but not limited to) AWS Glue, S3, Step Functions, Lambda, EventBridge, and SQS
- BDD/TDD experience
- Jenkins CI/CD experience
- Application lifecycle management tools: Bitbucket and Jira
- Performing pull request reviews
- Understanding of Agile methodologies
- Automated testing tools

Advantageous experience:
- Mentoring or coaching junior engineers
- Cloud solution architecture
- Document databases
- Relational databases
- Experience with container technologies (e.g., Kubernetes)

Would consider alternative skills and experience:
- Java (rather than Scala)
- Google Cloud or Microsoft Azure (rather than AWS)
- Azure Pipelines or TeamCity (rather than Jenkins)
- GitHub (rather than Bitbucket)
- Azure DevOps (rather than Jira)
- CloudFormation (rather than Terraform)

Additional Information: Our uniqueness is that we celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what matters: DEI, work/life balance, development, authenticity, collaboration, wellness, reward & recognition, volunteering... the list goes on.
Experian's people-first approach is award-winning: World's Best Workplaces™ 2024 (Fortune Global Top 25), Great Place To Work™ in 24 countries, and Glassdoor Best Places to Work 2024, to name a few. Check out Experian Life on social media or our Careers Site and Glassdoor to understand why. Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, color, sexuality, physical ability, or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity.

Benefits: Experian cares for employees' work-life balance, health, safety, and wellbeing. In support of this endeavour, we offer best-in-class family well-being benefits, enhanced medical benefits, and paid time off.

Experian Careers - Creating a better tomorrow together. Find out what it's like to work for Experian by clicking here.

Posted 2 days ago

Apply

5.0 - 7.0 years

4 - 10 Lacs

Hyderābād

On-site

Description: The U.S. Pharmacopeial Convention (USP) is an independent scientific organization that collaborates with the world's top authorities in health and science to develop quality standards for medicines, dietary supplements, and food ingredients. USP's fundamental belief that Equity = Excellence manifests in our core value of Passion for Quality through our more than 1,300 hard-working professionals across twenty global locations to deliver the mission to strengthen the supply of safe, quality medicines and supplements worldwide. At USP, we value inclusivity for all. We recognize the importance of building an organizational culture with meaningful opportunities for mentorship and professional growth. From the standards we create, the partnerships we build, and the conversations we foster, we affirm the value of Diversity, Equity, Inclusion, and Belonging in building a world where everyone can be confident of quality in health and healthcare. USP is proud to be an equal employment opportunity employer (EEOE) and affirmative action employer. We are committed to creating an inclusive environment in all aspects of our work—an environment where every employee feels fully empowered and valued irrespective of, but not limited to, race, ethnicity, physical and mental abilities, education, religion, gender identity and expression, life experience, sexual orientation, country of origin, regional differences, work experience, and family status. We are committed to working with and providing reasonable accommodation to individuals with disabilities.

Brief Job Overview: The Digital & Innovation group at USP is seeking Full Stack Developers with programming skills in cloud technologies to build innovative digital products. We are seeking someone who understands the power of digitization and can help drive an amazing digital experience for our customers. How will YOU create impact here at USP?
In this role at USP, you contribute to USP's public health mission of increasing equitable access to high-quality, safe medicine and improving global health through public standards and related programs. In addition, as part of our commitment to our employees, Global People and Culture, in partnership with the Equity Office, regularly invests in the professional development of all people managers. This includes training in inclusive management styles and other competencies necessary to ensure engaged and productive work environments.

The Sr. Software Engineer/Software Engineer has the following responsibilities:
- Build scalable applications/platforms using cutting-edge cloud technologies.
- Constantly review and upgrade the systems based on governance principles and security policies.
- Participate in code reviews, architecture discussions, and agile development processes to ensure high-quality, maintainable, and scalable code.
- Document and communicate technical designs, processes, and solutions to both technical and non-technical stakeholders.

Who is USP looking for? The successful candidate will have a demonstrated understanding of our mission, commitment to excellence through inclusive and equitable behaviors and practices, and the ability to quickly build credibility with stakeholders, along with the following competencies and experience:

Education:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field

Experience:
- Sr. Software Engineer: 5-7 years of experience in software development, with a focus on cloud computing
- Software Engineer: 2-4 years of experience in software development, with a focus on cloud computing
- Strong knowledge of cloud platforms (e.g., AWS, Azure, Google Cloud) and services, including compute, storage, networking, and security
- Extensive knowledge of Java Spring Boot applications and design principles.
- Strong programming skills in languages such as Python
- Good experience with AWS/Azure services such as EC2, S3, IAM, Lambda, RDS, DynamoDB, API Gateway, and CloudFormation
- Knowledge of cloud architecture patterns, best practices, and security principles
- Familiarity with data pipeline/ETL/orchestration tools such as Apache NiFi, AWS Glue, or Apache Airflow
- Good experience with front-end technologies like React.js and Node.js
- Strong experience in microservices and automated testing practices
- Experience leading initiatives related to continuous improvement or implementation of new technologies
- Works independently on most deliverables
- Strong analytical and problem-solving skills, with the ability to develop creative solutions to complex problems
- Ability to manage multiple projects and priorities in a fast-paced, dynamic environment

Additional Desired Preferences:
- Experience with scientific chemistry nomenclature, prior work experience in life sciences, chemistry, or hard sciences, or a degree in the sciences
- Experience with pharmaceutical datasets and nomenclature
- Experience with containerization technologies, such as Docker and Kubernetes, is a plus
- Experience working with knowledge graphs
- Ability to explain complex technical issues to a non-technical audience
- Self-directed and able to handle multiple concurrent projects and prioritize tasks independently
- Able to make tough decisions when trade-offs are required to deliver results
- Strong communication skills required: verbal, written, and interpersonal

Supervisory Responsibilities: No

Benefits: USP provides benefits to protect yourself and your family today and tomorrow. From company-paid time off and comprehensive healthcare options to retirement savings, you can have peace of mind that your personal and financial well-being is protected.
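The skill list above includes AWS Lambda behind API Gateway. A minimal sketch of the handler shape Lambda expects for a Python function is below: an `event` dict in, a response dict out. The `/substances/{name}` route and its payload are hypothetical, invented for illustration; a real USP service would look different.

```python
import json

def lambda_handler(event, context=None):
    """Handle a hypothetical GET /substances/{name} API Gateway request."""
    name = (event.get("pathParameters") or {}).get("name", "")
    if not name:
        return {"statusCode": 400,
                "body": json.dumps({"error": "name required"})}
    # A real service would query a database here; this stub echoes the input.
    return {"statusCode": 200,
            "body": json.dumps({"substance": name.lower()})}

# Simulated API Gateway event, as Lambda would deliver it to the handler.
resp = lambda_handler({"pathParameters": {"name": "Aspirin"}})
print(resp["statusCode"], resp["body"])
```

Because the handler is a plain function of a dict, it can be unit-tested locally exactly as above, with no cloud resources involved.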

Posted 2 days ago

Apply

6.0 years

3 - 6 Lacs

Hyderābād

Remote

The Platform Engineer is responsible for designing, implementing, and maintaining scalable, secure, and highly available Linux-based systems and DevOps pipelines. This role requires close collaboration with cross-functional teams to align infrastructure capabilities with business goals.

What you’ll do:
- Engineer and manage scalable Linux-based infrastructure within a DevOps framework, including core services such as web servers (Nginx/Apache), FTP, DNS, and SSH.
- Automate infrastructure provisioning and configuration using tools like Ansible, Terraform, or Puppet.
- Build and maintain CI/CD pipelines using tools such as Jenkins, GitLab CI, or GitHub Actions.
- Monitor system performance and analyze performance and availability metrics using tools like Nagios, SolarWinds, Prometheus, and Grafana.
- Actively participate in incident management processes, quickly identifying and resolving issues; conduct root cause analysis to prevent future incidents.
- Collaborate with development teams to ensure seamless integration and deployment of applications.
- Apply security best practices to harden systems and manage vulnerabilities.
- Create and maintain documentation for systems, processes, and procedures to ensure knowledge sharing across teams.
- Participate in on-call rotations.
- Stay updated on industry trends and emerging technologies.

What you’ll bring:
- 6+ years of experience leveraging automation to manage Linux systems and infrastructure, specifically Red Hat
- In-depth knowledge of cloud platforms such as AWS and Azure
- Proficiency with infrastructure-as-code (IaC) tools such as Terraform and CloudFormation
- Strong technical experience implementing, managing, and supporting Linux systems infrastructure
- Proficiency in one or more programming languages (Python, PowerShell, etc.)
- Ability to deliver software which meets consistent standards of quality, security, and operability
- Able to work flexible hours as required by business priorities; available on a 24x7x365 basis when needed for production-impacting incidents or key customer events

Stay up to date on everything Blackbaud. Blackbaud is a digital-first company which embraces a flexible remote or hybrid work culture. Blackbaud supports hiring and career development for all roles from the location you are in today! Blackbaud is proud to be an equal opportunity employer and is committed to maintaining an inclusive work environment. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, physical or mental disability, age, or veteran status or any other basis protected by federal, state, or local law.
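The monitoring duties described for this role boil down to comparing sampled metrics against thresholds and alerting on breaches, which tools like Nagios or Prometheus alert rules automate. A minimal sketch of that core logic, with hypothetical metric names and thresholds:

```python
# Hypothetical alert thresholds; real values come from monitoring config.
THRESHOLDS = {
    "cpu_percent": 90.0,        # alert when CPU exceeds 90%
    "disk_used_percent": 85.0,  # alert before the disk fills
    "load_1m": 8.0,             # one-minute load average ceiling
}

def breached(samples, thresholds=THRESHOLDS):
    """Return the metric names whose sampled value exceeds its threshold."""
    return sorted(m for m, v in samples.items()
                  if m in thresholds and v > thresholds[m])

alerts = breached({"cpu_percent": 97.2, "disk_used_percent": 40.0,
                   "load_1m": 9.1})
print(alerts)  # → ['cpu_percent', 'load_1m']
```

Real alerting adds a sustained-duration condition (breach for N minutes) to suppress transient spikes, which is the part incident-management processes tune over time.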

Posted 2 days ago

Apply

4.0 years

3 - 8 Lacs

Hyderābād

On-site

This role is accountable for delivering quality and performance of large-scale, multi-platform software products that include web and API applications. The focus is on developing automated functional, integration, and end-to-end tests using open-source test frameworks to support overall system testing within an Agile environment.

- Bachelor’s degree in computer science or a related field, or equivalent experience
- 4+ years of proven experience in the software development industry, working in collaborative team environments
- 4+ years of experience using automation tools such as Selenium WebDriver with programming languages like Python, C#, or Java
- 3+ years of hands-on experience testing and automating web services, including RESTful APIs
- 2+ years of experience in performance testing using tools such as Apache JMeter
- Experience with object-oriented programming languages such as Java and C#/.NET
- Experience in CI/CD technologies such as Bamboo, Bitbucket, Octopus Deploy, and Maven
- Working knowledge of API testing tools such as RestAssured
- Knowledge of software engineering best practices across the full software development lifecycle, including coding standards, code reviews, source control, build processes, testing, and operations
- Experience or familiarity with AWS cloud services
- Strong written and verbal communication skills
- Proven ability to learn new technologies and adapt in a dynamic environment
- Familiarity with Atlassian tools, including Jira and Confluence
- Working knowledge of Agile methodologies, particularly Scrum
- Experience operating in a Continuous Integration (CI) environment

For over 50 years, Verisk has been the leading data analytics and technology partner to the global insurance industry, delivering value to our clients through expertise and scale. We empower communities and businesses to make better decisions on risk, faster.
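The API test automation this role requires typically centres on response assertions of the kind RestAssured expresses in Java. As a hedged, framework-free sketch of that idea in Python, the helper below collects human-readable failures instead of stopping at the first; the response here is simulated, not fetched over the network, and the field names are invented.

```python
def check_response(resp, expected_status=200, required_fields=()):
    """Validate a (simulated) API response; return a list of failure messages."""
    failures = []
    if resp["status"] != expected_status:
        failures.append(f"status {resp['status']} != {expected_status}")
    for field in required_fields:
        if field not in resp.get("json", {}):
            failures.append(f"missing field: {field}")
    return failures

# Simulated response from a hypothetical policy-quote endpoint.
fake = {"status": 200, "json": {"policyId": "P-100", "premium": 250}}
assert check_response(fake, 200, ("policyId", "premium")) == []
assert check_response(fake, 200, ("insured",)) == ["missing field: insured"]
```

Collecting every failure per response keeps CI logs actionable, since one run reports all schema drift at once rather than one field per build.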
At Verisk, you'll have the chance to use your voice and build a rewarding career that's as unique as you are, with work flexibility and the support, coaching, and training you need to succeed. For the eighth consecutive year, Verisk is proudly recognized as a Great Place to Work® for outstanding workplace culture in the US, fourth consecutive year in the UK, Spain, and India, and second consecutive year in Poland. We value learning, caring and results and make inclusivity and diversity a top priority. In addition to our Great Place to Work® Certification, we’ve been recognized by The Wall Street Journal as one of the Best-Managed Companies and by Forbes as a World’s Best Employer and Best Employer for Women, testaments to the value we place on workplace culture. We’re 7,000 people strong. We relentlessly and ethically pursue innovation. And we are looking for people like you to help us translate big data into big ideas. Join us and create an exceptional experience for yourself and a better tomorrow for future generations. Verisk Businesses Underwriting Solutions — provides underwriting and rating solutions for auto and property, general liability, and excess and surplus to assess and price risk with speed and precision Claims Solutions — supports end-to-end claims handling with analytic and automation tools that streamline workflow, improve claims management, and support better customer experiences Property Estimating Solutions — offers property estimation software and tools for professionals in estimating all phases of building and repair to make day-to-day workflows the most efficient Extreme Event Solutions — provides risk modeling solutions to help individuals, businesses, and society become more resilient to extreme events. 
Specialty Business Solutions — provides an integrated suite of software for full end-to-end management of insurance and reinsurance business, helping companies manage their businesses through efficiency, flexibility, and data governance Marketing Solutions — delivers data and insights to improve the reach, timing, relevance, and compliance of every consumer engagement Life Insurance Solutions – offers end-to-end, data insight-driven core capabilities for carriers, distribution, and direct customers across the entire policy lifecycle of life and annuities for both individual and group. Verisk Maplecroft — provides intelligence on sustainability, resilience, and ESG, helping people, business, and societies become stronger Verisk Analytics is an equal opportunity employer. All members of the Verisk Analytics family of companies are equal opportunity employers. We consider all qualified applicants for employment without regard to race, religion, color, national origin, citizenship, sex, gender identity and/or expression, sexual orientation, veteran's status, age or disability. Verisk’s minimum hiring age is 18 except in countries with a higher age limit subject to applicable law. https://www.verisk.com/company/careers/ Unsolicited resumes sent to Verisk, including unsolicited resumes sent to a Verisk business mailing address, fax machine or email address, or directly to Verisk employees, will be considered Verisk property. Verisk will NOT pay a fee for any placement resulting from the receipt of an unsolicited resume. Verisk Employee Privacy Notice

Posted 2 days ago

Apply

3.0 years

3 - 4 Lacs

India

On-site

Job Title: Linux Server Administrator

Job Summary: We are seeking a highly skilled Linux System Administrator with 3+ years of hands-on experience in managing Linux-based server environments. The ideal candidate will be responsible for system setup, maintenance, monitoring, and ensuring optimal performance of Linux servers (Ubuntu/RHEL), along with managing applications, services, storage, and backup processes.

Key Responsibilities:
- Install and configure Linux servers (Ubuntu, RHEL) for production, development, and backup environments.
- Configure and manage RAID levels for data redundancy and performance.
- Create and maintain virtual machines (VMs) using virtualization tools like KVM, VirtualBox, or VMware.
- Monitor system health, resource usage, and uptime; respond to incidents and performance alerts.
- Maintain and manage SAMBA servers for file sharing across platforms.
- Configure and manage Squid/proxy servers for secure and optimized internet access.
- Install, configure, and maintain web servers such as Apache, Tomcat, and Nginx for hosting applications.
- Install and maintain PostgreSQL databases, including performance tuning, backup, and replication.
- Handle SSL certificate installation and renewal; link SSL to the appropriate web services.
- Manage backup servers, perform scheduled backups, and implement disaster recovery procedures.
- Implement and monitor data replication strategies for critical systems.
- Perform regular system tasks such as user creation, deletion, password resets, and permission management.
- Carry out data cleaning and disk-space management to maintain optimal storage use.
- Collaborate with application and development teams to support deployments and integrations.

Required Skills:
- Strong expertise in Linux server administration (Ubuntu, RHEL).
- Hands-on experience with RAID configuration and storage management.
- Proficient in VM creation, cloning, and maintenance.
- Practical knowledge of SAMBA, Squid/proxy, and web servers (Apache, Tomcat, Nginx).
Experience in PostgreSQL installation , backup, and replication. Working knowledge of SSL certificates (installation, renewal, and linking). Familiarity with backup tools and server disaster recovery planning . Good understanding of replication and monitoring tools . Experience with shell scripting for automation is a plus. Education Requirements: B.Tech / MCA (Master of Computer Applications) Preferred Skills (Good to Have): Experience with cloud infrastructure (AWS, Azure, GCP). Familiarity with Docker , Kubernetes , and DevOps practices . Exposure to monitoring tools (Nagios, Zabbix, Prometheus, etc.). Basic knowledge of networking concepts and firewall rules . Job Type: Full-time Pay: ₹300,000.00 - ₹400,000.00 per year Work Location: In person
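Shell scripting for automation is listed as a plus in this role; as an illustration of the kind of routine check such a role automates, here is a minimal Python sketch of a disk-usage alert. The 85% threshold and the root path are assumptions for the example, not taken from the posting.

```python
import shutil

def disk_usage_percent(path="/"):
    # Percentage of the filesystem at `path` currently in use.
    usage = shutil.disk_usage(path)
    return usage.used / usage.total * 100

def needs_cleanup(used_percent, threshold=85.0):
    # Flag a filesystem for data cleaning once usage crosses the threshold.
    return used_percent >= threshold

if __name__ == "__main__":
    pct = disk_usage_percent("/")
    print(f"/ is {pct:.1f}% full; cleanup needed: {needs_cleanup(pct)}")
```

In practice a script like this would run from cron and page the administrator rather than print to stdout.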

Posted 2 days ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

WebSphere & WebLogic Admin
Location: Noida (Hybrid)
Looking for immediate joiners only. Client: NTT DATA. Kindly apply only if you have 5+ years of experience.

WebSphere & WebLogic Admin: 5+ years, plus Apache with strong Linux and Ansible skills.

About the Client: A global IT services and consulting multinational, headquartered in Tokyo, Japan. The client offers a wide array of IT services, including application development, infrastructure management, and business process outsourcing. Their consulting services span business and technology, while their digital solutions focus on transformation and user experience design. The client excels in data and intelligence services, emphasizing analytics, AI, and machine learning. Additionally, their cybersecurity, cloud, and application services round out a comprehensive portfolio designed to meet the diverse needs of businesses worldwide.

How to Apply: Interested candidates are invited to submit their resume using the apply online button on this job post.

About VARITE: VARITE is a global staffing and IT consulting company providing technical consulting and team augmentation services to Fortune 500 companies in the USA, UK, Canada, and India. VARITE is currently a primary and direct vendor to leading corporations in the verticals of Networking, Cloud Infrastructure, Hardware and Software, Digital Marketing and Media Solutions, Clinical Diagnostics, Utilities, Gaming and Entertainment, and Financial Services.

Equal Opportunity Employer: VARITE is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. We do not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity or expression, national origin, age, marital status, veteran status, or disability status.

Unlock Rewards: Refer Candidates and Earn. If you're not available or interested in this opportunity, please pass this along to anyone in your network who might be a good fit and interested in our open positions. VARITE offers a Candidate Referral program, where you'll receive a one-time referral bonus based on the following scale if the referred candidate completes a three-month assignment with VARITE.

Exp Req - Referral Bonus
0 - 2 Yrs. - INR 5,000
2 - 6 Yrs. - INR 7,500
6+ Yrs. - INR 10,000

Posted 2 days ago

Apply

2.0 - 4.0 years

6 - 9 Lacs

Hyderābād

On-site

Summary: As a Data Analyst, you will be responsible for designing, developing, and maintaining efficient and scalable data pipelines for data ingestion, transformation, and storage.

About the Role
Location: Hyderabad #LI-Hybrid

Key Responsibilities:
- Design, develop, and maintain efficient and scalable data pipelines for data ingestion, transformation, and storage.
- Collaborate with cross-functional teams, including data analysts, business analysts, and BI, to understand data requirements and design appropriate solutions.
- Build and maintain data infrastructure in the cloud, ensuring high availability, scalability, and security.
- Write clean, efficient, and reusable code in scripting languages such as Python or Scala to automate data workflows and ETL processes.
- Implement real-time and batch data processing solutions using streaming technologies like Apache Kafka, Apache Flink, or Apache Spark.
- Perform data quality checks and ensure data integrity across different data sources and systems.
- Optimize data pipelines for performance and efficiency, identifying and resolving bottlenecks and performance issues.
- Collaborate with DevOps teams to deploy, automate, and maintain data platforms and tools.
- Stay up to date with industry trends, best practices, and emerging technologies in data engineering, scripting, streaming data, and cloud technologies.

Essential Requirements:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field, with overall experience of 2-4 years.
- Proven experience as a Data Engineer or similar role, with a focus on scripting, streaming data pipelines, and cloud technologies such as AWS, GCP, or Azure.
- Strong programming and scripting skills in languages like Python, Scala, or SQL.
- Experience with cloud-based data technologies such as AWS, Azure, or Google Cloud Platform.
- Hands-on experience with streaming technologies such as AWS StreamSets, Apache Kafka, Apache Flink, or Apache Spark Streaming.
- Strong experience with Snowflake (required).
- Proficiency in working with big data frameworks and tools such as Hadoop, Hive, or HBase.
- Knowledge of SQL and experience with relational and NoSQL databases.
- Familiarity with data modelling and schema design principles.
- Strong problem-solving skills and the ability to work in a fast-paced, collaborative environment.
- Excellent communication and teamwork skills.

Commitment to Diversity and Inclusion: Novartis is committed to building an outstanding, inclusive work environment and diverse teams representative of the patients and communities we serve.

Accessibility and accommodation: Novartis is committed to working with and providing reasonable accommodation to individuals with disabilities. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the recruitment process, or in order to perform the essential functions of a position, please send an e-mail to diversityandincl.india@novartis.com and let us know the nature of your request and your contact information. Please include the job requisition number in your message.

Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients' lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture

Join our Novartis Network: Not the right Novartis role for you? Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network

Benefits and Rewards: Read our handbook to learn about all the ways we'll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards

Division: US
Business Unit: Universal Hierarchy Node
Location: India
Site: Hyderabad (Office)
Company / Legal Entity: IN10 (FCRS = IN010) Novartis Healthcare Private Limited
Functional Area: Marketing
Job Type: Full time
Employment Type: Regular
Shift Work: No
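The data quality checks this role mentions usually start with simple completeness rules applied at ingestion. A minimal pure-Python sketch of such a rule follows; the field names are hypothetical, and in a real pipeline the same check would run inside Spark or Flink rather than a plain loop.

```python
def check_completeness(rows, required_fields):
    """Split rows into those passing basic completeness checks and a reject count."""
    passed, rejected = [], 0
    for row in rows:
        # A row passes only if every required field is present and non-empty.
        if all(row.get(field) not in (None, "") for field in required_fields):
            passed.append(row)
        else:
            rejected += 1
    return passed, rejected

# Example: two good rows, one missing a customer_id.
rows = [
    {"customer_id": "c1", "amount": 10.0},
    {"customer_id": "", "amount": 5.0},
    {"customer_id": "c2", "amount": 7.5},
]
good, bad = check_completeness(rows, ["customer_id", "amount"])
print(len(good), bad)  # → 2 1
```

Rejected counts like `bad` would typically feed a monitoring dashboard so integrity regressions surface quickly.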

Posted 2 days ago

Apply

0.0 - 5.0 years

3 - 8 Lacs

New Town, Kolkata, West Bengal

On-site

Experience: 5+ yrs
Location: Kolkata (WFO only, 5 days working)

MERN Stack Developer

We are looking for a MERN Stack Developer to build scalable software solutions. You'll be part of a cross-functional team that's responsible for the full software development life cycle, from conception to deployment. As a MERN Stack Developer, you should be comfortable around both front-end and back-end coding languages, development frameworks and third-party libraries. You should also be a team player with a knack for visual design and utility.

Responsibilities
● Work with development teams and product managers to ideate software solutions
● Design client-side and server-side architecture
● Build the front-end of applications through appealing visual design
● Develop and manage well-functioning databases and applications
● Design and develop secure, high-performing backend APIs that can be consumed by any platform, such as mobile apps and websites
● Test software to ensure responsiveness and efficiency
● Troubleshoot, debug and upgrade software
● Build features and applications with a mobile-responsive design
● Write technical documentation
● Work with data scientists and analysts to improve software
● Hands-on experience in deploying applications to cloud servers
● Familiarity with CI/CD integration is a plus
● Take ownership of tasks and drive them to completion
● Ensure the best possible performance, quality, and responsiveness of front-end and backend applications
● Identify bottlenecks and bugs, and devise solutions to the problems and issues

Requirements
● Proven experience as a Full Stack Developer or similar role
● Experience developing backends for web and mobile applications
● Knowledge of multiple front-end languages and libraries (e.g. HTML/CSS, JavaScript, XML, jQuery)
● Proficient in the modern JavaScript framework React
● Knowledge of multiple back-end JavaScript frameworks (e.g. Node.js)
● Familiarity with databases (e.g. MySQL, MongoDB), web servers (e.g. Apache) and UI/UX design
● Proficient in writing stored procedures, views, and triggers in MySQL
● Excellent communication and teamwork skills
● An analytical mind
● You will build robust and secure APIs using REST
● Experience with sockets is a big plus
● Independent thinking and fast learning capabilities
● Break projects into simpler granular tasks, estimate effort required and identify dependencies
● Experience in scaling web applications to deal with thousands of concurrent users is a big plus
● Experience in building progressive web apps (PWA) is a plus
● Working knowledge of Agile methodologies (running stand-ups, ScrumMaster)
● We require someone who understands code versioning tools, such as Git

Qualification: B.E / B.Tech / M.E / M.Tech in Computer Science or Electronics and Communication / MCA / or relevant experience

Role: Full Stack Developer
Industry Type: IT-Software, Software Services
Functional Area: IT Software - Application Programming, Maintenance
Employment Type: Full Time, Permanent
Role Category: Programming & Design
Job Types: Full-time, Permanent
Pay: ₹300,000.00 - ₹850,000.00 per year
Benefits: Health insurance, Provident Fund
Schedule: Monday to Friday
Experience: React: 2 years (Preferred); Node.js: 5 years (Preferred)
Location: New Town, Kolkata, West Bengal (Preferred)
Work Location: In person

Posted 2 days ago

Apply

4.0 - 8.0 years

12 - 20 Lacs

Cochin

On-site

Job Title: Data Visualization Engineer
Location: Kochi (Work From Office)
Experience Level: 4–8 Years
Employment Type: Full-Time

Job Summary: We are seeking a skilled and detail-oriented Data Visualization Engineer to join our team in Kochi. The ideal candidate will have 4–8 years of experience in data analytics and visualization, with strong proficiency in Apache Superset. You will be responsible for transforming complex data sets into insightful dashboards and reports that drive business decisions.

Key Responsibilities:
- Design, develop, and maintain interactive dashboards and data visualizations using Apache Superset.
- Work closely with data analysts, engineers, and business stakeholders to gather requirements and translate them into meaningful visual representations.
- Optimize performance and usability of existing dashboards and reports.
- Integrate data from various sources (SQL, APIs, warehouses, etc.) into Superset.
- Ensure data accuracy, consistency, and security in visualizations.
- Troubleshoot and resolve issues related to data and visualization tools.
- Stay updated with the latest visualization tools, trends, and best practices.

Key Skills & Qualifications:
- Mandatory: hands-on experience with Apache Superset (development, customization, deployment).
- Proficient in SQL and experienced with relational databases (e.g., PostgreSQL, MySQL).
- Solid understanding of data modeling and data warehousing concepts.
- Familiarity with other BI tools is a plus (e.g., Tableau, Power BI, Looker).
- Strong analytical, problem-solving, and communication skills.
- Experience working in cross-functional teams in a fast-paced environment.
- Bachelor's degree in Computer Science, Data Science, Engineering, or a related field.

Immediate joiners should apply.

Job Type: Full-time
Pay: ₹1,200,000.00 - ₹2,000,000.00 per year
Work Location: In person

Posted 2 days ago

Apply

5.0 years

1 - 9 Lacs

Gurgaon

On-site

Job Description: Senior Data Developer I
Location: Gurugram, India
Employment Type: Full-Time
Experience Level: Mid to Senior-Level
Department: Data & Analytics / IT

Job Summary: We are seeking an experienced Data Developer with expertise in Microsoft Fabric, Azure Synapse Analytics, and Databricks, plus strong SQL development skills. The ideal candidate will work on end-to-end data solutions supporting analytics initiatives across clinical, regulatory, and commercial domains in the Life Sciences industry. Familiarity with Azure DevOps and relevant certifications such as DP-700 and Databricks Data Engineer Associate/Professional are preferred. Power BI knowledge is highly preferable to support integrated analytics and reporting.

Key Responsibilities:
- Design, develop, and maintain scalable and secure data pipelines using Microsoft Fabric, Azure Synapse Analytics, and Azure Databricks to support critical business processes.
- Develop curated datasets for clinical, regulatory, and commercial analytics using SQL and PySpark.
- Create and support dashboards and reports using Power BI (highly preferred).
- Collaborate with cross-functional stakeholders to understand data needs and translate them into technical solutions.
- Work closely with ERP teams such as Salesforce.com and SAP S/4HANA to integrate and transform business-critical data into analytics-ready formats.
- Partner with Data Scientists to enable advanced analytics and machine learning initiatives by providing clean, reliable, and well-structured data.
- Ensure data quality, lineage, and documentation in accordance with GxP, 21 CFR Part 11, and industry best practices.
- Use Azure DevOps to manage code repositories, track tasks, and support agile delivery processes.
- Monitor, troubleshoot, and optimize data workflows for reliability and performance.
- Contribute to the design of scalable, compliant data models and architecture.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science.
- 5+ years of experience in data development or data engineering roles.
- Hands-on experience with:
  - Microsoft Fabric (Lakehouse, Pipelines, Dataflows)
  - Azure Synapse Analytics (Dedicated/Serverless SQL Pools, Pipelines)
  - Azure Data Factory and Apache Spark
  - Azure Databricks (Notebooks, Delta Lake, Unity Catalog)
  - SQL (complex queries, optimization, transformation logic)
- Familiarity with Azure DevOps (Repos, Pipelines, Boards).
- Understanding of data governance, security, and compliance in the Life Sciences domain.

Certifications (Preferred):
- Microsoft Certified: DP-700 – Fabric Analytics Engineer Associate
- Databricks Certified Data Engineer Associate or Professional

Preferred Skills:
- Strong knowledge of Power BI (highly preferred)
- Familiarity with HIPAA, GxP, and 21 CFR Part 11 compliance
- Experience working with ERP data from Salesforce.com and SAP S/4HANA
- Exposure to clinical trial, regulatory submission, or quality management data
- Good understanding of AI and ML concepts
- Experience working with APIs
- Excellent communication skills and the ability to collaborate across global teams

Location: Gurugram
Mode: Hybrid
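Curated datasets of the kind this role describes are typically built by collapsing raw change records down to the latest version per business key. A minimal pure-Python sketch of that transformation logic follows; in the role itself this would be expressed in SQL or PySpark, and the field names here are hypothetical.

```python
def latest_per_key(records, key_field, ts_field):
    """Keep only the most recent record for each business key."""
    latest = {}
    for record in records:
        key = record[key_field]
        # Replace the stored record whenever a newer timestamp arrives.
        if key not in latest or record[ts_field] > latest[key][ts_field]:
            latest[key] = record
    return sorted(latest.values(), key=lambda r: r[key_field])

raw = [
    {"subject_id": "s1", "updated": "2024-01-01", "status": "enrolled"},
    {"subject_id": "s1", "updated": "2024-03-01", "status": "completed"},
    {"subject_id": "s2", "updated": "2024-02-01", "status": "enrolled"},
]
print(latest_per_key(raw, "subject_id", "updated"))
```

The same pattern maps directly onto a SQL window function (`ROW_NUMBER() OVER (PARTITION BY key ORDER BY ts DESC)`) or a PySpark equivalent.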

Posted 2 days ago

Apply

0 years

0 Lacs

India

Remote

Company Description
At Trigonal AI, we specialize in building and managing end-to-end data ecosystems that empower businesses to make data-driven decisions with confidence. From data ingestion to advanced analytics, we offer the expertise and technology to transform data into actionable insights. Our core services include data pipeline orchestration, real-time analytics, and business intelligence & visualization. We use modern technologies such as Apache Airflow, Kubernetes, Apache Druid, Kafka, and leading BI tools to create reliable and scalable solutions. Let us help you unlock the full potential of your data.

Role Description
This is a full-time remote role for a Business Development Specialist. The specialist will focus on day-to-day tasks including lead generation, market research, customer service, and communication with potential clients. The role also includes analytical tasks and collaborating with the sales and marketing teams to develop and implement growth strategies.

Qualifications
- Strong analytical skills for data-driven decision-making
- Effective communication skills for engaging with clients and team members
- Experience in lead generation and market research
- Proficiency in customer service to maintain client relationships
- Proactive and independent work style
- Experience in the tech or data industry is a plus
- Bachelor's degree in Business, Marketing, or a related field

Posted 2 days ago

Apply

1.0 years

1 - 3 Lacs

Mohali

On-site

Job Title: Node.js Developer
Experience: 1+ Years
Joining: Immediate Joiners Preferred

Job Description: We are seeking a talented Node.js Developer with over 1 year of experience to join our growing development team. The ideal candidate should have a strong understanding of server-side programming, RESTful APIs, and database management, and experience deploying applications on cloud platforms such as AWS and DigitalOcean. You will develop backend components, manage cloud infrastructure, and ensure seamless integration with front-end applications.

Key Responsibilities:
- Develop, test, and maintain backend services using Node.js.
- Create and integrate RESTful APIs.
- Collaborate with front-end developers and UI/UX teams.
- Work with databases such as MongoDB, MySQL, or PostgreSQL.
- Debug and troubleshoot application and server/network issues.
- Write clean, scalable, and maintainable code.
- Configure and manage cloud deployments on AWS and DigitalOcean.
- Implement, configure, and maintain HTTP/S proxies and reverse proxies (e.g., Nginx).
- Ensure application security and scalability on cloud servers.
- Participate in daily stand-ups and project discussions.

Required Skills:
- 1+ years of hands-on experience in Node.js.
- Strong knowledge of JavaScript (ES6+).
- Familiarity with Express.js or similar frameworks.
- Experience with databases (MongoDB, MySQL, etc.).
- Understanding of RESTful API design.
- Experience deploying and managing cloud infrastructure (AWS, DigitalOcean).
- Knowledge of server networking principles (DNS, proxies, firewalls, reverse-proxy setup, load balancers).
- Basic experience with proxies and reverse proxies (Nginx/Apache/HAProxy).
- Ability to troubleshoot deployment and server issues.
- Basic version control knowledge (Git).
- Good communication and teamwork skills.

Bonus Skills (Optional):
- Experience with Docker and containerization.
- Familiarity with CI/CD pipelines.
- Knowledge of server security best practices.
- Hands-on experience with other cloud providers.

Job Type: Full-time
Pay: ₹15,000.00 - ₹25,000.00 per month
Schedule: Monday to Friday, morning shift
Work Location: In person
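The reverse-proxy duties this posting lists are normally handled in Nginx configuration, but the core idea (route each incoming Host header to a backend upstream) can be sketched in a few lines. The sketch below is language-agnostic Python for illustration; the hostnames and ports are made up.

```python
# Map of virtual hosts to backend upstreams, as Nginx server blocks would define.
ROUTES = {
    "api.example.com": "http://127.0.0.1:3000",    # Node.js API
    "admin.example.com": "http://127.0.0.1:3100",  # admin app
}

def upstream_for(host, routes, default=None):
    """Pick the backend a reverse proxy should forward this Host header to."""
    # Host headers are case-insensitive, so normalize before lookup.
    return routes.get(host.lower(), default)

print(upstream_for("API.example.com", ROUTES))  # → http://127.0.0.1:3000
```

An Nginx `server { server_name api.example.com; location / { proxy_pass ...; } }` block performs exactly this lookup, with TLS termination and load balancing layered on top.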

Posted 2 days ago

Apply

0 years

2 - 4 Lacs

Bhubaneshwar

On-site

Key skills: PHP (Core/Framework), Laravel, CodeIgniter, MySQL, Yii

Roles and Responsibilities:
- Work with the development team and relevant client to provide software solutions.
- Design client-side and server-side architecture.
- Develop and manage well-functioning databases and web applications.
- Write effective APIs.
- Test software/websites to ensure responsiveness and efficiency.
- Troubleshoot, debug and upgrade software.
- Create security and data protection settings.
- Provide daily status.
- Overseas exposure is an added advantage.

Desired Candidate Profile - Technical Skills:
- Proven experience as a PHP Full Stack Developer or similar role.
- Strong knowledge of PHP frameworks such as Laravel/Phalcon/CodeIgniter.
- Experience developing desktop and web applications.
- Understanding of the fully synchronous behavior of PHP.
- Understanding of MVC design patterns.
- Knowledge of object-oriented PHP programming.
- Understanding of the fundamental design principles behind a scalable application.
- User authentication and authorization between multiple systems, servers, and environments.
- Strong knowledge of multiple back-end languages (e.g. PHP and its frameworks Laravel/Phalcon/Yii, plus Python).
- Integration of multiple data sources and databases into one system.
- Creating database schemas that represent and support business processes.
- Familiarity with databases (e.g. MySQL, MongoDB), web servers (e.g. Apache) and their declarative query languages.

Job Types: Full-time, Permanent
Pay: ₹20,000.00 - ₹35,000.00 per month
Benefits: Paid sick time

Posted 2 days ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Overview
Leading AI-driven Global Supply Chain Solutions Software Product Company and one of Glassdoor's "Best Places to Work". Seeking an astute individual who has a strong technical foundation, the ability to be hands-on with the broader engineering team as part of the development/deployment cycle, deep knowledge of industry best practices, and Data Science and Machine Learning experience, with the ability to apply them while working with both the platform and product teams.

Scope
Our machine learning platform ingests data in real time, processes information from millions of retail items to serve deep learning models, and produces billions of predictions on a daily basis. The Blue Yonder Data Science and Machine Learning team works closely with sales, product and engineering teams to design and implement the next generation of retail solutions. Data Science team members are tasked with turning small, sparse, and massive data alike into actionable insights with measurable improvements to the customer bottom line.

Our Current Technical Environment
- Software: Python 3.x
- Frameworks/Others: TensorFlow, PyTorch, BigQuery/Snowflake, Apache Beam, Kubeflow, Apache Flink/Dataflow, Kubernetes, Kafka, Pub/Sub, TFX, Apache Spark, and Flask.
- Application Architecture: scalable, resilient, reactive, event-driven, secure multi-tenant microservices architecture.
- Cloud: Azure

What We Are Looking For
- Bachelor's degree in Computer Science or related fields; graduate degree preferred.
- Solid understanding of data science and deep learning foundations.
- Proficiency in Python programming with a solid understanding of data structures.
- Experience with most of the following frameworks and libraries: Pandas, NumPy, Keras, TensorFlow, Jupyter, Matplotlib, etc.
- Expertise in any database query language; SQL preferred.
- Familiarity with big data technologies such as Snowflake, Apache Beam/Spark/Flink, and Databricks.
- Solid experience with any of the major cloud platforms, preferably Azure and/or GCP (Google Cloud Platform).
- Reasonable knowledge of modern software development tools and respective best practices, such as Git, Jenkins, Docker, Jira, etc.
- Familiarity with deep learning, NLP, reinforcement learning, combinatorial optimization, etc.
- Provable experience guiding junior data scientists in official or unofficial settings.
- Knowledge of Kafka, Redis, Cassandra, etc. is desired.

What You Will Do
As a Senior Data Scientist, you serve as a specialist who supports the team with the following responsibilities:
- Independently, or alongside junior scientists, implement machine learning models by:
  - procuring data from platform, client, and public data sources;
  - implementing data enrichment and cleansing routines;
  - implementing features, preparing modelling data sets, performing feature selection, etc.;
  - evaluating candidate models, then selecting and reporting on the test performance of the final one;
  - ensuring proper runtime deployment of models; and
  - implementing runtime monitoring of model inputs and performance to ensure continued model stability.
- Work with product, sales and engineering teams to help shape the final solution.
- Use data to understand patterns, come up with and test hypotheses; iterate.
- Help prepare sales materials, estimate hardware requirements, etc.
- Attend client meetings, online and onsite, to discuss new and current functionality.

Our Values
If you want to know the heart of a company, take a look at their values. Ours unite us. They are what drive our success – and the success of our customers. Does your heart beat like ours? Find out here: Core Values

All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.
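Evaluating candidate models and reporting final test performance, as this role describes, rests on a held-out split and a metric. A minimal stdlib-only sketch follows; in practice this would use scikit-learn or TensorFlow utilities rather than hand-rolled helpers.

```python
import random

def train_test_split(data, test_fraction=0.2, seed=0):
    """Shuffle deterministically and split into train/test partitions."""
    rng = random.Random(seed)
    shuffled = data[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - test_fraction))
    return shuffled[:cut], shuffled[cut:]

def accuracy(y_true, y_pred):
    """Fraction of predictions matching the labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

train, test = train_test_split(list(range(10)))
print(len(train), len(test))  # → 8 2
print(accuracy([1, 0, 1, 1], [1, 1, 1, 1]))  # → 0.75
```

Fixing the shuffle seed keeps the split reproducible, which is what makes "reporting on test performance of the final model" comparable across candidate models.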

Posted 2 days ago

Apply

3.0 - 5.0 years

3 - 8 Lacs

Chennai

On-site

3 - 5 Years | 5 Openings | Bangalore, Chennai, Kochi, Trivandrum

Role Description

Role Proficiency: Independently develops error-free code with high-quality validation of applications; guides other developers and assists Lead 1 - Software Engineering.

Outcomes:
- Understand and provide input to application/feature/component designs; develop the same in accordance with user stories/requirements.
- Code, debug, test, document and communicate product/component/features at development stages.
- Select appropriate technical options for development, such as reusing, improving or reconfiguring existing components.
- Optimise efficiency, cost and quality by identifying opportunities for automation/process improvements and agile delivery models.
- Mentor Developer 1 - Software Engineering and Developer 2 - Software Engineering to effectively perform in their roles.
- Identify problem patterns and improve the technical design of the application/system.
- Proactively identify issues/defects/flaws in module/requirement implementation.
- Assist Lead 1 - Software Engineering on technical design.
- Review activities and begin demonstrating Lead 1 capabilities in making technical decisions.

Measures of Outcomes:
- Adherence to engineering process and standards (coding standards)
- Adherence to schedule/timelines
- Adherence to SLAs where applicable
- Number of defects post delivery
- Number of non-compliance issues
- Reduction of reoccurrence of known defects
- Quick turnaround of production bugs
- Meeting the defined productivity standards for the project
- Number of reusable components created
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements

Outputs Expected:
- Code: develop code independently for the above.
- Configure: implement and monitor the configuration process.
- Test: create and review unit test cases, scenarios and execution.
- Domain relevance: develop features and components with a good understanding of the business problem being addressed for the client.
- Manage project: manage module-level activities.
- Manage defects: perform defect RCA and mitigation.
- Estimate: estimate time, effort and resource dependence for one's own work and others' work, including modules.
- Document: create documentation for one's own work and peer-review others' documentation.
- Manage knowledge: consume and contribute to project-related documents, SharePoint libraries and client universities.
- Status reporting: report the status of assigned tasks and comply with project-related reporting standards/processes.
- Release: execute the release process.
- Design: LLD for multiple components.
- Mentoring: mentor juniors on the team; set FAST goals and provide feedback on mentees' FAST goals.

Skill Examples:
- Explain and communicate the design/development to the customer.
- Perform and evaluate test results against product specifications.
- Develop user interfaces, business software components and embedded software components.
- Manage and guarantee high levels of cohesion and quality.
- Use data models.
- Estimate effort and resources required for developing/debugging features/components.
- Perform and evaluate tests in the customer or target environment.
- Team player.
- Good written and verbal communication abilities.
- Proactively ask for help and offer help.

Knowledge Examples:
- Appropriate software programs/modules
- Technical designing
- Programming languages
- DBMS
- Operating systems and software platforms
- Integrated development environments (IDE)
- Agile methods
- Knowledge of the customer domain and the sub-domain where the problem is solved

Additional Comments:
- Design, develop, and optimize large-scale data pipelines using Azure Databricks (Apache Spark).
- Build and maintain ETL/ELT workflows and batch/streaming data pipelines.
- Collaborate with data analysts, scientists, and business teams to support their data needs.
- Write efficient PySpark or Scala code for data transformations and performance tuning.
- Implement CI/CD pipelines for data workflows using Azure DevOps or similar tools.
- Monitor and troubleshoot data pipelines and jobs in production.
- Ensure data quality, governance, and security as per organizational standards.

Skills: Databricks, ADB, ETL

About UST
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world's best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients' organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact, touching billions of lives in the process.

Posted 2 days ago

Apply

0 years

3 - 6 Lacs

India

On-site

Job Summary: We are seeking a highly skilled Java Spring Boot Developer to join our development team. The ideal candidate will be responsible for designing and developing high-volume, low-latency applications for mission-critical systems, delivering high availability and performance.

Key Responsibilities:
- Develop and maintain Java-based web applications using Spring Boot.
- Design and implement RESTful APIs and microservices.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Write well-designed, efficient, and testable code.
- Participate in code reviews, unit testing, and deployment processes.
- Troubleshoot and resolve application issues and bugs.
- Ensure code quality and maintainability using industry-standard practices.
- Work with DevOps on CI/CD and cloud deployment.

Technical Skillset:
- Strong proficiency in Java (8+)
- Hands-on experience with Spring Boot
- Solid understanding and experience in Hibernate / JPA
- Good knowledge of REST APIs, MySQL, and microservices architecture is a plus
- Hands-on experience with Apache Kafka (consumers/producers)

Job Types: Full-time, Permanent
Pay: ₹25,000.00 - ₹50,000.00 per month
Benefits: Cell phone reimbursement, Health insurance, Provident Fund
Schedule: Day shift
Language: Hindi (Preferred)
Work Location: In person
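The Kafka consumer/producer requirement above boils down to an append-only log that producers write to and consumers read in order. Here is a toy, in-memory, language-agnostic sketch of that pattern in Python; a real Spring Boot service would use the official Kafka client libraries with brokers, partitions, and consumer-group offsets, none of which this models.

```python
from collections import deque

class InMemoryTopic:
    """Toy stand-in for a Kafka topic: producers append, consumers poll FIFO."""

    def __init__(self):
        self._messages = deque()

    def produce(self, message):
        # A real producer would serialize the record and send it to a broker partition.
        self._messages.append(message)

    def poll(self):
        # A real consumer polls the broker and commits its offset after processing.
        return self._messages.popleft() if self._messages else None

topic = InMemoryTopic()
topic.produce("order-created:1001")
topic.produce("order-created:1002")
print(topic.poll())  # → order-created:1001
```

The value of the pattern is the decoupling: producers never wait on consumers, and ordering within a partition is preserved.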

Posted 2 days ago

Apply

5.0 years

5 - 7 Lacs

Noida

On-site

We're looking for a highly skilled and motivated Senior MERN Full Stack Developer with a minimum of 5 years of experience to join our team at Help Study Abroad. You'll be responsible for designing, developing, and deploying scalable and robust web applications using the MERN stack (MongoDB, Express.js, React.js, Node.js). This role requires a strong understanding of microservices architecture, integrating advanced technologies like AI and Elasticsearch, and demonstrating proficiency in DevOps practices. Responsibilities: Develop and maintain backend microservices for user authentication (JWT), course management, and AI-powered recommendations. Design and implement efficient data storage solutions using MongoDB. Leverage Elasticsearch for advanced full-text search capabilities. Optimize application performance through Redis caching for frequently accessed data. Build responsive and intuitive user interfaces using React.js, integrating seamlessly with backend APIs. Demonstrate strong understanding and practical application of state management (e.g., Redux, Zustand, React Context API) and client-side caching. Implement and manage CI/CD pipelines (e.g., GitHub Actions, Jenkins) for automated builds, tests, and deployments. Containerize applications using Docker and understand deployment strategies on Linux environments (e.g., PM2, Nginx). Propose and implement solutions for inter-service communication and data streaming using technologies like Apache Kafka. Contribute to the entire software development lifecycle, including conceptualizing, designing, developing, testing, and deploying. Write clean, well-documented, and maintainable code with a focus on best practices and error handling. Qualifications: Minimum 5 years' experience as a MERN Stack Developer. Strong proficiency in JavaScript/TypeScript, Next.js, Node.js, Express.js, React.js, and MongoDB. Experience with microservices architecture and API design. Hands-on experience with Elasticsearch and Redis.
Familiarity with AI integration, specifically with services like Gemini AI. Proficiency in setting up and managing CI/CD pipelines. Experience with Docker for containerization. Understanding of Linux server deployment considerations. Conceptual understanding and practical application of message brokers like Apache Kafka. Solid understanding of state management libraries (e.g., Redux) and client-side caching strategies. Excellent problem-solving skills and attention to detail. Ability to work independently and as part of a collaborative team. Working Days and Office Hours: Monday to Saturday, 10 AM to 6:30 PM. Why Join Us: If you are obsessed with building secure, scalable, and robust backend solutions with a keen eye for frontend design, UI, and world-class user experience, then we are excited to hear from you. We offer a fast-paced work environment and excellent career-growth opportunities for passionate and dedicated candidates. Follow us to keep updated on current and upcoming jobs: - https://www.linkedin.com/company/helpstudyabroad-dot-com/ - https://www.instagram.com/helpstudyabroad.co/ - https://www.youtube.com/@HelpStudyAbroad Job Types: Full-time, Permanent Pay: ₹540,000.00 - ₹720,000.00 per year Benefits: Health insurance Paid time off Work Location: In person
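The Redis-caching responsibility above amounts to storing frequently accessed data with a time-to-live (TTL) so stale entries expire. A small illustrative sketch in Python (the cache key and TTL are invented; production code would call Redis with its EX/TTL option rather than roll its own store — the clock is injected here so expiry can be demonstrated deterministically):

```python
# TTL cache sketch: values expire after a fixed time-to-live, the core
# idea behind Redis-style caching of frequently accessed data.
import time

class TTLCache:
    def __init__(self, ttl_seconds, clock=time.monotonic):
        self.ttl = ttl_seconds
        self.clock = clock      # injectable clock for deterministic tests
        self._store = {}

    def set(self, key, value):
        self._store[key] = (value, self.clock() + self.ttl)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if self.clock() >= expires_at:
            del self._store[key]   # lazily evict expired entries
            return None
        return value

# Deterministic usage with a fake clock:
now = [0.0]
cache = TTLCache(ttl_seconds=60, clock=lambda: now[0])
cache.set("course:42", {"title": "Intro"})
fresh = cache.get("course:42")   # within the TTL: cache hit
now[0] = 61.0                    # advance past the TTL
stale = cache.get("course:42")   # expired: cache miss
```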

Posted 2 days ago

Apply

3.0 years

5 - 7 Lacs

Noida

On-site

Senior Java Developer (3-4 years' experience). Applicants must be from Delhi/NCR only. Advanced Technical Requirements Core Technologies (Must Have) Java: 3-4 years with Java 11+ (Java 21 LTS preferred) Spring Ecosystem: Advanced Spring Boot, Spring Cloud, Spring Security Microservices: Service discovery, API Gateway, distributed systems Database: Advanced PostgreSQL, query optimization, indexing strategies Message Queues: Apache Kafka, event-driven architecture Caching: Redis cluster, distributed caching patterns Spring Cloud Stack Eureka: Service discovery and registration Spring Cloud Gateway: Routing, filtering, load balancing Config Server: Centralized configuration management Circuit Breaker: Resilience4j for fault tolerance Sleuth + Zipkin: Distributed tracing Job Type: Full-time Pay: ₹45,000.00 - ₹60,000.00 per month Work Location: In person Speak with the employer: +91 8800602148
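The circuit-breaker item above (handled by Resilience4j on the JVM) is language-agnostic in concept: after a threshold of consecutive failures, the breaker "opens" and short-circuits further calls instead of hammering an unhealthy downstream service. A simplified Python sketch, with the threshold and error types invented for illustration (a real breaker, Resilience4j included, also adds a half-open state and timed recovery, omitted here):

```python
# Minimal circuit-breaker sketch: count consecutive failures and, once a
# threshold is hit, fail fast instead of calling the downstream service.
class CircuitOpenError(Exception):
    pass

class CircuitBreaker:
    def __init__(self, failure_threshold=3):
        self.failure_threshold = failure_threshold
        self.failures = 0
        self.open = False

    def call(self, fn, *args):
        if self.open:
            raise CircuitOpenError("short-circuited: downstream marked unhealthy")
        try:
            result = fn(*args)
        except Exception:
            self.failures += 1
            if self.failures >= self.failure_threshold:
                self.open = True   # trip: stop calling the failing service
            raise
        self.failures = 0          # a success resets the failure count
        return result

def flaky():
    raise RuntimeError("downstream timeout")

breaker = CircuitBreaker(failure_threshold=3)
outcomes = []
for _ in range(5):
    try:
        breaker.call(flaky)
        outcomes.append("ok")
    except CircuitOpenError:
        outcomes.append("open")
    except RuntimeError:
        outcomes.append("fail")
```

The first three calls fail through to the downstream; from the fourth onward the breaker answers immediately with the open-circuit error.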

Posted 2 days ago

Apply

3.0 - 8.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Job Summary: We are seeking a skilled and experienced QA Engineer with a strong technical background in networking, automation, API testing, and performance testing. The ideal candidate will have proficiency in Postman API testing, Java programming, and testing frameworks like JMeter, Selenium, REST Assured, and Robot Framework. Familiarity with network architecture, including ORAN, SMO, RIC, and OSS/BSS, is a plus. Key Responsibilities: Perform functional, performance, and load testing of web applications using tools such as JMeter and Postman. Develop, maintain, and execute automated test scripts using Selenium with Java for web application testing. Design and implement tests for RESTful APIs using REST Assured (Java library) for testing HTTP responses and ensuring proper API functionality. Collaborate with development teams to identify and resolve software defects through effective debugging and testing. Utilize the Robot Framework with Python for acceptance testing and acceptance test-driven development. Conduct end-to-end testing and ensure that systems meet all functional requirements. Ensure quality and compliance of software releases by conducting thorough test cases and evaluating product quality. Required Skill Set: Experience range: 3 to 8 years. Postman API Testing: Experience in testing RESTful APIs and web services using Postman. Java: Strong knowledge of Java for test script development, particularly with Selenium and REST Assured. JMeter: Experience in performance, functional, and load testing using Apache JMeter. Selenium with Java: Expertise in Selenium WebDriver for automated functional testing, including script development and maintenance using Java. REST Assured: Proficient in using the REST Assured framework (Java library) for testing REST APIs and validating HTTP responses. Robot Framework: Hands-on experience with the Robot Framework for acceptance testing and test-driven development (TDD) in Python.
ORAN/SMO/RIC/OSS Architecture: In-depth knowledge of ORAN (Open Radio Access Network), SMO (Service Management Orchestration), RIC (RAN Intelligent Controller), and OSS (Operations Support Systems) architectures. Good-to-Have Skill Set: Networking Knowledge: Deep understanding of networking concepts, specifically around RAN elements and network architectures (ORAN, SMO, RIC, OSS). Monitoring Tools: Experience with Prometheus, Grafana, and Kafka for real-time monitoring and performance tracking of applications and systems. Keycloak: Familiarity with Keycloak for identity and access management.
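The API-testing duties above boil down to asserting on an HTTP response's status code, headers, and body structure, which is what Postman and REST Assured tests express. A Python sketch of that style of check, run against a stubbed response so it needs no network (the endpoint shape and field names are invented for illustration):

```python
# API response validation sketch: the same status/header/body assertions a
# Postman or REST Assured test would make, against a stubbed response.
import json

def fake_get_user(user_id):
    """Stand-in for an HTTP GET; a real test would call the API under test."""
    body = json.dumps({"id": user_id, "name": "asha", "active": True})
    return 200, {"Content-Type": "application/json"}, body

def check_user_response(status, headers, body):
    errors = []
    if status != 200:
        errors.append(f"expected 200, got {status}")
    if "application/json" not in headers.get("Content-Type", ""):
        errors.append("response is not JSON")
    payload = json.loads(body)
    for field in ("id", "name", "active"):
        if field not in payload:
            errors.append(f"missing field: {field}")
    return errors

errors = check_user_response(*fake_get_user(7))
```

An empty error list means the response passed all three layers of validation: transport status, content type, and body schema.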

Posted 2 days ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

LiveRamp is the data collaboration platform of choice for the world’s most innovative companies. A groundbreaking leader in consumer privacy, data ethics, and foundational identity, LiveRamp is setting the new standard for building a connected customer view with unmatched clarity and context while protecting precious brand and consumer trust. LiveRamp offers complete flexibility to collaborate wherever data lives to support the widest range of data collaboration use cases—within organizations, between brands, and across its premier global network of top-quality partners. Hundreds of global innovators, from iconic consumer brands and tech giants to banks, retailers, and healthcare leaders turn to LiveRamp to build enduring brand and business value by deepening customer engagement and loyalty, activating new partnerships, and maximizing the value of their first-party data while staying on the forefront of rapidly evolving compliance and privacy requirements. LiveRamp is looking for a Staff Backend Engineer to join our team and help build the Unified Segment Builder (USB) — the next-generation, comprehensive segmentation solution for creating precise, real-time, and meaningful audiences. USB is a foundational pillar in LiveRamp’s product ecosystem. It empowers customers to create powerful audience segments using 1st, 2nd, and 3rd-party data, with support for combining, excluding, and overlapping datasets. The solution is designed for scale, performance, and usability — replacing legacy segmentation tools and delivering a unified, world-class user experience. We are also rolling out AI-powered segment building capabilities based on USB, aiming to boost efficiency and expand the use cases beyond traditional campaign planners. You Will Collaborate with APAC engineers, and partner closely with US-based product and UX teams. Design and implement scalable backend systems, APIs, and infrastructure powering the USB and other core LiveRamp products. 
Lead cross-functional technical discussions, drive architectural decisions, and evangelize engineering best practices across teams. Mentor engineers and contribute to the technical leadership of the local team. Ensure operational excellence by building reliable, observable, and maintainable production systems. Help rearchitect our existing systems to provide a more powerful and flexible data processing environment at scale. Your Team Will Design, build, and scale USB and related segment-building products critical to LiveRamp’s success. Collaborate with engineering, product, DevOps, SRE, and QA teams to deliver new features and improvements. Build systems that integrate with the broader LiveRamp Data Collaboration Platform. Continuously improve quality, performance, and developer experience for internal tools and services. About You: 8+ years of experience writing and deploying production-grade backend code. Strong programming skills in Java, Python, Kotlin, or Go. 3+ years of experience working with big data technologies such as Apache Spark, Hadoop/MapReduce, and Kafka. Extensive experience with containerization and orchestration technologies, including Docker and Kubernetes, for building and managing scalable, reliable services. Proven experience designing and delivering large-scale distributed systems in production environments. Strong track record of contributing to or leading architectural efforts for complex systems. Hands-on experience with cloud platforms, ideally GCP (AWS or Azure also acceptable). Proficiency with Spring Boot and modern backend frameworks. Experience working with distributed databases (e.g., SingleStore, ClickHouse, etc.). Bonus Points: Familiarity with building AI-enabled applications, especially those involving LLMs or generative AI workflows. Experience with LangChain or LangGraph frameworks for orchestrating multi-step AI agents is a strong plus.
Benefits Flexible paid time off, paid holidays, options for working from home, and paid parental leave. Comprehensive Benefits Package: LiveRamp offers a comprehensive benefits package designed to help you be your best self in your personal and professional lives. Our benefits package offers medical, dental, vision, accident, life and disability, an employee assistance program, voluntary benefits as well as perks programs for your healthy lifestyle, career growth, and more. Your medical benefits extend to your dependents including parents. More About Us LiveRamp’s mission is to connect data in ways that matter, and doing so starts with our people. We know that inspired teams enlist people from a blend of backgrounds and experiences. And we know that individuals do their best when they not only bring their full selves to work but feel like they truly belong. Connecting LiveRampers to new ideas and one another is one of our guiding principles—one that informs how we hire, train, and grow our global team across nine countries and four continents. Click here to learn more about Diversity, Inclusion, & Belonging (DIB) at LiveRamp. To all recruitment agencies : LiveRamp does not accept agency resumes. Please do not forward resumes to our jobs alias, LiveRamp employees or any other company location. LiveRamp is not responsible for any fees related to unsolicited resumes.

Posted 2 days ago

Apply

1.0 years

1 - 3 Lacs

India

On-site

About the Role Technobot System is looking for a passionate Full Stack Developer who is either a fresher or has up to 1 year of experience . If you’re someone who’s eager to grow, explore both front-end and back-end development, and work with modern technologies in a fast-paced IT environment — we want to hear from you! Key Responsibilities Design and develop responsive web applications Develop front-end interfaces using HTML, CSS, JavaScript, and modern frameworks like React.js, Angular, or Vue.js Implement back-end logic using Node.js Work with databases like Firebase, MySQL or MongoDB Handle WordPress back-end development using MySQL and WP Admin tools Collaborate with UI/UX designers and other team members Optimize application performance, security, and scalability Work on REST APIs and third-party integrations Deploy applications on servers (Apache/Nginx) and use version control (Git) Follow DevOps practices including CI/CD and cloud services (AWS, GCP, Azure) Test, troubleshoot, and debug code to ensure high performance Requirements Completed graduation in BCA / MCA / B.E. / B.Tech or other IT-related courses Strong understanding of HTML, CSS, JavaScript Familiarity with React.js and Node.js is mandatory Knowledge of WordPress and MySQL setup is a plus Exposure to Git , APIs , and cloud platforms is beneficial Strong problem-solving and communication skills Eagerness to learn and adapt to new technologies Who Can Apply? Freshers or 0–1 year experienced candidates Must have basic knowledge of both front-end and back-end development Candidates from IT-related educational backgrounds How to Apply? Apply directly through Indeed or send your resume to: hr@technobotsystem.com +91 97145 27826 Job Types: Full-time, Permanent, Fresher Pay: ₹15,000.00 - ₹30,000.00 per month Benefits: Flexible schedule Schedule: Day shift Work Location: In person

Posted 2 days ago

Apply

7.0 years

0 Lacs

India

Remote

Role: Neo4j Engineer Overall IT Experience: 7+ years Relevant experience: (Graph Databases: 4+ years, Neo4j: 2+ years) Location: Remote Company Description Bluetick Consultants is a technology-driven firm that supports hiring remote developers, building technology products, and enabling end-to-end digital transformation. With previous experience in top technology companies such as Amazon, Microsoft, and Craftsvilla, we understand the needs of our clients and provide customized solutions. Our team has expertise in emerging technologies, backend and frontend development, cloud development, and mobile technologies. We prioritize staying up-to-date with the latest technological advances to create a long-term impact and grow together with our clients. Key Responsibilities • Graph Database Architecture: Design and implement Neo4j graph database schemas optimized for fund administration data relationships and AI-powered queries • Knowledge Graph Development: Build comprehensive knowledge graphs connecting entities like funds, investors, companies, transactions, legal documents, and market data • Graph-AI Integration: Integrate Neo4j with AI/ML pipelines, particularly for enhanced RAG (Retrieval-Augmented Generation) systems and semantic search capabilities • Complex Relationship Modeling: Model intricate relationships between Limited Partners, General Partners, fund structures, investment flows, and regulatory requirements • Query Optimization: Develop high-performance Cypher queries for real-time analytics, relationship discovery, and pattern recognition • Data Pipeline Integration: Build ETL processes to populate and maintain graph databases from various data sources including FundPanel.io, legal documents, and external market data using domain specific ontologies • Graph Analytics: Implement graph algorithms for fraud detection, risk assessment, relationship scoring, and investment opportunity identification • Performance Tuning: Optimize graph database performance for 
concurrent users and complex analytical queries • Documentation & Standards: Establish graph modelling standards, query optimization guidelines, and comprehensive technical documentation Key Use Cases You'll Enable • Semantic Search Enhancement: Create knowledge graphs that improve AI search accuracy by understanding entity relationships and context • Investment Network Analysis: Map complex relationships between investors, funds, portfolio companies, and market segments • Compliance Graph Modelling: Model regulatory relationships and fund terms to support automated auditing and compliance validation • Customer Relationship Intelligence: Build relationship graphs for customer relations monitoring and expansion opportunity identification • Predictive Modelling Support: Provide graph-based features for investment prediction and risk assessment models • Document Relationship Mapping: Connect legal documents, contracts, and agreements through entity and relationship extraction Required Qualifications • Bachelor's degree in Computer Science, Data Engineering, or related field • 7+ years of overall IT Experience • 4+ years of experience with graph databases, with 2+ years specifically in Neo4j • Strong background in data modelling, particularly for complex relationship structures • Experience with financial services data and regulatory requirements preferred • Proven experience integrating graph databases with AI/ML systems • Understanding of knowledge graph concepts and semantic technologies • Experience with high-volume, production-scale graph database implementations Technology Skills • Graph Databases: Neo4j (primary), Cypher query language, APOC procedures, Neo4j Graph Data Science library • Programming: Python, Java, or Scala for graph data processing and integration • AI Integration: Experience with graph-enhanced RAG systems, vector embeddings in graph context, GraphRAG implementations • Data Processing: ETL pipelines, data transformation, real-time data 
streaming (Kafka, Apache Spark) • Cloud Platforms: Neo4j Aura, Azure integration, containerized deployments • APIs: Neo4j drivers, REST APIs, GraphQL integration • Analytics: Graph algorithms (PageRank, community detection, shortest path, centrality measures) • Monitoring: Neo4j monitoring tools, performance profiling, query optimization • Integration: Elasticsearch integration, vector database connections, multi-modal data handling Specific Technical Requirements • Knowledge Graph Construction: Entity resolution, relationship extraction, ontology modelling • Cypher Expertise: Advanced Cypher queries, stored procedures, custom functions • Scalability: Clustering, sharding, horizontal scaling strategies • Security: Graph-level security, role-based access control, data encryption • Version Control: Graph schema versioning, migration strategies • Backup & Recovery: Graph database backup strategies, disaster recovery planning Industry Context Understanding • Fund Administration: Understanding of fund structures, capital calls, distributions, and investor relationships • Financial Compliance: Knowledge of regulatory requirements and audit trails in financial services • Investment Workflows: Understanding of due diligence processes, portfolio management, and investor reporting • Legal Document Structures: Familiarity with LPA documents, subscription agreements, and fund formation documents Collaboration Requirements • AI/ML Team: Work closely with GenAI engineers to optimize graph-based AI applications • Data Architecture Team: Collaborate on overall data architecture and integration strategies • Backend Developers: Integrate graph databases with application APIs and microservices • DevOps Team: Ensure proper deployment, monitoring, and maintenance of graph database infrastructure • Business Stakeholders: Translate business requirements into effective graph models and queries Performance Expectations • Query Performance: Ensure sub-second response times for standard 
relationship queries • Scalability: Support 100k+ users with concurrent access to graph data • Accuracy: Maintain data consistency and relationship integrity across complex fund structures • Availability: Ensure 99.9% uptime for critical graph database services • Integration Efficiency: Seamless integration with existing FundPanel.io systems and new AI services This role offers the opportunity to work at the intersection of advanced graph technology and artificial intelligence, creating innovative solutions that will transform how fund administrators understand and leverage their data relationships.
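Relationship queries like those described above (for example, Cypher's shortestPath()) can be illustrated with a breadth-first search over a tiny in-memory graph. The entities and edges below are invented stand-ins for the fund/investor relationships the role models; inside Neo4j the traversal runs natively over the stored graph rather than over a Python dict:

```python
# Shortest-path relationship query sketch over a toy fund-administration
# graph: which chain of relationships connects an LP to a company?
from collections import deque

edges = {
    "LP_A":   ["Fund_1"],         # Limited Partner invests in Fund 1
    "Fund_1": ["Co_X", "Co_Y"],   # Fund 1 holds two portfolio companies
    "Co_X":   ["Co_Z"],           # Co X acquired Co Z
    "Co_Y":   [],
    "Co_Z":   [],
}

def shortest_path(graph, start, goal):
    """Breadth-first search: returns the node list of one shortest path."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None   # no relationship chain connects the two entities

path = shortest_path(edges, "LP_A", "Co_Z")
```

In Cypher the equivalent query would be a one-liner over typed relationships; the point of the sketch is the traversal semantics, not the syntax.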

Posted 2 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.


Featured Companies