
10691 Apache Jobs - Page 10

Set up a job alert
JobPe aggregates listings for easy access; you apply directly on the original job portal.

4.0 years

18 Lacs

Calcutta

On-site

Job Title: Software Developer
Location: Sector 5, Salt Lake, Kolkata
Shift Timings: Flexible day shift or afternoon shift
Week Offs: Saturday and Sunday
Employment Type: Full time, on-site or hybrid
Industry: Telecommunication, IT and Security
Salary: Up to 18 LPA

Who We Are: Salescom Services Private Limited is a wholly owned subsidiary of a British technology business. We provide IT, security and telecommunication products and services to enterprises and SMEs. As an organization, we value people who bring a combination of talent, proactiveness and a never-say-never attitude. We equip you with the knowledge and skills that will help you develop into a productive and outstanding professional. Our expertise lies in 360-degree project management, customer success, revenue assurance, account management, billing & analytics, quality and compliance, web security and IT helpdesk in the technology and telecommunications space. We are backed by the combined experience of over two decades that the board members have in this space, operating successful ventures and acquisitions over the years. The founding members of Salescom have operated successful, widely known technology and telecommunication ventures in Australia and the United Kingdom, and in December 2019 launched their first captive unit in the heart of the IT workforce space, Sector V, Kolkata, West Bengal.

Job Overview: We are looking for an experienced Software Developer specializing in ASP.NET to build software using languages and technologies of the .NET framework. You should be a pro with third-party API integrations and user application programming journeys. In this role, you should be able to write smooth, functional code with a sharp eye for spotting defects. You should be a team player and an excellent communicator. If you are also passionate about the .NET framework and software design/architecture, we'd like to meet you. Your goal will be to work with internal teams to design, develop and maintain functional software of all kinds.

Key Responsibilities:
- Design and develop web applications: build robust and scalable web-based solutions using ASP.NET and C#; optimize database interactions using SQL and NoSQL technologies like Microsoft SQL Server, PostgreSQL, and SQLite.
- Front-end implementation: develop interactive user interfaces using modern frameworks (Blazor, React); implement responsive design using Bootstrap, HTML, CSS, JavaScript, and jQuery.
- API integration & management: integrate and maintain third-party SOAP and RESTful APIs; ensure secure and efficient data exchanges across external systems.
- Testing & quality assurance: use tools such as Jenkins to automate testing processes; write and maintain unit and integration tests for consistent performance.
- Troubleshooting & optimization: identify and resolve software bugs and performance bottlenecks; analyse prototype feedback and iterate quickly to improve solutions.
- Collaboration & communication: work closely with cross-functional teams to understand requirements; document development progress and articulate technical solutions effectively.
- Continuous improvement: stay up to date with emerging technologies and coding practices; contribute to code reviews and mentor junior developers.

Prerequisites:
- At least 4 years of software development using ASP.NET, C# and SQL/NoSQL databases (Microsoft SQL Server, PostgreSQL, SQLite, etc.)
- Experience with modern front-end frameworks (Blazor, React, etc.)
- Hands-on experience with third-party SOAP and REST API integrations
- Experience with Bootstrap, jQuery, HTML, CSS and JavaScript
- Knowledge of standard test-automation tooling such as Jenkins
- Excellent troubleshooting skills in software prototypes
- Excellent verbal and written communication skills
- BSc/BTech/BCA in Computer Science, Engineering, or a related field

Good-to-have skills:
- Knowledge of .NET MVC
- Knowledge of .NET MAUI (Xamarin)
- Experience with CRM development
- Experience working in ISP, telephony and MSP environments
- Experience with Apache HTTP Server & Nginx
- Experience with Debian and Debian-based Linux server distributions (e.g. Ubuntu)

What's in it for you: competitive salary, periodic reviews and performance-based bonuses; comprehensive health insurance coverage for self and chosen family dependents; professional development opportunities, including training and company-funded certifications; a collaborative and inclusive work environment that values diversity and creativity; café facilities; free drop services back home.

Businesses We Own & Operate: https://v4consumer.co.uk and https://v4one.co.uk

How to Apply: Interested candidates are invited to submit their resume and cover letter to hr@salescom.in in confidence. Please label "Senior Software Developer Application" in the email subject line. All candidates will be treated equally, and we will base appointment decisions on the merits of the candidates. We welcome applications from all candidates, regardless of any protected characteristic, and are an equal opportunity employer.

Posted 2 days ago

Apply

7.0 years

0 Lacs

Calcutta

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:
- Analyse current business practices, processes, and procedures, and identify future business opportunities for leveraging Microsoft Azure Data & Analytics Services.
- Provide technical leadership and thought leadership as a senior member of the Analytics Practice in areas such as data access & ingestion, data processing, data integration, data modeling, database design & implementation, data visualization, and advanced analytics.
- Engage and collaborate with customers to understand business requirements/use cases and translate them into detailed technical specifications.
- Develop best practices including reusable code, libraries, patterns, and consumable frameworks for cloud-based data warehousing and ETL.
- Maintain best-practice standards for the development of cloud-based data warehouse solutions, including naming standards.
- Design and implement highly performant data pipelines from multiple sources using Apache Spark and/or Azure Databricks (see the sketch after this listing).
- Integrate the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is always maintained.
- Work with other members of the project team to support delivery of additional project components (API interfaces).
- Evaluate the performance and applicability of multiple tools against customer requirements.
- Work within an Agile delivery / DevOps methodology to deliver proof of concept and production implementation in iterative sprints.
- Integrate Databricks with other technologies (ingestion tools, visualization tools).
Requirements:
- Proven experience working as a data engineer.
- Highly proficient in the Spark framework (Python and/or Scala).
- Extensive knowledge of data warehousing concepts, strategies and methodologies.
- Direct experience building data pipelines using Azure Data Factory and Apache Spark (preferably in Databricks).
- Hands-on experience designing and delivering solutions using Azure, including Azure Storage, Azure SQL Data Warehouse, Azure Data Lake, Azure Cosmos DB and Azure Stream Analytics.
- Experience in designing and hands-on development of cloud-based analytics solutions.
- Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required.
- Design and build of data pipelines using API ingestion and streaming ingestion methods.
- Knowledge of DevOps processes (including CI/CD) and infrastructure as code is essential.
- Thorough understanding of Azure Cloud Infrastructure offerings.
- Strong experience in common data warehouse modelling principles, including Kimball.
- Working knowledge of Python is desirable.
- Experience developing security models.
- Databricks & Azure Big Data Architecture certification would be a plus.
- Must be team oriented with strong collaboration, prioritization, and adaptability skills.

Mandatory skill sets: Azure Databricks
Preferred skill sets: Azure Databricks
Years of experience required: 7-10 years
Education qualification: BE, B.Tech, MCA, M.Tech
Degrees/Field of Study required: Bachelor of Technology, Bachelor of Engineering
Required Skills: Databricks Platform
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 32 more}
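The Databricks/Spark pipeline work this posting describes can be pictured with a short, hedged sketch: a PySpark job that reads raw files, applies basic quality rules, and writes a curated Delta table. The storage paths, column names and quality rules below are hypothetical placeholders, not anything taken from the posting.

```python
# Minimal sketch of a raw-to-curated PySpark pipeline of the kind the
# posting describes. Paths, schema and rules are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("customer_orders_curation").getOrCreate()

raw = (spark.read
       .option("header", "true")
       .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/"))  # hypothetical path

curated = (raw
           .withColumn("order_ts", F.to_timestamp("order_ts"))  # normalize timestamps
           .filter(F.col("order_id").isNotNull())               # basic quality gate
           .dropDuplicates(["order_id"]))

(curated.write
 .mode("overwrite")
 .format("delta")                                               # Databricks Delta target
 .save("abfss://curated@examplelake.dfs.core.windows.net/orders/"))
```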

Posted 2 days ago

Apply

8.0 years

0 Lacs

India

Remote

Job Description:
Job Title: Senior Java Backend Developer
Shift: EST (3:00 PM to 12:30 AM)
Location: Remote
Duration: Contract

We are seeking a highly experienced Senior Java Backend Developer to join our remote team. The ideal candidate will have strong expertise in building scalable microservices using modern Java technologies, Spring frameworks, and cloud-native practices. This role requires a deep understanding of enterprise-grade application development, CI/CD, and DevOps collaboration.

Key Responsibilities:
- Design, develop, and maintain microservices using Java 17+, Spring Boot, and Spring Cloud.
- Develop RESTful APIs and implement asynchronous messaging with Apache Kafka or RabbitMQ (a Python sketch of this pattern follows below).
- Integrate databases such as PostgreSQL and SQL Server using Spring Data JPA, Hibernate, and Flyway.
- Implement secure authentication and authorization using Spring Security, OAuth2, and Okta.
- Optimize performance using Redis for caching and session management.
- Deploy and manage services in Azure Kubernetes Service (AKS) using Docker and Helm.
- Ensure application observability with the ELK Stack (Elasticsearch, Logstash, Kibana), Prometheus, and Grafana.
- Write and maintain unit, integration, and contract tests using JUnit 5, Mockito, Testcontainers, and Spring Test.
- Follow best practices in CI/CD using tools like GitHub Actions, Azure DevOps, or Jenkins.
- Participate in code reviews, architectural discussions, and performance optimizations.
- Collaborate with DevOps and Site Reliability Engineering (SRE) teams to ensure system reliability.
- Document APIs using OpenAPI/Swagger and maintain technical documentation.

Required Skills:
- 8+ years of backend development experience with Java (Java 11 or newer preferred).
- Strong hands-on experience with Spring Boot, Spring Cloud, Spring Security, Spring Data JPA, and Hibernate.
- Expertise in REST APIs, OAuth2, JWT, and Okta integration.
- Experience with PostgreSQL, SQL Server, Redis, and Flyway.
- Familiarity with messaging systems like Kafka or RabbitMQ.
- Hands-on experience with Docker, Kubernetes, and Helm, ideally in Azure Cloud.
- Proficiency with monitoring and observability tools including ELK, Prometheus, and Grafana.
- Strong unit and integration testing skills with JUnit 5, Mockito, and contract testing.
- Solid experience with CI/CD pipelines and source control using Git.

If you are interested, please share your resume at sadiya.mankar@leanitcorp.com
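The messaging responsibility above is Java/Spring-centric in the posting itself; purely as an illustrative sketch of the same produce/consume pattern, and in Python for consistency with the other sketches on this page, here is a minimal example using the kafka-python client. The broker address, topic name and consumer group are hypothetical.

```python
# Minimal sketch of Kafka produce/consume with kafka-python.
# Broker, topic and group id are hypothetical placeholders.
import json
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("order-events", {"orderId": 42, "status": "CREATED"})
producer.flush()                               # make sure the event is on the wire

consumer = KafkaConsumer(
    "order-events",
    bootstrap_servers="localhost:9092",
    group_id="billing-service",                # one consumer group per microservice
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
    consumer_timeout_ms=10000,                 # stop iterating if the topic goes quiet
)
for message in consumer:
    print(message.value)                       # hand off to business logic here
```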

Posted 2 days ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

We are looking to hire an immediate joiner as Backend Developer or Sr. Backend Developer (MongoDB, Node.js) for a fast-growing tech logistics startup that is creating the future of the domain by leveraging deep industry insights and forecasting the forces of disruption that will shape customers' needs. You will work as a Backend Developer (MongoDB, Node.js) focused on building next-generation logistics solutions, working on new-age technologies, and supporting customers across multiple geographies.

Responsibilities
● Develop enterprise-level software solutions according to technical specifications.
● Design and implement server-side applications using Node.js, Express, and MongoDB.
● Optimize server-side code for maximum performance and scalability.
● Identify and troubleshoot performance and security issues in the backend infrastructure.
● Implement data storage solutions using MongoDB and ensure proper database integration.
● Analyse the work and create tasks and sub-tasks with time estimates.
● Write unit tests to meet code-coverage targets.
● Create, plan, and manage tasks in JIRA.
● Plan, manage and control code quality, and perform code reviews.

Criteria of the Role
● Experience developing/architecting backend applications with Node.js and MongoDB.
● 3 to 5+ years of work experience.
● Ability to effectively manage time and prioritize work.
● Proficiency in GitOps.
● Knowledge of building and securing REST APIs is compulsory.
● Knowledge of GraphQL and Redis is a plus.
● Knowledge of web servers like Apache and NGINX is a plus.
● Knowledge of CI/CD configuration is a plus.
● Strong oral and written communication skills, including technical documentation.

Competency
● Should be a university graduate.
● Strong technical, problem-solving and analytical skills.
● Excellent at clear and concise written and verbal communication, including technical documentation.

Basic Qualification: BE/B.Tech/MCA/BCA or a bachelor's degree in a related subject

Posted 2 days ago

Apply

4.0 years

4 - 9 Lacs

Indore

On-site

Become a part of Belgium Webnet, where work and fun go hand in hand. Belgium Webnet is looking for a Full Stack Developer to join our Indore office: an exceptionally good full-stack product developer with experience in MVC, CI, Angular, React JS and Node JS with MongoDB & MySQL databases. We have been providing technical services to our USA clients, delivering a powerful and flexible e-commerce website platform, for the last 4 years. We have also been engaged in wholesale diamond trading since 1998, and we are located in the heart of New York City's famed Diamond District on 47th Street and in India's cleanest city, Indore, Madhya Pradesh.

Job Location: Indore
Salary: As per company standards (between Rs. 4,00,000 and 9,00,000 p.a.)
Experience: Minimum 2 to 6 years of experience required

Job Summary
- Knowledge of PHP, CI, HTML/CSS and jQuery, and JavaScript frameworks such as Angular.js, Node.js, OpenCart, Ajax.
- Good proficiency in database modeling & design (MySQL and SQL are a must; MongoDB is an added advantage), web servers (e.g. Apache) and UI/UX design.
- Expertise in developing e-commerce websites.
- Data migration, transformation, and scripting.
- Integrate data from various back-end services and databases.
- Expertise in developing REST APIs with any back-end framework.
- Exposure to AWS services like S3, CloudFront, CloudWatch, Lambda & API Gateway.
- Familiarity with the whole web stack, including protocols and web server optimization techniques.
- Highly motivated, with experience in Java/NodeJS/Python-based microservices/backends and AngularJS-based frontends.
- Expertise in handling payment systems, especially payment gateway integrations with PayPal, Stripe/Braintree, is a plus.
- Server-side languages like PHP, Python, Ruby, Java, JavaScript, and .NET.
- Good understanding of MVC design patterns and frameworks.
- Proficient in web services: REST / SOAP / XML.
- Experience with third-party APIs like Google, Facebook, Twitter.
- Strong debugging skills and the ability to understand and work on existing code.
- Understanding of client requirements & functional specifications.
- Good logical problem solving.
- Good written/verbal communication skills in English.
- Great problem-solving skills and the ability to abstract functional requirements.

Skills Required: e-commerce, website development, PHP, CI, Angular, Node, React, HTML, CSS, jQuery, JavaScript, MySQL, MongoDB, AWS and Google Cloud Platform, team lead, full stack developer.
Job Type: Full time
Job Location: Indore
Experience: 2 to 6 years

Posted 2 days ago

Apply

2.0 years

3 - 10 Lacs

India

Remote

Job Title - Sr. Data Engineer
Experience - 2+ years
Location - Indore (onsite)
Industry - IT
Job Type - Full time

Roles and Responsibilities:
1. Design and develop scalable data pipelines and workflows for data ingestion, transformation, and integration.
2. Build and maintain data storage systems, including data warehouses, data lakes, and relational databases.
3. Ensure data accuracy, integrity, and consistency through validation and quality assurance processes.
4. Collaborate with data scientists, analysts, and business teams to understand data needs and deliver tailored solutions.
5. Optimize database performance and manage large-scale datasets for efficient processing.
6. Leverage cloud platforms (AWS, Azure, or GCP) and big data technologies (Hadoop, Spark, Kafka) for building robust data solutions.
7. Automate and monitor data workflows using orchestration frameworks such as Apache Airflow (see the sketch below).
8. Implement and enforce data governance policies to ensure compliance and data security.
9. Troubleshoot and resolve data-related issues to maintain seamless operations.
10. Stay updated on emerging tools, technologies, and trends in data engineering.

Skills and Knowledge:
1. Core skills:
● Proficient in Python (libraries: Pandas, NumPy) and SQL.
● Knowledge of data modeling techniques, including entity-relationship (ER) diagrams, dimensional modeling, and data normalization.
● Familiarity with ETL processes and tools like Azure Data Factory (ADF) and SSIS (SQL Server Integration Services).
2. Cloud expertise:
● AWS services: Glue, Redshift, Lambda, EKS, RDS, Athena.
● Azure services: Databricks, Key Vault, ADLS Gen2, ADF, Azure SQL.
● Snowflake.
3. Big data and workflow automation:
● Hands-on experience with big data technologies like Hadoop, Spark, and Kafka.
● Experience with workflow automation tools like Apache Airflow (or similar).

Qualifications and Requirements:
● Education: Bachelor's degree (or equivalent) in Computer Science, Information Technology, Engineering, or a related field.
● Experience: Freshers with a strong understanding, internships and relevant academic projects are welcome; 2+ years of experience working with Python, SQL, and data integration or visualization tools is preferred.
● Other skills: strong communication skills, especially the ability to explain technical concepts to non-technical stakeholders; ability to work in a dynamic, research-oriented team with concurrent projects.

Job Types: Full-time, Permanent
Pay: ₹300,000.00 - ₹1,000,000.00 per year
Benefits: Paid sick time, Provident Fund, Work from home
Schedule: Day shift, Monday to Friday, weekend availability
Supplemental Pay: Performance bonus
Ability to commute/relocate: Niranjanpur, Indore, Madhya Pradesh: Reliably commute or planning to relocate before starting work (Preferred)
Experience: Data Engineer: 2 years (Preferred)
Work Location: In person
Application Deadline: 31/08/2025
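As referenced in responsibility 7, here is a minimal sketch of the kind of Airflow orchestration the posting mentions: a daily DAG with an ingest task feeding a transform task. It assumes Airflow 2.4+ (for the schedule parameter); the DAG id and task bodies are hypothetical placeholders.

```python
# Minimal Airflow DAG sketch: ingest -> transform, run daily.
# DAG id, schedule and task bodies are hypothetical.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    print("pull raw data from source systems")   # e.g. API or S3 extract

def transform():
    print("clean and load into the warehouse")   # e.g. Pandas/SQL step

with DAG(
    dag_id="daily_customer_etl",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    ingest_task >> transform_task                # ingest runs before transform
```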

Posted 2 days ago

Apply

0.0 - 2.0 years

1 - 5 Lacs

India

Remote

Job Title: Jr. Data Engineer
Location: Indore (onsite)
Experience: 0–2 years
Industry: Information Technology
Employment Type: Full-time

Job Summary: We are looking for a motivated and detail-oriented Junior Data Engineer to join our team onsite in Indore. The ideal candidate should have a solid understanding of Python and SQL, with a passion for data processing, transformation, and analytics. Strong communication skills, confidence, and the ability to learn quickly are key to success in this role.

Key Responsibilities:
- Assist in designing, developing, and maintaining ETL pipelines and data workflows (a minimal example follows this listing).
- Work with structured and unstructured data using Python and SQL.
- Support data collection, cleansing, transformation, and validation activities.
- Collaborate with data scientists, analysts, and software engineers to support data needs.
- Troubleshoot data-related issues and ensure high data quality and integrity.
- Create and maintain documentation for data pipelines and workflows.
- Continuously improve data engineering processes and performance.

Key Requirements:
- 0–2 years of experience in a data engineering or related role.
- Good knowledge of Python and SQL is a must.
- Familiarity with databases like MySQL, PostgreSQL, or SQL Server.
- Understanding of data structures, algorithms, and basic ETL concepts.
- Strong analytical, problem-solving, and communication skills.
- Ability to work independently and collaboratively in a fast-paced environment.
- Self-motivated, confident, and eager to learn new technologies.

Nice to Have:
- Exposure to cloud platforms like AWS, Azure, or GCP.
- Experience with data visualization tools like Power BI, Tableau, or Excel dashboards.
- Basic understanding of data warehousing, big data, or streaming technologies.
- Familiarity with tools like Airflow, Apache Spark, or Pandas.

Perks & Benefits:
- Competitive salary with growth opportunities.
- Mentorship from experienced data professionals.
- Hands-on experience in real-world projects.
- Onsite work in a collaborative office environment.
- Performance-based incentives and learning support.

Job Types: Full-time, Permanent
Pay: ₹180,000.00 - ₹500,000.00 per year
Benefits: Paid sick time, Provident Fund, Work from home
Schedule: Day shift, Monday to Friday, weekend availability
Supplemental Pay: Performance bonus
Ability to commute/relocate: Niranjanpur, Indore, Madhya Pradesh: Reliably commute or planning to relocate before starting work (Preferred)
Experience: Data Engineer: 1 year (Preferred)
Work Location: In person
Application Deadline: 30/08/2025
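A minimal sketch of the cleanse-and-load work described above, assuming Pandas and SQLAlchemy with a PostgreSQL driver installed; the CSV file, column names, table name and connection string are all hypothetical.

```python
# Minimal cleanse-and-load ETL sketch with Pandas + SQLAlchemy.
# File, columns, table and connection string are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

df = pd.read_csv("customers_raw.csv")

df = (df
      .dropna(subset=["customer_id"])                  # drop rows missing the key
      .drop_duplicates(subset=["customer_id"])
      .assign(email=lambda d: d["email"].str.lower().str.strip()))

engine = create_engine("postgresql://user:pass@localhost:5432/analytics")
df.to_sql("customers_clean", engine, if_exists="replace", index=False)
```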

Posted 2 days ago

Apply

2.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Description: L1 Support – Data Engineering (Full Time, WFO)
Location: Noida
Work Mode: Noida office | 6 days/week | 24x7x365 support (rotational shifts)
Salary Range: INR 2.5 to 3 lacs per annum
Experience: 2 years
Language: English proficiency mandatory

About the Role: We're looking for an experienced and motivated L1 Support Engineer – Data Engineering to join our growing team. If you have solid exposure to AWS, SQL, and Python scripting, and you're ready to thrive in a 24x7 support environment, this role is for you!

What You'll Do:
- Monitor and support AWS services (S3, EC2, CloudWatch, IAM); a sketch of this kind of scripted check follows below.
- Handle SQL-based issue resolution and data analysis.
- Run and maintain Python scripts; shell scripting is a plus.
- Support ETL pipelines and data workflows.
- Monitor Apache Airflow DAGs and resolve basic issues.
- Collaborate with cross-functional and multicultural teams.

What We're Looking For:
- B.Tech or MCA preferred, but candidates with a bachelor's degree in any field and the right skillset are welcome to apply.
- 2 years of data engineering support or similar experience.
- Strong skills in AWS, SQL, Python, and ETL processes.
- Familiarity with data warehousing (Amazon Redshift or similar).
- Ability to work rotational shifts in a 6-day, 24x7 environment.
- Excellent communication and problem-solving skills.
- English fluency is required.
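As flagged in the responsibilities, routine scripted AWS checks are typical L1 work; here is a minimal, hedged sketch using boto3. The bucket, prefix, instance id and alerting behaviour are hypothetical placeholders.

```python
# Minimal L1-style AWS check: did the daily S3 drop arrive, and what is
# recent CPU load? Bucket, prefix and instance id are hypothetical.
from datetime import datetime, timedelta
import boto3

s3 = boto3.client("s3")
resp = s3.list_objects_v2(Bucket="example-data-drop", Prefix="daily/2025-08-01/")
if resp.get("KeyCount", 0) == 0:
    print("ALERT: expected files are missing")        # would page the on-call here

cloudwatch = boto3.client("cloudwatch")
stats = cloudwatch.get_metric_statistics(
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    StartTime=datetime.utcnow() - timedelta(hours=1),
    EndTime=datetime.utcnow(),
    Period=300,
    Statistics=["Average"],
)
print(stats["Datapoints"])                            # eyeball or threshold-check
```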

Posted 2 days ago

Apply

0.0 years

0 Lacs

Goregaon, Maharashtra, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Risk
Management Level: Associate

Job Description & Summary: A career within Internal Audit services will provide you with an opportunity to gain an understanding of an organisation's objectives, regulatory and risk management environment, and the diverse needs of its critical stakeholders. We focus on helping organisations look deeper and see further, considering areas like culture and behaviours to help improve and embed controls. In short, we seek to address the right risks and ultimately add value to the organisation.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:

Architecture design:
- Design and implement scalable, secure, and high-performance architectures for Generative AI applications.
- Integrate Generative AI models into existing platforms, ensuring compatibility and performance optimization.

Model development and deployment:
- Fine-tune pre-trained generative models for domain-specific use cases.
- Define the data collection, sanitization and data preparation strategy for model fine-tuning.
- Well versed in machine learning approaches such as supervised, unsupervised and reinforcement learning, and deep learning.
- Well versed in ML models such as linear regression, decision trees, gradient boosting, random forests and k-means.
- Evaluate, select, and deploy appropriate Generative AI frameworks (e.g., PyTorch, TensorFlow, Crew AI, AutoGen, LangGraph, agentic code, agent flows).

Innovation and strategy:
- Stay up to date with the latest advancements in Generative AI and recommend innovative applications to solve complex business problems.
- Define and execute the AI strategy roadmap, identifying key opportunities for AI transformation.
- Good exposure to agentic design patterns.

Collaboration and leadership:
- Collaborate with cross-functional teams, including data scientists, engineers, and business stakeholders.
- Mentor and guide team members on AI/ML best practices and architectural decisions.
- Should be able to lead a team of data scientists, GenAI engineers and software developers.

Performance optimization:
- Monitor the performance of deployed AI models and systems, ensuring robustness and accuracy.
- Optimize computational costs and infrastructure utilization for large-scale deployments.
Ethical and responsible AI:
- Ensure compliance with ethical AI practices, data privacy regulations, and governance frameworks.
- Implement safeguards to mitigate bias, misuse, and unintended consequences of Generative AI.

Mandatory skill sets:
- Advanced programming skills in Python and fluency in data processing frameworks like Apache Spark (a sketch follows this listing).
- Experience with machine learning and artificial intelligence frameworks, models and libraries (TensorFlow, PyTorch, Scikit-learn, etc.).
- Strong knowledge of foundation LLMs (OpenAI GPT-4o, o1, Claude, Gemini, etc.) as well as open-source models such as Llama 3.2 and Phi.
- Proven track record with event-driven architectures and real-time data processing systems.
- Familiarity with Azure DevOps and other LLMOps tools for operationalizing AI workflows.
- Deep experience with Azure OpenAI Service and vector DBs, including API integrations, prompt engineering, and model fine-tuning, or the equivalent technology in AWS/GCP.
- Knowledge of containerization technologies such as Kubernetes and Docker.
- Comprehensive understanding of data lakes and strategies for data management.
- Expertise in LLM frameworks including LangChain, LlamaIndex, and Semantic Kernel.
- Proficiency in cloud computing platforms such as Azure or AWS.
- Exceptional leadership, problem-solving, and analytical abilities.
- Superior communication and collaboration skills, with experience managing high-performing teams.
- Ability to operate effectively in a dynamic, fast-paced environment.

Preferred skill sets:
- Experience with additional technologies such as Datadog and Splunk.
- Programming languages like C#, R, Scala.
- Possession of relevant solution architecture certificates and continuous professional development in data engineering and GenAI.

Years of experience required: 0-1 years
Education qualification: BE / B.Tech / MCA / M.Sc / M.E / M.Tech
Degrees/Field of Study required: Bachelor in Business Administration, Master of Business Administration, Bachelor of Engineering
Required Skills: Java
Optional Skills: Accepting Feedback, Accounting and Financial Reporting Standards, Active Listening, Artificial Intelligence (AI) Platform, Auditing, Auditing Methodologies, Business Process Improvement, Communication, Compliance Auditing, Corporate Governance, Data Analysis and Interpretation, Data Ingestion, Data Modeling, Data Quality, Data Security, Data Transformation, Data Visualization, Emotional Regulation, Empathy, Financial Accounting, Financial Audit, Financial Reporting, Financial Statement Analysis, Generally Accepted Accounting Principles (GAAP) {+ 19 more}
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
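As flagged in the mandatory skills, the posting pairs Python with Apache Spark for data preparation; here is a minimal, hedged sketch of a fine-tuning data-preparation step in PySpark. The input path, column names and the crude phone-number redaction rule are hypothetical, illustrative choices only.

```python
# Minimal sketch of sanitizing raw dialogues into prompt/completion
# pairs for fine-tuning. Paths, columns and the PII rule are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("finetune_data_prep").getOrCreate()

convs = spark.read.json("s3://example-bucket/support_chats/")   # raw dialogues

prepared = (convs
    .filter(F.length("user_text") > 0)                          # drop empty turns
    .withColumn("user_text",
                F.regexp_replace("user_text",                   # crude PII scrub
                                 r"\b\d{10}\b", "<PHONE>"))
    .select(F.col("user_text").alias("prompt"),
            F.col("agent_text").alias("completion")))

prepared.write.mode("overwrite").json("s3://example-bucket/finetune/train/")
```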

Posted 2 days ago

Apply

8.0 years

0 Lacs

Andhra Pradesh, India

On-site

Key Responsibilities
We are currently seeking a skilled and experienced Java/J2EE developer with a minimum of 8 years of hands-on experience.
- Capability to create design solutions independently for a given module.
- Develop and maintain web applications using Java and Spring Boot, and user interfaces using HTML, CSS and JavaScript.
- Write and maintain unit tests using JUnit and Mockito.
- Deploy and manage applications on servers such as JBoss, WebLogic, Apache and Nginx.
- Ensure application security.
- Familiarity with build tools such as Maven and Gradle.
- Experience with caching technologies like Redis and Coherence.
- Understanding of Spring Security.
- Knowledge of Groovy is a plus.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork abilities.

Qualifications
- Bachelor's degree in Computer Science, Information Technology or a related field.
- 6-8 years of experience in full stack development.
- Proven track record of delivering high-quality software solutions, working with cross-functional teams to define, design and ship new features.
- Troubleshoot and resolve issues in a timely manner.
- Stay updated with the latest industry trends and technologies.
- Should have knowledge of SQL.

Required Skills
Proficiency in HTML, CSS and JavaScript; strong experience with Java and Spring frameworks (Spring Boot) and SQL, with familiarity with CI/CD.

Posted 2 days ago

Apply

3.0 years

0 Lacs

Itanagar, Arunachal Pradesh, India

Remote

Software QA Engineer II
Location: Remote (India)
Job Type: Regular Full-time
Division: Precision AQ
Business Unit: Product Solutions
Requisition Number: 5895

Position Summary: The Quality Assurance (QA) Engineer II is responsible for the design and implementation of manual testing solutions for new product development and business unit operations. This position interacts with architects, product owners, and other engineers to ensure software meets quality standards throughout the delivery process.

Essential functions of the job include but are not limited to:
- Create and implement testing frameworks for data operations related to our SaaS solutions.
- Collaborate with engineers, architects, and product owners to understand functional requirements and ensure comprehensive test coverage.
- Perform regression testing and validate new features and enhancements (a sketch of an automated UI check follows below).
- Use a bug tracking system to document defects and report them to developers.
- Conduct performance testing and monitor system behavior under load.
- Develop and maintain documentation for test plans, test cases, and test results.
- Participate in code reviews and contribute to continuous improvement of testing processes.
- Stay up to date with industry trends, particularly in AI and GenAI solutions, to enhance testing practices.

Qualifications
Education: Bachelor's degree in Computer Science, Engineering, or a related field.
Other required:
- 3+ years of experience in software quality assurance.
- 3+ years of experience in testing SaaS applications and data operations.
- Strong experience with Azure Cloud Services and Azure DevOps.
Skills:
- Attention to detail, accuracy, and quality.
- Ability to adapt to changing priorities.
- Expertise working in Windows and Unix ecosystems.
- Excellent communication and teamwork abilities.
- Strong analytical and problem-solving skills.
Preferred:
- Experience with testing AI solutions.
- Azure AI Fundamentals certification.
- Azure Data Fundamentals certification.
- Expertise with Apache JMeter, Test Harness, Selenium.
- Expertise in Python, SQL Server, and Vue.js.

It has come to our attention that some individuals or organizations are reaching out to job seekers, posing as potential employers and presenting enticing employment offers. We want to emphasize that these offers are not associated with our company and may be fraudulent in nature. Please note that our organization will not extend a job offer without prior communication with our recruiting team and hiring managers and a formal interview process.

Apply Now
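As referenced above, here is a minimal sketch of an automated UI regression check with Selenium, one of the tools the posting prefers. It assumes Selenium 4.x and a local Chrome install; the URL, element ids and expected title are hypothetical.

```python
# Minimal Selenium 4 login smoke test. URL, element ids and the
# expected title are hypothetical placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()                     # Selenium Manager fetches the driver
try:
    driver.get("https://app.example.com/login")
    driver.find_element(By.ID, "username").send_keys("qa_user")
    driver.find_element(By.ID, "password").send_keys("not-a-real-password")
    driver.find_element(By.ID, "submit").click()
    assert "Dashboard" in driver.title, "login did not reach the dashboard"
finally:
    driver.quit()                               # always release the browser
```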

Posted 2 days ago

Apply

0.0 - 1.0 years

0 - 0 Lacs

Jagatpura, Jaipur, Rajasthan

On-site

AWS Data Engineer
Location: Jaipur
Mode: On-site
Experience: 2+ years

The Role: Zynsera is looking for a talented AWS Data Engineer to join our dynamic team! If you have a strong grasp of AWS services, serverless data pipelines, and infrastructure as code, let's connect.

As an AWS Data Engineer at Zynsera, you will:
- Develop and optimize data pipelines using AWS Glue, Lambda, and Athena (a sketch of an Athena query from Python follows below).
- Build infrastructure using the AWS CDK for automation and scalability.
- Manage structured and semi-structured data with AWS Lakehouse & Iceberg.
- Design serverless architectures for batch and streaming workloads.
- Collaborate with senior engineers to drive performance and innovation.

You're a great fit if you have:
- Proficiency in AWS Glue, Lambda, Athena, and Lakehouse architecture.
- Experience with CDK, Python, PySpark, Spark SQL, or Java/Scala.
- Familiarity with data lakes, data warehousing, and scalable cloud solutions.
- (Bonus) Knowledge of Firehose, Kinesis, Apache Iceberg, or DynamoDB.

Job Types: Full-time, Permanent
Pay: ₹25,316.90 - ₹45,796.55 per month
Ability to commute/relocate: Jagatpura, Jaipur, Rajasthan: Reliably commute or planning to relocate before starting work (Required)
Experience: AWS Data Engineer: 1 year (Required)
Work Location: In person
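As referenced above, a minimal, hedged sketch of querying a data-lake table with Athena from Python via boto3; the database, table and results bucket are hypothetical placeholders.

```python
# Minimal Athena query from Python via boto3: submit, poll, print rows.
# Database, table and results bucket are hypothetical.
import time
import boto3

athena = boto3.client("athena")

qid = athena.start_query_execution(
    QueryString="SELECT order_date, count(*) FROM orders GROUP BY order_date",
    QueryExecutionContext={"Database": "analytics_lake"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)["QueryExecutionId"]

while True:                                     # poll until the query finishes
    state = athena.get_query_execution(
        QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
for row in rows[1:]:                            # rows[0] is the header row
    print([col.get("VarCharValue") for col in row["Data"]])
```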

Posted 2 days ago

Apply

10.0 - 12.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

TCS presents an excellent opportunity for a Data Architect.

Job Description:
Skills: AWS, Glue, Redshift, PySpark
Location: Pune / Kolkata
Experience: 10 to 12 years

- Strong hands-on experience in Python programming and PySpark.
- Experience using AWS services (Redshift, Glue, EMR, S3 & Lambda); a sketch of a Redshift load from Python follows below.
- Experience working with Apache Spark and the Hadoop ecosystem.
- Experience in writing and optimizing SQL for data manipulation.
- Good exposure to scheduling tools; Airflow is preferable.
- Must have: data warehouse experience with AWS Redshift or Hive.
- Experience in implementing security measures for data protection.
- Expertise in building and testing complex data pipelines for ETL processes (batch and near real time).
- Readable documentation of all the components being developed.
- Knowledge of database technologies for OLTP and OLAP workloads.
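As referenced above, a minimal sketch of driving a Redshift load from Python via the Redshift Data API in boto3; the cluster, database, user, S3 bucket and IAM role are hypothetical placeholders.

```python
# Minimal Redshift Data API sketch: kick off a COPY from S3.
# Cluster, database, user, bucket and IAM role are hypothetical.
import boto3

rsd = boto3.client("redshift-data")

resp = rsd.execute_statement(
    ClusterIdentifier="example-cluster",
    Database="analytics",
    DbUser="etl_user",
    Sql=("COPY sales FROM 's3://example-bucket/sales/' "
         "IAM_ROLE 'arn:aws:iam::123456789012:role/etl' FORMAT AS PARQUET;"),
)
print("statement id:", resp["Id"])   # poll rsd.describe_statement(Id=...) for status
```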

Posted 2 days ago

Apply

2.0 years

0 Lacs

Tamil Nadu, India

On-site

About BNP Paribas India Solutions: Established in 2005, BNP Paribas India Solutions is a wholly owned subsidiary of BNP Paribas SA, the European Union's leading bank with an international reach. With delivery centers located in Bengaluru, Chennai and Mumbai, we are a 24x7 global delivery center. India Solutions services three business lines: Corporate and Institutional Banking, Investment Solutions and Retail Banking for BNP Paribas across the Group. Driving innovation and growth, we are harnessing the potential of over 10,000 employees to provide support and develop best-in-class solutions.

About BNP Paribas Group: BNP Paribas is the European Union's leading bank and a key player in international banking. It operates in 65 countries and has nearly 185,000 employees, including more than 145,000 in Europe. The Group has key positions in its three main fields of activity: Commercial, Personal Banking & Services for the Group's commercial & personal banking and several specialised businesses including BNP Paribas Personal Finance and Arval; Investment & Protection Services for savings, investment, and protection solutions; and Corporate & Institutional Banking, focused on corporate and institutional clients. Based on its strong diversified and integrated model, the Group helps all its clients (individuals, community associations, entrepreneurs, SMEs, corporates and institutional clients) to realize their projects through solutions spanning financing, investment, savings and protection insurance. In Europe, BNP Paribas has four domestic markets: Belgium, France, Italy, and Luxembourg. The Group is rolling out its integrated commercial & personal banking model across several Mediterranean countries, Turkey, and Eastern Europe. As a key player in international banking, the Group has leading platforms and business lines in Europe, a strong presence in the Americas, and a solid and fast-growing business in Asia-Pacific. BNP Paribas has implemented a Corporate Social Responsibility approach in all its activities, enabling it to contribute to the construction of a sustainable future while ensuring the Group's performance and stability.

Commitment to Diversity and Inclusion: At BNP Paribas, we passionately embrace diversity and are committed to fostering an inclusive workplace where all employees are valued, respected and can bring their authentic selves to work. We prohibit discrimination and harassment of any kind, and our policies promote equal employment opportunity for all employees and applicants, irrespective of, but not limited to, their gender, gender identity, sex, sexual orientation, ethnicity, race, colour, national origin, age, religion, social status, mental or physical disabilities, veteran status etc. As a global bank, we truly believe that the inclusion and diversity of our teams is key to our success in serving our clients and the communities we operate in.

About Business Line/Function: BNP Paribas IT teams provide infrastructure, development and production support services to all applications used worldwide by all business lines. There is a great variety of technologies and infrastructures, from legacy systems to cutting-edge cloud technologies. Within BNP Paribas Group IT, the filière "FORTIS" operationally oversees the challenges of IT applications with an end-to-end vision, consistently across the Bank.
Several domains of this filière contribute to this, including the domain "Service Offering DevOps", which provides the DevSecOps platform for IT Group, Control Center, DB activities and the move-to-cloud project. BNP Paribas Fortis is a bank that is responsible and socially committed. The environment, diversity, cultural support, sponsorship: through various concrete initiatives, we are dedicated to meeting our customers' expectations and proud to demonstrate our values: responsible, human, innovative and enthusiastic.

Job Title: Axway Infra Engineer
Department: ITGP
Location: Chennai
Business Line / Function: BNPP Fortis
Reports To (Direct): ISPL ITG OPS
Directorship / Registration: NA

Position Purpose: As an API Gateway Engineer (B2B), you will join the Application Security squad within the Agile Production Services tribe, working together with the existing Web Design and Web Application Firewall squads on the technical design, installation, set-up, industrialization, management, support and documentation of the BNP Paribas Fortis web security infrastructure. You will develop and maintain APIs for multiple customer centres within an efficient Agile SDLC for the API Management platform. You will work cross-functionally with architects, engineers, and business analysts across multiple teams.

Responsibilities

Direct responsibilities:
- At least 2 years of experience in support of Axway API Gateway systems.
- Design, deliver and support the integration layer between operating systems and business applications within distributed environments for B2B, B2C and web information exchange solutions.
- Focus on the integration of web applications in the web infrastructure, for the intranet as well as for the DMZ.
- Assist and actively participate in production support (incident, problem and change management) for the Web Security squads.
- Help clients with digital transformation through API Management and API Security.
- Architect solutions with clients, based on best practices.
- Engineer API security policies and configure API Management.
- Deploy, configure, tune and monitor API Gateways.
- Complex policy development using Policy Studio; use the team development feature to support CI/CD workflows; modify environment properties using scripts and Configuration Studio; create Postman collections to support testing.
- Produce customer-facing technical documentation.
- Assist technical support in troubleshooting customer issues.
- Proven experience in working collaboratively, coordinating among cross-functional teams, and the ability to effectively work with organizational differences and priorities; consistently provide a clear and consistent technical vision to advance project goals.

Contributing responsibilities:
- Recent successful proven experience in a similar/comparable scope.
- Knowledge of web and application servers including Apache and WebSphere; experience with e-commerce, intranet, and extranet development.
- DevOps.
- Good understanding of KPS architecture and its relation to Cassandra.
- Set up API Gateway and API Manager.
- Basic knowledge of JavaScript, Groovy or core Java.
Technical & Behavioral Competencies
- Expert knowledge of web access security concepts.
- Strong experience in securing web infrastructure in financial services.
- Good and proven knowledge of web access management, web authentication practices, PKI, certificates, OpenID Connect, OAuth, TLS, federated identity, networking principles, etc.
- Generic knowledge of anti-virus, firewalls, application firewalls, load balancers, networks, DMZs, etc.
- Experienced with the Axway API Gateway platform, version 7.5.x.
- Hands-on knowledge of Axway API Gateway Policy Studio, API Manager, API Portal and API Analytics.
- Knowledge of REST and SOAP web services.
- Experience with OAuth, OpenID Connect, mutual authentication and Kerberos authentication (the sketch below shows the client-credentials flow).
- Experience with API design and API documentation.
- Exposure to creating policies for data routing, URL rewriting, request and response conversion, IP whitelisting/blacklisting, throttling, external connections to databases (MySQL), etc.
- Securing endpoints with API keys, OAuth 2.0 (with JWT, authorization code, client credentials, implicit), SSL and basic authentication.
- API Gateway CI/CD implementation using Jenkins and GitLab.
- Good working knowledge of the Linux environment.

Specific Qualifications
- Follows the customer's processes for projects, incident and change management.
- Standalone and team worker, analytically minded, meets commitments, able to work in a dynamic and multicultural environment, flexible, customer-oriented, with risk awareness.
- Motivated self-starter, process-oriented with high attention to detail.
- Quick, proactive self-starter with good communication, analytical and synthesis skills.
- Autonomy, commitment, and perseverance.
- Flexibility (in peak periods extra effort may be required); open-minded and flexible in self-learning new technologies/tools.
- Customer-minded; able to translate technical issues into non-technical explanations; always conscious of continuity of services.
- Very good team spirit; shares knowledge and experience with other members of the team; works in collaboration with the team.
- Client-oriented, analytical, shows initiative and able to work independently.
- Flexible and ready to provide support outside of business hours (on-call); able to take on additional responsibility.
- Able to work from base location Chennai/Mumbai (whichever is your base location) during the hybrid model.

Skills Referential
Behavioural skills: ability to collaborate / teamwork; decision making; personal impact / ability to influence; organizational skills.
Transversal skills: ability to understand, explain and support change; analytical ability; ability to manage a project; ability to develop and adapt a process; ability to anticipate business / strategic evolution.
Education level: Master's degree or equivalent
Experience level: At least 5 years
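As flagged in the competencies list, here is a minimal sketch of the OAuth 2.0 client-credentials flow used to secure gateway endpoints, written with the Python requests library. The token URL, client credentials and API endpoint are hypothetical and not Axway-specific; a real gateway would issue its own endpoints and credentials.

```python
# Minimal OAuth 2.0 client-credentials sketch: fetch a token, then call
# a protected API. All URLs and credentials are hypothetical.
import requests

token_resp = requests.post(
    "https://gateway.example.com/api/oauth/token",
    data={"grant_type": "client_credentials"},
    auth=("example-client-id", "example-client-secret"),   # HTTP Basic auth
    timeout=10,
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

api_resp = requests.get(
    "https://gateway.example.com/api/accounts/v1/balances",
    headers={"Authorization": f"Bearer {access_token}"},   # bearer token per RFC 6750
    timeout=10,
)
print(api_resp.status_code, api_resp.json())
```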

Posted 2 days ago

Apply

0 years

0 Lacs

Visakhapatnam, Andhra Pradesh, India

Remote

Role Description: This is a remote freelance/contract-based DevOps role for setting up and managing Magento 2 environments. The ideal candidate will be responsible for deploying Magento 2 on a new domain hosted on a shared VPS environment, without disrupting the existing Magento 2 installation connected to another domain.

Key Responsibilities:
- Install and configure Magento 2 on a Hostinger VPS with Ubuntu + CloudPanel.
- Ensure zero downtime or interference with the existing Magento instance.
- Set up domain-specific configurations (vHosts, SSL, PHP versions).
- Optimize server performance for Magento (Redis, Varnish, Elasticsearch, etc.).
- Secure the environment with best practices (firewall, file permissions, backups).
- Implement CI/CD pipelines if required.
- Provide post-deployment testing and support.

Qualifications:
- Proven experience with Magento 2 DevOps, Linux server management, and deployment pipelines.
- Strong understanding of Hostinger VPS, Ubuntu, and CloudPanel.
- Hands-on expertise with Apache/Nginx, MySQL/MariaDB, PHP-FPM, and the Magento CLI.
- Ability to isolate environments for multiple Magento installs on a shared VPS.
- Familiarity with Magento 2 architecture and best deployment practices.
- Experience with SSH, SSL setup, and DNS configuration.
- Strong troubleshooting, debugging, and log analysis skills.
- Excellent communication and documentation abilities.

Preferred:
- Magento 2 Developer or DevOps certification.
- Experience with Git, Docker, or Ansible for automation.
- Prior projects using Hostinger and CloudPanel.

Posted 2 days ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Position Summary / What you'll do:

About the Team: The Data and Customer Analytics team is a strategic unit dedicated to transforming data into actionable insights that drive customer-centric decision-making across the organization. Our mission is to harness the power of data to understand customer behavior, optimize business performance, and enable personalized experiences. Our team is responsible for building and maintaining a centralized, scalable, and secure data platform that consolidates customer-related data from diverse sources across the organization. This team plays a foundational role in enabling data-driven decision-making, advanced analytics, and personalized customer experiences, and a critical role in building trust with customers by implementing robust privacy practices, policies, and technologies that protect personal information throughout its lifecycle.

What You'll Do:
- Design, build, test and deploy cutting-edge solutions at scale, impacting a multi-billion-dollar business.
- Work closely with the product owner and technical lead, and play a major role in the overall delivery of the assigned project/enhancements.
- Interact with Walmart engineering teams across geographies to leverage expertise and contribute to the tech community.
- Provide business insights while leveraging internal tools and systems, databases and industry data.
- Drive the success of the implementation by applying technical skills to design and build enhanced processes and technical solutions in support of strategic initiatives.

What You'll Bring:
- 6-9 years' experience in building highly scalable, high-performance, responsive web applications.
- Experience building customizable, reusable, and dynamic API components using Java, NodeJS, serverless APIs, RESTful APIs and GraphQL.
- Experience with Java Spring Boot API deployment for server-side development, with design principles.
- Understanding of RESTful APIs & GraphQL.
- Experience working with NoSQL databases like Cassandra, MongoDB etc.
- Strong work experience with Google Cloud Platform services.
- Strong creative, collaboration, and communication skills.
- Ability to multitask between several different requirements and features concurrently.
- Familiarity with CI/CD, unit testing, automated frontend testing.
- Build high-quality code by conducting unit testing and enhancing design to prevent re-occurrence of defects.
- Ability to perform in a team environment.
- Strong expertise in Java, Spring Boot, Spring MVC, and Spring Cloud.
- Hands-on experience with Apache Kafka (topics, partitions, consumer groups, Kafka Streams).
- Solid understanding of microservices architecture and event-driven systems.
- Experience with RESTful APIs, OAuth, JWT, and API gateways.
- Proficiency in SQL (PostgreSQL, MySQL, BigQuery, BigLake GCP services) and NoSQL (MongoDB, Cassandra, DynamoDB).
- Knowledge of Docker, Kubernetes, and cloud platforms (Azure, AWS, or GCP).
- Strong debugging and performance-optimization skills.

About Walmart Global Tech: Imagine working in an environment where one line of code can make life easier for hundreds of millions of people. That's what we do at Walmart Global Tech. We're a team of software engineers, data scientists, cybersecurity experts and service professionals within the world's leading retailer who make an epic impact and are at the forefront of the next retail disruption. People are why we innovate, and people power our innovations. We are people-led and tech-empowered.
We train our team in the skillsets of the future and bring in experts like you to help us grow. We have roles for those chasing their first opportunity as well as those looking for the opportunity that will define their career. Here, you can kickstart a great career in tech, gain new skills and experience for virtually every industry, or leverage your expertise to innovate at scale, impact millions and reimagine the future of retail.

Flexible, hybrid work: We use a hybrid way of working, with a primary in-office presence coupled with an optimal mix of virtual presence. We use our campuses to collaborate and be together in person, as business needs require and for development and networking opportunities. This approach helps us make quicker decisions, remove location barriers across our global team, and be more flexible in our personal lives.

Benefits: Beyond our great compensation package, you can receive incentive awards for your performance. Other great perks include a host of best-in-class benefits: maternity and parental leave, PTO, health benefits, and much more.

Belonging: We aim to create a culture where every associate feels valued for who they are, rooted in respect for the individual. Our goal is to foster a sense of belonging, to create opportunities for all our associates, customers and suppliers, and to be a Walmart for everyone. At Walmart, our vision is "everyone included." By fostering a workplace culture where everyone is, and feels, included, everyone wins. Our associates and customers reflect the makeup of all 19 countries where we operate. By making Walmart a welcoming place where all people feel like they belong, we're able to engage associates, strengthen our business, improve our ability to serve customers, and support the communities where we operate.

Equal Opportunity Employer: Walmart, Inc., is an Equal Opportunities Employer, by choice. We believe we are best equipped to help our associates, customers and the communities we serve live better when we really know them. That means understanding, respecting and valuing unique styles, experiences, identities, ideas and opinions, while being inclusive of all people.

Minimum Qualifications: Option 1: Bachelor's degree in computer science, information technology, engineering, information systems, cybersecurity, or a related area, and 3 years' experience in software engineering or a related area at a technology, retail, or data-driven company. Option 2: 5 years' experience in software engineering or a related area at a technology, retail, or data-driven company.

Preferred Qualifications: Certification in Security+, GISF, CISSP, CCSP, or GSEC; Master's degree in computer science, information technology, engineering, information systems, cybersecurity, or a related area, and 1 year's experience leading information security or cybersecurity projects; Information Technology - CISCO certification.

Primary Location: Block-1, Prestige Tech Pacific Park, Sy No. 38/1, Outer Ring Road, Kadubeesanahalli, India
R-2221423

Posted 2 days ago

Apply

4.5 years

0 Lacs

Kochi, Kerala, India

On-site

Job Title: Full Stack Engineer (Java + Angular)
Experience: 4.5+ Years
Location: Kochi
Salary: Up to 15 LPA
Employment Type: Full-time

Key Responsibilities
Design, develop, test, and maintain single-page web applications (SPAs) using Angular, TypeScript, Java, and modern web technologies.
Collaborate with cross-functional teams (developers, QA, product owners, designers) to build cutting-edge solutions.
Frontend: Angular, TypeScript, Sass, HTML, Node/npm
Backend: Java, Spring MVC, REST APIs
Databases: PostgreSQL, MongoDB
Cloud/DevOps: AWS (Lambda, Aurora), Apache Tomcat
Ensure high-quality code through unit testing (JUnit, Jasmine) and best practices.
Debug, optimize, and troubleshoot applications to maintain "category killer" status.
Configure and manage development environments (IDEs, build tools, CI/CD pipelines).

Skills & Qualifications
✅ Must-Have:
Bachelor's degree in Computer Science/Software Engineering (or a related field) with a minimum 3.0 GPA.
4.5+ years of hands-on experience in Java and Angular development.
Strong expertise in JavaScript/TypeScript, HTML, CSS/Sass, REST APIs, and SQL/NoSQL databases.
Experience with multi-layered software architectures and Agile methodologies.
Knowledge of build tools (Maven, npm) and version control (Bitbucket).
Problem-solving mindset with a dedication to clean, efficient code.
✅ Nice-to-Have (Advantageous Skills):
Experience with AWS (Lambda, DynamoDB, Aurora).
Familiarity with Jira, Concourse CI/CD, or testing frameworks (Jasmine, JUnit).
Knowledge of NoSQL databases (MongoDB, Cassandra).
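Building and consuming REST APIs sits at the center of this role. The posting's stack is Java and TypeScript; as a language-neutral illustration of the consuming side, a short Python sketch with a hypothetical endpoint:

import requests  # pip install requests

# hypothetical endpoint and response shape; real services vary
resp = requests.get("https://api.example.com/v1/customers", params={"page": 1}, timeout=10)
resp.raise_for_status()  # surface 4xx/5xx errors instead of failing silently
for customer in resp.json():
    print(customer["id"], customer["name"])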

Posted 2 days ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Position: Sr Data Operations
Years of Experience: 6-8 years
Job Location: S.B. Road, Pune (remote for other locations)

The Position
We are seeking a seasoned engineer with a passion for changing the way millions of people save energy. You'll work within the Deliver and Operate team to build and improve our platforms, deliver flexible and creative solutions to our utility partners and end users, and help us achieve our ambitious goals for our business and the planet. We are seeking a highly skilled and detail-oriented Software Engineer II for the Data Operations team to maintain our data infrastructure, pipelines, and workflows. You will play a key role in ensuring the smooth ingestion, transformation, validation, and delivery of data across systems. This role is ideal for someone with a strong understanding of data engineering and operational best practices who thrives in high-availability environments.

Responsibilities & Skills
You should:
Monitor and maintain data pipelines and ETL processes to ensure reliability and performance.
Automate routine data operations tasks and optimize workflows for scalability and efficiency.
Troubleshoot and resolve data-related issues, ensuring data quality and integrity.
Collaborate with data engineering, analytics, and DevOps teams to support data infrastructure.
Implement monitoring, alerting, and logging systems for data pipelines.
Maintain and improve data governance, access controls, and compliance with data policies.
Support deployment and configuration of data tools, services, and platforms.
Participate in on-call rotation and incident response related to data system outages or failures.

Required Skills:
5+ years of experience in data operations, data engineering, or a related role.
Strong SQL skills and experience with relational databases (e.g., PostgreSQL, MySQL).
Proficiency with data pipeline tools (e.g., Apache Airflow).
Experience with cloud platforms (AWS, GCP) and cloud-based data services (e.g., Redshift, BigQuery).
Familiarity with scripting languages such as Python, Bash, or Shell.
Knowledge of version control (e.g., Git) and CI/CD workflows.

Qualifications
Bachelor's degree in Computer Science, Engineering, Data Science, or a related field.
Experience with data observability tools (e.g., Splunk, DataDog).
Background in DevOps or SRE with a focus on data systems.
Exposure to infrastructure-as-code (e.g., Terraform, CloudFormation).
Knowledge of streaming data platforms (e.g., Kafka, Spark Streaming).
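Apache Airflow is the named pipeline tool here. As a minimal sketch of the kind of scheduled job this role would monitor, a one-task DAG using the Airflow 2.4+ API; the DAG id and check are hypothetical:

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def validate_rows():
    # placeholder for a data-quality check, e.g. row counts or null rates
    print("validation passed")

with DAG(
    dag_id="daily_etl_health_check",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="validate", python_callable=validate_rows)

In practice, the "monitoring, alerting, and logging" responsibility above maps to watching exactly these task runs for failures and SLA misses.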

Posted 2 days ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Specialty Development Senior 34263
Location: Chennai
Employment Type: Full-Time (Hybrid)

Job Overview
We are looking for an experienced GCP Data Engineer to join a global data engineering team responsible for building a sophisticated data warehouse and analytics platform on Google Cloud Platform (GCP). This role is ideal for professionals with a strong background in data engineering, cloud migration, and large-scale data transformation, particularly within cloud-native environments.

Key Responsibilities
Design, build, and optimize data pipelines on GCP to support large-scale data transformations and analytics.
Lead the migration and modernization of legacy systems to cloud-based architecture.
Collaborate with cross-functional global teams to support data-driven applications and enterprise analytics solutions.
Work with large datasets to enable platform capabilities and business insights using GCP tools.
Ensure data quality, integrity, and performance across the end-to-end data lifecycle.
Apply agile development principles to rapidly deliver and iterate on data solutions.
Promote engineering best practices in CI/CD, DevSecOps, and cloud deployment strategies.

Must-Have Skills
GCP services: BigQuery, Dataflow, Dataproc, Data Fusion, Cloud Composer, Cloud Functions, Cloud SQL, Cloud Spanner, Cloud Storage, Bigtable, Pub/Sub, App Engine, Compute Engine, Airflow.
Programming & data engineering: 5+ years in data engineering and SQL development; experience building data warehouses and ETL processes.
Cloud experience: minimum 3 years in cloud environments (preferably GCP), implementing production-scale data solutions.
Strong understanding of data processing architectures (batch/real-time) and tools such as Terraform, Cloud Build, and Airflow.
Experience with containerized microservices architecture.
Excellent problem-solving skills and the ability to optimize complex data pipelines.
Strong interpersonal and communication skills, with the ability to work effectively in a globally distributed team.
Proven ability to work independently in high-ambiguity scenarios and drive solutions proactively.

Preferred Skills
GCP certification (e.g., Professional Data Engineer).
Experience in regulated or financial domains.
Migration experience from Teradata to GCP.
Programming experience with Python, Java, and Apache Beam.
Familiarity with data governance, security, and compliance in cloud environments.
Experience coaching and mentoring junior data engineers.
Knowledge of software architecture, CI/CD, source control (Git), and secure coding standards.
Exposure to Java full-stack development (Spring Boot, microservices, React).
Agile development experience, including pair programming, TDD, and DevSecOps.
Proficiency in test automation tools such as Selenium, Cucumber, and REST Assured.
Familiarity with other cloud platforms such as AWS or Azure is a plus.

Education
Bachelor's degree in Computer Science, Information Technology, or a related field (mandatory).
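BigQuery anchors the must-have list above. A minimal sketch of querying it with the official Python client library; the project, dataset, and table names are hypothetical:

from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project id
query = """
    SELECT status, COUNT(*) AS orders
    FROM `my-gcp-project.sales.orders`   -- hypothetical dataset and table
    GROUP BY status
"""
# query() submits the job; result() blocks until rows are ready
for row in client.query(query).result():
    print(row.status, row.orders)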

Posted 2 days ago

Apply

2.0 years

0 Lacs

India

On-site

The Role
We are hiring an AI/ML Developer (India) to join our India team in support of a large global client! You will be responsible for developing, deploying, and maintaining AI and machine learning models. Your expertise in Python, cloud services, databases, and big data technologies will be instrumental in creating scalable and efficient AI applications.

What You Will Be Doing
•Develop, train, and deploy machine learning models for predictive analytics, classification, and clustering.
•Implement AI-based solutions using frameworks such as TensorFlow, PyTorch, and Scikit-learn.
•Work with cloud platforms including AWS (SageMaker, Lambda, S3), Azure, and Google Cloud (Vertex AI).
•Integrate and fine-tune Hugging Face transformer models (e.g., BERT, GPT) for NLP tasks such as text classification, summarization, and sentiment analysis.
•Develop AI automation solutions, including chatbot implementations using Microsoft Teams and Azure AI.
•Work with big data technologies such as Apache Spark and Snowflake for large-scale data processing and analytics.
•Design and optimize ETL pipelines for data quality management, transformation, and validation.
•Utilize SQL, MySQL, PostgreSQL, and MongoDB for database management and query optimization.
•Create interactive data visualizations using Tableau and Power BI to drive business insights.
•Work with Large Language Models (LLMs) for AI-driven applications, including fine-tuning, training, and deploying models for conversational AI, text generation, and summarization.
•Develop and implement agentic AI systems, enabling autonomous decision-making AI agents that can adapt, learn, and optimize tasks in real time.

What You Bring Along
•2+ years of experience applying AI to practical uses.
•Strong programming skills in Python and SQL, and experience with ML frameworks such as TensorFlow, PyTorch, or Scikit-learn.
•Knowledge of basic algorithms and of object-oriented and functional design principles.
•Proficiency with data analytics libraries such as Pandas, NumPy, Matplotlib, and Seaborn.
•Hands-on experience with cloud platforms such as AWS, Azure, and Google Cloud.
•Experience with big data processing using Apache Spark and Snowflake.
•Knowledge of NLP and AI model implementations using Hugging Face and cloud-based AI services.
•Strong understanding of database management, query optimization, and data warehousing.
•Experience with data visualization tools such as Tableau and Power BI.
•Ability to work in a collaborative environment and adapt to new AI technologies.
•Strong analytical and problem-solving skills.

Education:
•Bachelor's degree in Computer Science, Data Science, AI/ML, or a related field.
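For the Hugging Face requirement above, the smallest useful sketch is the transformers pipeline API, which downloads a default pretrained sentiment model on first use; the input sentence and the printed output are illustrative:

from transformers import pipeline  # pip install transformers

# builds a sentiment classifier around a default pretrained model
classifier = pipeline("sentiment-analysis")
print(classifier("The rollout went smoothly and users are happy."))
# e.g. [{'label': 'POSITIVE', 'score': 0.999...}]

The same pipeline() entry point accepts task names like "summarization" and "text-classification", which covers the NLP tasks this posting lists.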

Posted 2 days ago

Apply

10.0 - 14.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

What Gramener offers you
Gramener will offer you an inviting workplace, talented colleagues from diverse backgrounds, a clear career path, and steady growth prospects with great scope to innovate. Our goal is to create an ecosystem of easily configurable data applications focused on storytelling for public and private use.

Cloud Lead – Analytics & Data Products
We're looking for a Cloud Architect/Lead to design, build, and manage scalable AWS infrastructure that powers our analytics and data product initiatives. This role focuses on automating infrastructure provisioning, application/API hosting, and enabling data and GenAI workloads through a modern, secure cloud environment.

Roles and Responsibilities
Design and provision AWS infrastructure using Terraform or AWS CloudFormation to support evolving data product needs.
Develop and manage CI/CD pipelines using Jenkins, AWS CodePipeline, CodeBuild, or GitHub Actions.
Deploy and host internal tools, APIs, and applications using ECS, EKS, Lambda, API Gateway, and ELB.
Provision and support analytics and data platforms using S3, Glue, Redshift, Athena, Lake Formation, and orchestration tools like Step Functions or Apache Airflow (MWAA).
Implement cloud security, networking, and compliance using IAM, VPC, KMS, CloudWatch, CloudTrail, and AWS Config.
Collaborate with data engineers, ML engineers, and analytics teams to align infrastructure with application and data product requirements.
Support GenAI infrastructure, including Amazon Bedrock, SageMaker, or integrations with APIs like OpenAI.

Skills and Qualifications
10-14 years of experience in cloud engineering, DevOps, or cloud architecture roles.
Hands-on expertise with the AWS ecosystem and tools listed above.
Proficiency in scripting (e.g., Python, Bash) and infrastructure automation.
Experience deploying containerized workloads using Docker, ECS, EKS, or Fargate.
Familiarity with data engineering and GenAI workflows is a plus.
AWS certifications (e.g., Solutions Architect, DevOps Engineer) are preferred.
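Provisioning in this role runs through Terraform or CloudFormation, which are declarative templates rather than Python. As a Python-flavored stand-in for the same first step, a boto3 sketch that creates and tags an analytics landing bucket; the bucket name, region, and tags are hypothetical:

import boto3  # pip install boto3

s3 = boto3.client("s3", region_name="ap-south-1")
# S3 bucket names are globally unique; this one is hypothetical
s3.create_bucket(
    Bucket="demo-analytics-landing",
    CreateBucketConfiguration={"LocationConstraint": "ap-south-1"},
)
# tags like these are what cost reports and AWS Config rules key off
s3.put_bucket_tagging(
    Bucket="demo-analytics-landing",
    Tagging={"TagSet": [{"Key": "team", "Value": "data-products"}]},
)

The Terraform equivalent would declare the same bucket and tags as an aws_s3_bucket resource, with the advantage that reruns converge to the declared state instead of re-executing calls.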

Posted 2 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

We are looking for talented and experienced Java Full Stack Developers to join our dynamic team at a reputed client location in Pune. If you're passionate about backend and frontend technologies, system design, and DevOps practices, this is your opportunity to make an impact!

Qualification:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

Required Skills:
Strong hands-on experience with Java, the Spring Framework, and Spring Boot
Expertise in building and consuming RESTful APIs
Solid frontend development skills using React.js
Proficiency in PostgreSQL and Apache Kafka
Experience with CI/CD tools: Chef, Jenkins, Maven, SonarQube, Checkmarx
Deep understanding of high- and low-level system design
Experience in domain-driven design and event-driven architecture
Strong debugging, performance tuning, and troubleshooting capabilities
Excellent communication skills to collaborate with technical and business stakeholders
Proven ability to lead, mentor, and coach development teams

Job Role & Responsibilities:
Develop, enhance, and maintain full-stack applications using Java and React
Design and implement robust, scalable, and secure backend services
Integrate systems using Apache Kafka and manage event-driven workflows
Ensure code quality with continuous integration, code reviews, and static analysis tools
Participate in system design discussions and contribute to architecture decisions
Lead and mentor junior developers, and promote coding standards and best practices
Collaborate with cross-functional teams, including product managers and architects
Troubleshoot issues across the stack and optimize application performance

Note: This is a Work from Office (5 days a week) opportunity at the client location in Yerwada, Pune.
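This posting pairs Kafka with event-driven workflows. The stack here is Java/Spring, but the publishing half of that pattern is easy to sketch in Python with kafka-python; the topic name and event payload are hypothetical:

import json
from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
# a hypothetical domain event; downstream services subscribe to the topic
producer.send("payments.completed", {"payment_id": "P-1001", "amount": 2499})
producer.flush()  # block until the broker has acknowledged delivery

In an event-driven architecture, the producer neither knows nor cares which services consume this event, which is what decouples the systems the posting asks you to integrate.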

Posted 2 days ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Key Responsibilities:
Design, develop, and maintain high-performance ETL and real-time data pipelines using Apache Kafka and Apache Flink.
Build scalable and automated MLOps pipelines for model training, validation, and deployment using AWS SageMaker and related services.
Implement and manage Infrastructure as Code (IaC) using Terraform for AWS provisioning and maintenance.
Collaborate with ML, Data Science, and DevOps teams to ensure reliable and efficient model deployment workflows.
Optimize data storage and retrieval strategies for both structured and unstructured large-scale datasets.
Integrate and transform data from multiple sources into data lakes and data warehouses.
Monitor, troubleshoot, and improve the performance of cloud-native data systems in a fast-paced production setup.
Ensure compliance with data governance, privacy, and security standards across all data operations.
Document data engineering workflows and architectural decisions for transparency and maintainability.

Requirements
5+ years of experience as a Data Engineer or in a similar role.
Proven experience building data pipelines and streaming applications using Apache Kafka and Apache Flink.
Strong ETL development skills, with a deep understanding of data modeling and data architecture in large-scale environments.
Hands-on experience with AWS services, including SageMaker, S3, Glue, Lambda, and CloudFormation or Terraform.
Proficiency in Python and SQL; knowledge of Java is a plus, especially for streaming use cases.
Strong grasp of MLOps best practices, including model versioning, monitoring, and CI/CD for ML pipelines.
Deep knowledge of IaC tools, particularly Terraform, for automating cloud infrastructure.
Excellent analytical and problem-solving abilities, especially with regard to data processing and deployment issues.
Agile mindset with experience working in fast-paced, iterative development environments.
Strong communication and team collaboration skills.
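Model versioning is one concrete slice of the MLOps requirement above. A minimal sketch using boto3 to upload each trained artifact under a timestamped prefix, so every model version stays retrievable for rollback; the bucket, key layout, and file name are hypothetical:

import boto3
from datetime import datetime, timezone

s3 = boto3.client("s3")
version = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
# timestamped prefixes mean deployments can pin, compare, or roll back versions
s3.upload_file(
    "model.tar.gz",                        # local artifact from a training run
    "demo-ml-artifacts",                   # hypothetical bucket
    f"churn-model/{version}/model.tar.gz", # hypothetical key layout
)

SageMaker expects exactly this kind of model.tar.gz S3 location when creating a model, so a versioned layout like this slots directly into the deployment workflows the posting describes.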

Posted 2 days ago

Apply

12.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About the Role:
We are seeking a seasoned and adaptable Senior Software Engineer / Technical Lead with 8-12 years of experience in software development. You will be a key contributor in designing and building scalable, robust, and high-performance systems. The ideal candidate has deep expertise in Java or .NET, a strong understanding of microservices architecture, hands-on experience with streaming platforms and databases, and a test-first development mindset.

Key Responsibilities:
Design, develop, and maintain enterprise-grade applications using Java or .NET frameworks.
Architect and implement microservices and REST APIs, ensuring modularity, scalability, and performance.
Work with relational (RDBMS) and big data technologies to manage large-scale datasets.
Integrate and leverage streaming platforms such as Apache Kafka for real-time data processing.
Apply strong software design principles and follow test-first/TDD approaches to deliver clean, maintainable code.
Collaborate with UI/UX and front-end teams to ensure a seamless end-to-end product experience.
Lead or contribute to code reviews, architecture discussions, and mentorship of junior engineers.
Stay current with emerging technologies and be open to adopting new tools, languages, or frameworks as needed.

Required Qualifications:
3-6 years of hands-on software development experience.
Strong command of Java or .NET technologies and related ecosystems.
Experience with RDBMS (e.g., MySQL, PostgreSQL) and big data platforms (e.g., Hadoop, Spark).
Proficient with Apache Kafka or similar streaming technologies.
Deep understanding of software architecture patterns, particularly microservices.
Practical experience with RESTful services and API design.
Familiarity with UI technologies (e.g., JavaScript, Angular, React) and front-end/backend integration.
Demonstrated use of test-first methodologies (TDD, BDD, unit testing frameworks).
Excellent problem-solving and communication skills.
Proven ability to learn and adapt quickly to new technologies and frameworks.

Nice to Have:
Experience with cloud platforms such as AWS, Azure, or GCP.
Exposure to DevOps practices and CI/CD tools.
Background in containerization (Docker, Kubernetes).

Maersk is committed to a diverse and inclusive workplace, and we embrace different styles of thinking. Maersk is an equal opportunities employer and welcomes applicants without regard to race, colour, gender, sex, age, religion, creed, national origin, ancestry, citizenship, marital status, sexual orientation, physical or mental disability, medical condition, pregnancy or parental leave, veteran status, gender identity, genetic information, or any other characteristic protected by applicable law. We will consider qualified applicants with criminal histories in a manner consistent with all legal requirements.

We are happy to support your need for any adjustments during the application and hiring process. If you need special assistance or an accommodation to use our website, apply for a position, or perform a job, please contact us by emailing accommodationrequests@maersk.com.
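Test-first development is called out twice in this posting. A minimal sketch of what that looks like in practice, here in Python with pytest (the posting's stacks would use JUnit or NUnit instead); the pricing module under test is hypothetical and would be written after these tests:

# test_pricing.py -- written before the implementation, in TDD style
import pytest
from pricing import apply_discount  # hypothetical module under test

def test_discount_is_applied():
    assert apply_discount(price=100.0, percent=10) == 90.0

def test_negative_discount_is_rejected():
    with pytest.raises(ValueError):
        apply_discount(price=100.0, percent=-5)

The TDD loop is: run these tests and watch them fail, write the smallest apply_discount that makes them pass, then refactor with the tests as a safety net.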

Posted 2 days ago

Apply

8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Engineering at Innovaccer
With every line of code, we accelerate our customers' success, turning complex challenges into innovative solutions. Collaboratively, we transform each data point we gather into valuable insights for our customers. Join us and be part of a team that's turning dreams of better healthcare into reality, one line of code at a time. Together, we're shaping the future and making a meaningful impact on the world.

About the Role
We're on a mission to completely change the way healthcare works by building the most powerful Healthcare Intelligence Platform (Gravity) ever made. Using an AI-first approach, our goal is to turn complicated health data into real-time insights that help hospitals, clinics, pharmaceutical companies, and researchers make faster, smarter decisions.

We're building a unified platform from the ground up, specifically for healthcare. This platform will bring together everything from:
Collecting data from different systems (Data Acquisition)
Combining and cleaning it (Integration, Data Quality)
Managing patient records and provider info (Master Data Management)
Tagging and organizing it (Data Classification & Governance)
Running analytics and building AI models (Analytics, AI Studio)
Creating custom healthcare apps (App Marketplace)
Using AI as a built-in assistant (AI as BI + Agent-first approach)

This platform will let healthcare teams build solutions quickly, without starting from scratch each time. For example, they'll be able to:
Track and manage kidney disease patients across different hospitals
Speed up clinical trials by analyzing real-world patient data
Help pharmacies manage their stock better with predictive supply-chain tools
Detect early signs of diseases like diabetes or cancer with machine learning
Ensure regulatory compliance automatically through built-in checks

This is a huge, complex, and high-impact challenge, and we're looking for a Staff Engineer to help lead the way. In this role, you'll:
Design and build scalable, secure, and reliable systems
Create core features like data quality checks, metadata management, data lineage tracking, and privacy/compliance layers
Work closely with other engineers, product managers, and healthcare experts to bring the platform to life

If you're passionate about using technology to make a real difference in the world and enjoy solving big engineering problems, we'd love to connect.

A Day in the Life
Architect, design, and build scalable data tools and frameworks.
Collaborate with cross-functional teams to ensure data compliance, security, and usability.
Lead initiatives around metadata management, data lineage, and data cataloging.
Define and evangelize standards and best practices across data engineering teams.
Own the end-to-end lifecycle of tooling, from prototyping to production deployment.
Mentor and guide junior engineers and contribute to technical leadership across the organization.
Drive innovation in privacy-by-design, regulatory compliance (e.g., HIPAA), and data observability solutions.

What You Need
8+ years of experience in software engineering, with strong experience building distributed systems.
Proficiency in backend development (Python, Java, Scala, or Go) and familiarity with RESTful API design.
Expertise in modern data stacks: Kafka, Spark, Airflow, Snowflake, etc.
Experience with open-source data governance frameworks like Apache Atlas, Amundsen, or DataHub is a big plus.
Familiarity with cloud platforms (AWS, Azure, GCP) and their native data governance offerings.
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

Here's What We Offer
Generous Leaves: Enjoy generous leave benefits of up to 40 days.
Parental Leave: Leverage one of the industry's best parental leave policies to spend time with your new addition.
Sabbatical: Want to focus on skill development, pursue an academic career, or just take a break? We've got you covered.
Health Insurance: We offer comprehensive health insurance to support you and your family, covering medical expenses related to illness, disease, or injury, and extending support to the family members who matter most.
Care Program: Whether it's a celebration or a time of need, we've got you covered with care vouchers to mark major life events. Through our Care Vouchers program, employees receive thoughtful gestures for significant personal milestones and moments of need.
Financial Assistance: Life happens, and when it does, we're here to help. Our financial assistance policy offers support through salary advances and personal loans for genuine personal needs, ensuring help is there when you need it most.

Innovaccer is an equal-opportunity employer. We celebrate diversity, and we are committed to fostering an inclusive and diverse workplace where all employees, regardless of race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, marital status, or veteran status, feel valued and empowered.

Disclaimer: Innovaccer does not charge fees or require payment from individuals or agencies for securing employment with us. We do not guarantee job spots or engage in any financial transactions related to employment. If you encounter any posts or requests asking for payment or personal information, we strongly advise you to report them immediately to our HR department at px@innovaccer.com. Additionally, please exercise caution and verify the authenticity of any requests before disclosing personal and confidential information, including bank account details.

About Innovaccer
Innovaccer activates the flow of healthcare data, empowering providers, payers, and government organizations to deliver intelligent and connected experiences that advance health outcomes. The Healthcare Intelligence Cloud equips every stakeholder in the patient journey to turn fragmented data into proactive, coordinated actions that elevate the quality of care and drive operational performance. Leading healthcare organizations like CommonSpirit Health, Atlantic Health, and Banner Health trust Innovaccer to integrate a system of intelligence into their existing infrastructure, extending the human touch in healthcare. For more information, visit www.innovaccer.com. Check us out on YouTube, Glassdoor, LinkedIn, Instagram, and the Web.
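Data quality checks are named above as a core platform feature this Staff Engineer would build. A minimal sketch of one such check in Python with pandas; the DataFrame, column name, and threshold are hypothetical:

import pandas as pd

def null_rate_check(df: pd.DataFrame, column: str, max_null_rate: float) -> None:
    # fail loudly when a critical column exceeds its allowed share of nulls
    rate = df[column].isna().mean()
    if rate > max_null_rate:
        raise ValueError(f"{column}: null rate {rate:.1%} exceeds limit {max_null_rate:.1%}")

# toy patient-record data: one missing medical record number out of four
patients = pd.DataFrame({"mrn": ["A1", None, "A3", "A4"]})
null_rate_check(patients, "mrn", max_null_rate=0.10)  # raises: 25.0% exceeds 10.0%

A production version of this check would run inside the pipeline orchestrator and emit metrics to the data observability layer the posting mentions, rather than raising inline.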

Posted 2 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
