
18519 Tuning Jobs - Page 41

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

1.0 - 2.0 years

3 - 3 Lacs

Panchkula

On-site

Job Summary: As an Associate DevOps Engineer, you will be responsible for setting up and maintaining the infrastructure needed to deploy our projects. This includes managing domains and DNS, deploying MERN stack applications, Python projects, and WordPress sites, and ensuring smooth operation on both Google Cloud and AWS.

Key Responsibilities:
● Configure and manage domains, DNS, and SSL certificates.
● Set up and deploy MERN stack applications.
● Deploy and maintain Python-based projects.
● Manage and deploy WordPress sites.
● Use Google Cloud and AWS to deploy and manage resources.
● Implement CI/CD pipelines to automate deployments.
● Monitor and maintain production systems to ensure reliability and performance.
● Collaborate with development teams to streamline deployment processes.
● Troubleshoot and resolve infrastructure issues as they arise.
● Document processes, configurations, and infrastructure setups.
● Install, configure, and maintain Linux servers and workstations.
● Manage user accounts, permissions, and access controls.
● Perform system monitoring, performance tuning, and optimization.
● Troubleshoot and resolve system and network issues.
● Apply OS patches, security updates, and system upgrades.
● Implement and maintain backup and disaster recovery solutions.

Qualifications:
● Bachelor's degree in Computer Science, Information Technology, or a related field, with 1-2 years of experience.
● Basic understanding of networking concepts, including DNS, domains, and SSL.
● Familiarity with the MERN stack (MongoDB, Express.js, React.js, Node.js).
● Basic knowledge of Python and its deployment practices.
● Understanding of WordPress setup and deployment.
● Exposure to cloud platforms, particularly Google Cloud and AWS.
● Basic knowledge of CI/CD pipelines and tools such as Jenkins or GitHub Actions.
● Strong problem-solving skills and attention to detail.
● Good communication skills and the ability to work collaboratively in a team environment.

Preferred Skills:
● Experience with containerization tools like Docker.
● Familiarity with Infrastructure as Code (IaC) tools like Terraform or CloudFormation.
● Understanding of version control systems, especially Git.
● Basic scripting skills in Shell, Python, or similar languages.
● Exposure to monitoring tools like Prometheus, Grafana, or similar.

Additional Requirements:
● Proficiency in Docker and containerization.
● Shell scripting skills.
● Knowledge of Linux systems and administration.
● Proficiency in Git and version control.

Job Type: Full-time
Pay: ₹25,000.00 - ₹30,000.00 per month
Benefits: Provident Fund
Work Location: In person
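The domain, SSL, and deployment duties in this role often come down to small automation scripts. As a hedged illustration only (the domain name, port, and certificate paths are hypothetical, and the Let's Encrypt layout is an assumption), a Python helper that renders an Nginx reverse-proxy server block for a MERN backend might look like:

```python
# Sketch: render an Nginx reverse-proxy config for a Node.js (MERN) backend.
# All domain names, ports, and certificate paths here are hypothetical examples.

NGINX_TEMPLATE = """server {{
    listen 443 ssl;
    server_name {domain};

    ssl_certificate     /etc/letsencrypt/live/{domain}/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/{domain}/privkey.pem;

    location / {{
        proxy_pass http://127.0.0.1:{port};
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }}
}}
"""

def render_vhost(domain: str, port: int) -> str:
    """Return an Nginx server block for the given domain and upstream port."""
    if not domain or port <= 0:
        raise ValueError("domain and positive port required")
    return NGINX_TEMPLATE.format(domain=domain, port=port)

if __name__ == "__main__":
    print(render_vhost("app.example.com", 3000))
```

In practice a script like this would write the rendered block to the server's Nginx config directory and reload the service; here it only builds the text.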

Posted 6 days ago

Apply

1.0 - 3.0 years

2 - 4 Lacs

India

On-site

Job Description: WordPress Developer (with PHP Knowledge)
Location: Zirakpur
Shift: Night
Experience: 1 to 3 years

We are looking for a skilled and passionate WordPress Developer with a solid understanding of PHP to join our team. The ideal candidate will have experience developing and maintaining WordPress websites and be proficient in PHP programming to create custom themes and plugins and handle site optimization tasks.

Key Responsibilities:
● Develop, customize, and maintain WordPress websites and applications.
● Write clean, efficient, and well-documented PHP code.
● Create custom WordPress themes and plugins based on project requirements.
● Troubleshoot and debug issues, ensuring websites function optimally.
● Collaborate with design and content teams to integrate designs into the WordPress platform.
● Optimize websites for speed, performance, and security.
● Stay up to date with the latest trends in WordPress development and technologies.
● Implement and maintain SEO best practices for WordPress websites.

Required Skills & Qualifications:
● 1 to 3 years of experience in WordPress development.
● Proficiency in PHP, MySQL, HTML, CSS, and JavaScript.
● Strong knowledge of WordPress core, themes, plugins, and APIs.
● Experience with version control tools such as Git.
● Familiarity with website optimization techniques (speed, security, SEO).
● Ability to work independently and in a team, with strong problem-solving skills.
● Experience with responsive design and cross-browser compatibility.
● Strong communication and collaboration skills.

Preferred Skills:
● Experience with website migration, performance tuning, and hosting environments.
● Familiarity with eCommerce platforms like WooCommerce.

Why Join Us?
● Competitive salary.
● Dynamic and supportive work environment.
● Opportunities for career growth and skill enhancement.
● Work in a fast-paced, innovative team.

How to Apply: If you meet the above qualifications and are ready to take on new challenges, we'd love to hear from you. Apply now with your updated resume!

Job Type: Full-time
Pay: ₹20,000.00 - ₹40,000.00 per month
Benefits: Food provided; Paid sick time
Application Question(s): Are you an immediate joiner?
Education: Bachelor's (Preferred)
Experience: WordPress: 2 years (Required); PHP: 2 years (Required)
Location: Zirakpur, Punjab (Required)
Work Location: In person

Posted 6 days ago

Apply

3.0 - 4.0 years

3 - 6 Lacs

India

On-site

Job Title: Python Backend Developer (Data Layer)
Location: Mohali, Punjab
Company: RevClerx

About RevClerx: RevClerx Pvt. Ltd., founded in 2017 and based in the Chandigarh/Mohali area (India), is a dynamic Information Technology firm providing comprehensive IT services with a strong focus on client-centric solutions. As a global provider, we cater to diverse business needs including website designing and development, digital marketing, lead generation services (including telemarketing and qualification), and appointment setting.

Job Summary: We are seeking a skilled Python Backend Developer with a strong passion and proven expertise in database design and implementation. This role requires 3-4 years of backend development experience, focusing on building robust, scalable applications and APIs. The ideal candidate will not only be proficient in Python and common backend frameworks but will possess significant experience in designing, modeling, and optimizing various database solutions, including relational databases (like PostgreSQL) and, crucially, graph databases (specifically Neo4j). You will play a vital role in architecting the data layer of our applications, ensuring efficiency, scalability, and the ability to handle complex, interconnected data.

Key Responsibilities:
● Design, develop, test, deploy, and maintain scalable and performant Python-based backend services and APIs.
● Lead the design and implementation of database schemas for relational (e.g., PostgreSQL) and NoSQL databases, with a strong emphasis on graph databases (Neo4j).
● Model complex data relationships and structures effectively, particularly leveraging graph data modeling principles where appropriate.
● Write efficient, optimized database queries (SQL, Cypher, potentially others).
● Develop and maintain data models, ensuring data integrity, consistency, and security.
● Optimize database performance through indexing strategies, query tuning, caching mechanisms, and schema adjustments.
● Collaborate closely with product managers, frontend developers, and other stakeholders to understand data requirements and translate them into effective database designs.
● Implement data migration strategies and scripts as needed.
● Integrate various databases seamlessly with Python backend services using ORMs (like SQLAlchemy or Django ORM) or native drivers.
● Write unit and integration tests, particularly focusing on data access and manipulation logic.
● Contribute to architectural decisions, especially concerning data storage, retrieval, and processing.
● Stay current with best practices in database technologies, Python development, and backend systems.

Minimum Qualifications:
● Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field, OR equivalent practical experience.
● 3-4 years of professional software development experience with a primary focus on Python backend development.
● Strong proficiency in Python and its standard libraries.
● Proven experience with at least one major Python web framework (e.g., Django, Flask, FastAPI).
● Demonstrable, hands-on experience designing, implementing, and managing relational databases (e.g., PostgreSQL).
● Experience with at least one NoSQL database (e.g., MongoDB, Redis, Cassandra).
● Solid understanding of data structures, algorithms, and object-oriented programming principles.
● Experience designing and consuming RESTful APIs.
● Proficiency with version control systems, particularly Git.
● Strong analytical and problem-solving skills, especially concerning data modeling and querying.
● Excellent communication and teamwork abilities.

Preferred (Good-to-Have) Qualifications:
● Graph Database Expertise:
  ○ Significant, demonstrable experience designing and implementing solutions using graph databases (Neo4j strongly preferred).
  ○ Proficiency in graph query languages, particularly Cypher.
  ○ Strong understanding of graph data modeling principles, use cases (e.g., recommendation engines, fraud detection, knowledge graphs, network analysis), and trade-offs.
● Advanced Database Skills:
  ○ Experience with database performance tuning and monitoring tools.
  ○ In-depth experience with Object-Relational Mappers (ORMs) such as SQLAlchemy or Django ORM.
  ○ Experience implementing data migration strategies for large datasets.
● Cloud Experience: Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud Platform) and their managed database services (e.g., RDS, Aurora, Neptune, DocumentDB, MemoryStore).
● Containerization & Orchestration: Experience with Docker and Kubernetes.
● Asynchronous Programming: Experience with Python's asyncio and async frameworks.
● Data Pipelines: Familiarity with ETL processes or data pipeline tools (e.g., Apache Airflow).
● Testing: Experience writing tests specifically for database interactions and data integrity.

What We Offer:
● Challenging projects with opportunities to work on cutting-edge technologies, especially in the field of AI.
● Competitive salary and comprehensive benefits package.
● Opportunities for professional development and learning (e.g., conferences, courses, certifications).
● A collaborative, innovative, and supportive work environment.

How to Apply: Interested candidates are invited to submit their resume and a cover letter outlining their relevant experience, specifically highlighting their database design expertise (including relational, NoSQL, and especially graph DB/Neo4j experience).

Job Types: Full-time, Permanent
Pay: ₹30,000.00 - ₹55,373.94 per month
Benefits: Food provided; Health insurance
Schedule: Day shift, Monday to Friday
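The posting's emphasis on graph data modeling for use cases like recommendation engines can be illustrated with a toy example. This is a hedged sketch, not the company's actual stack: a co-purchase recommendation computed over an in-memory graph (the purchase data is invented), with the roughly equivalent Cypher pattern shown in the docstring.

```python
from collections import Counter

# Toy bipartite graph: user -> set of purchased products (hypothetical data).
purchases = {
    "alice": {"laptop", "mouse"},
    "bob": {"laptop", "keyboard"},
    "carol": {"mouse", "monitor"},
}

def also_bought(user: str) -> list[str]:
    """Recommend products bought by users who share a purchase with `user`.

    Roughly the Cypher pattern a Neo4j version would use:
      MATCH (u:User {name: $user})-[:BOUGHT]->(p)<-[:BOUGHT]-(other)-[:BOUGHT]->(rec)
      WHERE NOT (u)-[:BOUGHT]->(rec)
      RETURN rec ORDER BY count(*) DESC
    """
    mine = purchases[user]
    scores = Counter()
    for other, theirs in purchases.items():
        if other != user and mine & theirs:   # shares at least one product
            scores.update(theirs - mine)      # recommend what `user` lacks
    return [p for p, _ in scores.most_common()]

print(also_bought("alice"))
```

The point of the graph model is that this "friends-of-friends" traversal is a one-line pattern in Cypher, whereas the relational equivalent needs a multi-way self-join.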

Posted 6 days ago

Apply

2.0 years

3 - 4 Lacs

Mohali

On-site

We are looking for a highly motivated GenAI Engineer with strong hands-on experience working with Large Language Models (LLMs), Retrieval-Augmented Generation (RAG) workflows, and production-ready AI applications. You'll help design, build, and extend digital products and creative applications that leverage the latest in LLM technologies. You will play a lead role in product development, offering AI services to clients, client onboarding, and delivery of cutting-edge AI solutions, working with a range of modern AI tools, cloud services, and frameworks.

Experience: 2+ years
Location: Mohali, Punjab
Work Mode: On-site
Timings: 10:00 AM – 7:00 PM (Day Shift)
Interview Mode: Face-to-face (on-site)
Contact: +91-9872993778 (Mon–Fri, 11 AM – 6 PM)

Key Responsibilities:
● Design and implement generative AI solutions using large language models (LLMs), natural language processing (NLP), and computer vision.
● Develop, enhance, and scale digital products leveraging LLMs at their core.
● Lead product development and operations teams to implement GenAI-based solutions.
● Design and manage client onboarding, rollout, and adoption strategies.
● Deliver and maintain enhancements based on client-specific needs.
● Build and maintain RAG pipelines and LLM-based workflows for enterprise applications.
● Manage LLMOps processes across the entire AI lifecycle (prompt design, fine-tuning, evaluation).
● Work with cloud-based GenAI platforms (primarily Azure OpenAI, but also Google, AWS, etc.).
● Implement API integrations, orchestration, and workflow automation.
● Evaluate, fine-tune, and monitor the performance of LLM outputs using observability tools.

Required Qualifications:
● Bachelor's degree in Computer Science, Artificial Intelligence, Machine Learning, or a related field, or equivalent hands-on experience.
● Minimum 2 years of hands-on experience in software development or applied machine learning.
● Programming (preferred): Python, JavaScript.
● Voice AI: ElevenLabs, Twilio; understanding of ASR (Automatic Speech Recognition) and NLU (Natural Language Understanding).
● Automation/Integration: n8n (or Make.com, Zapier, Activepieces), API integration (RESTful APIs, webhooks), JSON.
● Proficiency in Azure AI services, including Azure OpenAI (GPT-4, Codex, etc.) and Azure Machine Learning for model development and deployment.
● Proven experience with LLM APIs (OpenAI, Azure OpenAI, Gemini, Claude, etc.).
● Solid hands-on experience in building and deploying RAG pipelines.
● Proficiency in Python and strong knowledge of Python ecosystems and libraries.
● Familiarity with core GenAI frameworks: LangChain, LangGraph, LlamaIndex, etc.
● Experience with vector databases: FAISS, Milvus, Azure AI Search, etc.
● Practical knowledge of embeddings, model registries (e.g., Hugging Face), and LLM APIs.
● Experience in prompt engineering, tool/function calling, and structured outputs (Pydantic/JSON Schema).
● Exposure to LLM observability tools: LangSmith, Langfuse, etc.
● Strong Git, API, and cloud platform (AWS, GCP, Azure) experience.

Job Type: Full-time
Pay: ₹25,000.00 - ₹40,000.00 per month
Schedule: Day shift, Monday to Friday
Experience: GenAI Engineer: 2 years (Preferred)
Work Location: In person
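For context, the retrieval step of the RAG pipelines this role mentions reduces to: embed the query, rank stored chunks by similarity, and pass the top hits to the LLM as context. A hedged, dependency-free sketch follows; the toy bag-of-words "embeddings" and the sample documents stand in for a real embedding model and a vector database such as FAISS or Milvus.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding'; a real pipeline calls an embedding model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, chunks: list[str], k: int = 2) -> list[str]:
    """Rank stored chunks by similarity to the query (vector-DB stand-in)."""
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]

docs = [
    "Reset your password from the account settings page.",
    "Invoices are emailed on the first day of each month.",
    "Contact support to reset a forgotten password.",
]
top = retrieve("how do I reset my password", docs)
# `top` would then be injected into the LLM prompt as grounding context.
print(top)
```

Production systems replace each stand-in here (tokenized counts with dense embeddings, the sorted list with an approximate nearest-neighbor index), but the control flow is the same.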

Posted 6 days ago

Apply

6.0 years

15 - 18 Lacs

Indore

On-site

Location: Indore
Experience: 6+ years
Work Type: Hybrid
Notice Period: 0-30 days (joiners)

We are hiring for a Digital Transformation Consulting firm that specializes in advisory and implementation of AI, Automation, and Analytics strategies for healthcare providers. The company is headquartered in NJ, USA, and its India office is in Indore, MP.

Job Description: We are seeking a highly skilled Tech Lead with expertise in database management, data warehousing, and ETL pipelines to drive the data initiatives in the company. The ideal candidate will lead a team of developers, architects, and data engineers to design, develop, and optimize data solutions. This role requires hands-on experience in database technologies, data modeling, ETL processes, and cloud-based data platforms.

Key Responsibilities:
● Lead the design, development, and maintenance of scalable database, data warehouse, and ETL solutions.
● Define best practices for data architecture, modeling, and governance.
● Oversee data integration, transformation, and migration strategies.
● Ensure high availability, performance tuning, and optimization of databases and ETL pipelines.
● Implement data security, compliance, and backup strategies.

Required Skills & Qualifications:
● 6+ years of experience in database and data engineering roles.
● Strong expertise in SQL, NoSQL, and relational database management systems (RDBMS).
● Hands-on experience with data warehousing technologies (e.g., Snowflake, Redshift, BigQuery).
● Deep understanding of ETL tools and frameworks (e.g., Apache Airflow, Talend, Informatica).
● Experience with cloud data platforms (AWS, Azure, GCP).
● Proficiency in programming/scripting languages (Python, SQL, Shell scripting).
● Strong problem-solving, leadership, and communication skills.

Preferred Skills (Good to Have):
● Experience with big data technologies (Hadoop, Spark, Kafka).
● Knowledge of real-time data processing.
● Exposure to AI/ML technologies and working with ML algorithms.

Job Types: Full-time, Permanent
Pay: ₹1,500,000.00 - ₹1,800,000.00 per year
Schedule: Day shift
Application Question(s): We must fill this position urgently. Can you start immediately? Have you held a lead role in the past?
Experience: Extract, Transform, Load (ETL): 6 years (Required); Python: 5 years (Required); big data technologies (Hadoop, Spark, Kafka): 6 years (Required); Snowflake: 6 years (Required); Data warehouse: 6 years (Required)
Location: Indore, Madhya Pradesh (Required)
Work Location: In person
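The ETL responsibilities in this role follow the standard extract-transform-load pattern. A hedged, minimal sketch of that pattern (the source records are invented healthcare-billing examples, and sqlite3 stands in for a real warehouse such as Snowflake or Redshift):

```python
import sqlite3

# Extract: hypothetical raw records pulled from a source system.
raw = [
    {"patient_id": "P1", "visit_cost": "1250.50"},
    {"patient_id": "P2", "visit_cost": "N/A"},   # dirty record
    {"patient_id": "P3", "visit_cost": "980.00"},
]

def transform(rows):
    """Cast cost to float and drop rows that fail validation."""
    out = []
    for r in rows:
        try:
            out.append((r["patient_id"], float(r["visit_cost"])))
        except ValueError:
            continue  # in production, route to a dead-letter table instead
    return out

def load(rows, conn):
    """Load cleaned rows into the warehouse table (sqlite3 stand-in)."""
    conn.execute("CREATE TABLE IF NOT EXISTS visits (patient_id TEXT, cost REAL)")
    conn.executemany("INSERT INTO visits VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(raw), conn)
print(conn.execute("SELECT COUNT(*), SUM(cost) FROM visits").fetchone())
```

A tool like Apache Airflow would schedule each of these stages as a task and handle retries and backfills; the core logic per stage looks much like the functions above.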

Posted 6 days ago

Apply

1.0 years

1 - 3 Lacs

Ahmedabad

On-site

Profile: Junior Android Developer
Experience: 1 to 3 years
Skills: Android App Development, Java & Kotlin, Android SDK, UI/UX Optimization, RESTful APIs Integration, Bug Fixing & Code Maintenance, Google Play Store Deployment, Offline Storage & Threading, Git / SVN / Mercurial, Third-Party Libraries Integration, Push Notifications & Cloud Messaging, Performance Tuning, Strong English Communication
Salary: Up to ₹30k
Location: Ahmedabad
Apply Now: career.itjobsvale@gmail.com, +91 7211188810
Job Type: Full-time
Pay: ₹15,000.00 - ₹30,000.00 per month
Work Location: In person

Posted 6 days ago

Apply

0 years

4 - 8 Lacs

Ahmedabad

On-site

Are you passionate and driven enough to fill our house? We've got first-class rooms … your challenge is to fill them. Then why not come and join us at the Radisson Hotel Group to Make Every Moment Matter, where our guests can relax and enjoy the experience!

Our Reservations Team are natural organizers: sales-driven, with keen attention to detail, and totally tuned in to guests' needs. They are first class and strive to deliver a hospitality experience that is beyond expectation, creating memorable moments for our guests. As Revenue Manager, you will join a team that is passionate about delivering exceptional service, where we believe that anything is possible whilst having fun in all that we do! Interested? Then why not say Yes I Can! We are looking for passionate people just like you!

Key Responsibilities of the Revenue Manager:
● Supports the smooth running of the revenue management department, where all aspects of the hotel's reservations and meeting & events enquiries are managed and handled.
● Works proactively to maximize guest satisfaction and comfort, delivering a positive and responsive approach to enquiries and problem resolution.
● Develops and implements strategies where key revenue management metrics are identified, communicated, and delivered.
● Effectively manages the life cycle of the team within the department, fostering a culture of growth, development, and performance.
● Responsible for the departmental budget, ensuring that costs and inventory are controlled and that productivity and performance levels are attained.
● Builds and maintains effective working relationships with all key stakeholders.
● Reviews and scrutinizes the business performance, providing recommendations that will drive financial performance.
● Ensures adherence and compliance to all legislation, where due diligence requirements and best-practice activities are planned, delivered, and documented for internal and external audit, performing follow-up as required.

Requirements of the Revenue Manager:
● Proven experience in revenue management with excellent problem-solving capabilities.
● Excellent managerial skills with a hands-on approach and lead-by-example work style.
● Commitment to exceptional guest service with a passion for the hospitality industry.
● Ability to find creative solutions, offering advice and recommendations.
● Personal integrity, with the ability to work in an environment that demands excellence, time, and energy.
● Experience in using IT systems on various platforms.
● Strong communication skills.

CAREERS: Join us in our mission to make every moment matter for our guests and be part of the most inspired hotel company in the world. At Radisson Hotel Group we believe that people are our number one asset. As one of the world's largest hotel companies, we are always looking for great people to join our team. If this sounds like an ambition you share, then start with us. To find out more about the Radisson Hotel Group, our Culture and Beliefs, visit us at careers.radissonhotels.com.

Posted 6 days ago

Apply

0 years

7 - 8 Lacs

Vadodara

On-site

Summary of the position: The MySQL Database Administrator (DBA) will be based in our PMC India office in Vadodara, task-managed and supervised by the Database Service Manager. You will be responsible for carrying out maintenance and support of multiple enterprise, mission-critical database servers: resolution of incidents and problems, and root cause analysis (RCA) leading to recommending and performing change activities concerning the databases and interfaced applications. You will be expected to perform, as required, various ad hoc database project activities. The role extends to defining and operating scheduled housekeeping activities; defining, recommending, and implementing monitoring and alerting processes; and supporting solution architects and developers on test, UAT, and production environments. This DBA will participate in a 24/7 out-of-hours (OOH) schedule including bank holidays; as the team grows, this will move to an on-the-desk 24/7 and bank-holiday standard shift pattern. Your passion for delivering a high degree of customer service, technical expertise, diligence, and timeliness is vital. As a DBA, you will need to be articulate, advocating accurate and comprehensive solutions to system problems and requirements. You will work as part of a small team of off-shore DBAs to implement effective 24/7 support, monitoring, and alerting services utilising our PMC India office in Vadodara.

Key Accountabilities:
● Provide reactive support, adhering to fast response and resolution deadlines, in the event of an unplanned interruption to the customer's provided services. Support services are defined as any application which has a dependency on a database.
● Lead the resolution of incidents raised as part of the PMC resolver group, adhering to PMC's contractual obligations regarding SLA performance.
● Provide daily database administration activities, including but not limited to:
  ○ Housekeeping, including the creation, implementation, and ongoing maintenance of maintenance plans for the efficient running of a database and any associated application with a dependency on the database.
  ○ Monitoring and alerting of MySQL database instances: the review, creation, implementation, and maintenance of monitoring parameters enabling proactive database monitoring and, importantly, issue prevention.
  ○ Operating system and application configuration recommendations for optimising the supported databases to maximise effective and efficient operation.
  ○ Database replication, best practices, and support of existing operational systems.
  ○ Database backup and DR processes: to create, maintain, and monitor.
● Perform change management activities, including but not limited to:
  ○ Change assessments for all database-related changes.
  ○ Supporting the customer in change testing.
  ○ Deploying and rolling back all database changes for projects and BAU fixes on the production databases.
● Provide project-based activities at agreed scheduled times, including but not limited to:
  ○ Developing and modifying any database objects as required by the project.
  ○ Upgrading databases to newer versions.
  ○ Reviewing database scripts written by developers.
  ○ Advising on peripheral OS configurations or capacity parameters as appropriate.
  ○ Designing database schemas in coordination with the customer's data architecture principles and in cooperation with the customer's data architect, including any implementation or upgrade of database platforms.

Skills and Experience | Essential:
● Good experience as a MySQL DBA in installation, configuration, and upgrading of MySQL Server.
● Ensure integrity, availability, and performance of MySQL database systems by providing technical support and maintenance.
● Knowledge of MySQL database architecture.
● Implement and maintain database security (create and maintain users and roles, assign privileges).
● Perform troubleshooting and maintenance of multiple databases.
● Monitor databases regularly to check for any errors such as existing locks and failed updates.
● Good experience in managing regular backups, recovery, and point-in-time recovery (PITR) of databases.
● Oversee file-system alerts and the utilization of data and log files.
● Responsible for regular maintenance on databases.
● Proficiency in MySQL database performance tuning and optimization (query optimization, indexing, etc.).
● Experience with data movement utilities such as export/load/dump.
● Identify and recommend database best practices to support business needs.
● Shell scripting knowledge for automation tasks and monitoring alerts.
● Able to support a 24/7 rotation.

Skills and Experience | Desirable:
● MySQL certification (e.g., Oracle Certified MySQL Database Administrator).
● Experience and working knowledge of other RDBMS systems, especially AWS Aurora.
● Exposure to SQL Azure.
● Exposure to Linux operating systems.
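The backup and PITR duties listed above typically combine a full logical dump with binary-log replay up to a target timestamp. A hedged sketch follows: the host paths, database name, and timestamp are hypothetical, exact `mysqldump` flag names vary by MySQL version, and the commands are only assembled as strings, not executed.

```python
import shlex

def full_backup_cmd(db: str, outfile: str) -> str:
    """Full logical dump; --single-transaction avoids locking InnoDB tables."""
    return (f"mysqldump --single-transaction "
            f"{shlex.quote(db)} > {shlex.quote(outfile)}")

def pitr_replay_cmd(binlog: str, stop: str, db: str) -> str:
    """Replay binary-log events up to the moment just before the failure."""
    return (f"mysqlbinlog --stop-datetime={shlex.quote(stop)} "
            f"{shlex.quote(binlog)} | mysql {shlex.quote(db)}")

# Recovery plan: restore the last full dump, then roll forward with the binlog.
print(full_backup_cmd("appdb", "/backup/appdb.sql"))
print(pitr_replay_cmd("/var/lib/mysql/binlog.000042", "2024-01-15 09:59:00", "appdb"))
```

An automation script in this role would wrap commands like these with scheduling, success checks, and alerting rather than printing them.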

Posted 6 days ago

Apply

12.0 years

0 Lacs

Noida

On-site

About Aeris: For more than three decades, Aeris has been a trusted cellular IoT leader enabling the biggest IoT programs and opportunities across Automotive, Utilities and Energy, Fleet Management and Logistics, Medical Devices, and Manufacturing. Our IoT technology expertise serves a global ecosystem of 7,000 enterprise customers, 30 mobile network operator partners, and 80 million IoT devices across the world. Aeris powers today's connected smart world with innovative technologies and borderless connectivity that simplify management, enhance security, optimize performance, and drive growth. Built from the ground up for IoT and road-tested at scale, Aeris IoT Services are based on the broadest technology stack in the industry, spanning connectivity up to vertical solutions. As veterans of the industry, we know that implementing an IoT solution can be complex, and we pride ourselves on making it simpler. Our company is in an enviable spot: we're profitable, and both our bottom line and our global reach are growing rapidly. We're playing in an exploding market where technology evolves daily and new IoT solutions and platforms are being created at a fast pace.

A few things to know about us:
● We put our customers first. When making decisions, we always seek to do what is right for our customer first, our company second, our teams third, and ourselves last.
● We do things differently. As a pioneer in a highly competitive industry that is poised to reshape every sector of the global economy, we cannot fall back on old models. Rather, we must chart our own path and strive to out-innovate, out-learn, out-maneuver, and out-pace the competition on the way.
● We walk the walk on diversity. We're a brilliant and eclectic mix of ethnicities, religions, industry experiences, sexual orientations, generations, and more, and that's by design. We see diverse perspectives as a core competitive advantage.
● Integrity is essential.
We believe in doing things well – and doing them right. Integrity is a core value here: you’ll see it embodied in our staff, our management approach and growing social impact work (we have a VP devoted to it). You’ll also see it embodied in the way we manage people and our HR issues: we expect employees and managers to deal with issues directly, immediately and with the utmost respect for each other and for the Company. We are owners. Strong managers enable and empower their teams to figure out how to solve problems. You will be no exception, and will have the ownership, accountability and autonomy needed to be truly creative. Job Title: Senior Oracle Database Administrator (DBA) – GCP Location: Noida, India We are seeking a highly skilled and experienced Senior Oracle DBA to manage and maintain our critical Oracle 12c, 18c, 19c, 21c single instance with DG and RAC databases, hosted on Google Cloud Platform (GCP). The ideal candidate will possess deep expertise in Oracle database administration, including installation, configuration, patching, performance tuning, security, and backup/recovery strategies within a cloud environment. They will also have expertise and experience optimizing the underlying operating system and database parameters for maximum performance and stability. Responsibilities: Database Administration: Install, configure, and maintain Oracle 12c, 18c, 19c, 21c single instance with DG and RAC databases on GCP Compute Engine. Implement and manage Oracle Data Guard for high availability and disaster recovery, including switchovers, failovers, and broker configuration. Perform database upgrades, patching, and migrations. Develop and implement backup and recovery strategies, including RMAN configuration and testing. Monitor database performance and proactively identify and resolve performance bottlenecks. Troubleshoot database issues and provide timely resolution. 
Implement and maintain database security measures, including user access control, auditing, and encryption. Automate routine database tasks using scripting languages (e.g., Shell, Python, PL/SQL). Create and maintain database documentation. Database Parameter Tuning: In-depth knowledge of Oracle database initialization parameters and their impact on performance, with a particular focus on memory management parameters. Expertise in tuning Oracle memory structures (SGA, PGA) for optimal performance in a GCP environment. This includes: Precisely sizing the SGA components (Buffer Cache, Shared Pool, Large Pool, Java Pool, Streams Pool) based on workload characteristics and available GCP Compute Engine memory resources. Optimizing PGA allocation (PGA_AGGREGATE_TARGET, PGA_AGGREGATE_LIMIT) to prevent excessive swapping and ensure efficient SQL execution. Understanding the interaction between SGA and PGA memory regions and how they are affected by GCP instance memory limits. Tuning the RESULT_CACHE parameters for optimal query performance, considering the available memory and workload patterns. Proficiency in using Automatic Memory Management (AMM) and Automatic Shared Memory Management (ASMM) features and knowing when manual tuning is required for optimal results. Knowledge of how GCP instance memory limits can impact Oracle's memory management and the appropriate adjustments to make. Experience with analysing AWR reports and identifying areas for database parameter optimization, with a strong emphasis on identifying memory-related bottlenecks (e.g., high buffer busy waits, excessive direct path reads/writes). Proficiency in tuning SQL queries using tools like SQL Developer and Explain Plan, particularly identifying queries that consume excessive memory or perform inefficient memory access patterns. Knowledge of Oracle performance tuning methodologies and best practices, specifically as they apply to memory management in a cloud environment. 
Experience with database indexing strategies and index optimization, understanding the impact of indexes on memory utilization. Solid understanding of Oracle partitioning and its benefits for large databases, including how partitioning can affect memory usage and query performance. Ability to perform proactive performance tuning based on workload analysis and trending, with a focus on memory usage patterns and potential memory-related performance issues. Expertise in diagnosing and resolving memory leaks or excessive memory consumption issues within the Oracle database. Deep understanding of how shared memory segments are managed within the Linux OS on GCP Compute Engine and how to optimize them for Oracle. Data Guard Expertise: Deep understanding of Oracle Data Guard architectures (Maximum Performance, Maximum Availability, Maximum Protection). Expertise in configuring and managing Data Guard broker for automated switchovers and failovers. Experience in troubleshooting Data Guard issues and ensuring data consistency. Knowledge of Data Guard best practices for performance and reliability. Proficiency in performing Data Guard role transitions (switchover, failover) with minimal downtime. Experience with Active Data Guard is a plus. Operating System Tuning: Deep expertise in Linux operating systems (e.g., Oracle Linux, Red Hat, CentOS) and their interaction with Oracle databases. Performance tuning of the Linux operating system for optimal Oracle database performance, including: Kernel parameter tuning (e.g., shared memory settings, semaphores, file descriptor limits). Memory management optimization (e.g., HugePages configuration). I/O subsystem tuning (e.g., disk scheduler selection, filesystem optimization). Network configuration optimization (e.g., TCP/IP parameters). Monitoring and analysis of OS performance metrics using tools like vmstat, iostat, top, and sar. Identifying and resolving OS-level resource contention issues (CPU, memory, I/O). 
Good to Have:
GCP Environment Management:
Provision and manage GCP Compute Engine instances for Oracle databases, including selecting appropriate instance types and storage configurations.
Configure and manage GCP networking components (VPCs, subnets, firewalls) for secure database access.
Utilize GCP Cloud Monitoring and Logging for database monitoring and troubleshooting.
Implement and manage GCP Cloud Storage for database backups.
Experience with Infrastructure as Code (IaC) tools like Terraform or Cloud Deployment Manager to automate GCP resource provisioning.
Cost optimization of Oracle database infrastructure on GCP.
Other Products and Platforms:
Experience with other cloud platforms (AWS, Azure).
Experience with NoSQL databases.
Experience with Agile development methodologies.
Experience with DevOps practices and tools (e.g., Ansible, Chef, Puppet).
Experience with GoldenGate.
Qualifications:
Bachelor's degree in Computer Science or a related field.
Minimum 12+ years of experience as an Oracle DBA.
Proven experience managing Oracle 12c, 18c, 19c, and 21c single-instance (with Data Guard) and RAC databases in a production environment, with strong Data Guard expertise.
Extensive experience with Oracle database performance tuning, including OS-level and database parameter optimization.
Hands-on experience with Oracle databases hosted on Google Cloud Platform (GCP).
Strong understanding of Linux operating systems.
Excellent troubleshooting and problem-solving skills.
Strong communication and collaboration skills.
Oracle Certified Professional (OCP) certification is highly preferred.
GCP certifications (e.g., Cloud Architect, Cloud Engineer) are a plus.
Aeris may conduct background checks to verify the information provided in your application and assess your suitability for the role. The scope and type of checks will comply with the applicable laws and regulations of the country where the position is based.
Additional detail will be provided via the formal application process. Aeris walks the walk on diversity. We’re a brilliant mix of varying ethnicities, religions, cultures, sexual orientations, gender identities, ages and professional/personal/military experiences – and that’s by design. Diverse perspectives are essential to our culture, innovative process and competitive edge. Aeris is proud to be an equal opportunity employer.

Posted 6 days ago

Apply

5.0 years

0 Lacs

Noida

On-site

At ATS HomeKraft, we are continuously growing and expanding our portfolio; as a result, we are always on the lookout for talented candidates to strengthen our team of passionate people. If you share the same passion as we do, we would like to hear from you! Send us your info today at careers@homekraft.in Supply Chain Management (Contracts) Reports To: Assistant General Manager | Job Level: DM & above | Position Type: Full time | Location: Noida, India Position Summary: The main responsibilities for this role include preparation of BOQs for all specialized works, floating tenders and receiving offers, cross-verification, preparation of comparative statements, etc. Minimum Requirements: Min. 5 years in contracts, preferably with a construction or real estate company Experience finalizing contracts for civil, finishing, services, and infrastructure works Should have proficiency in English, written and spoken. Work order preparation in ERP & SAP. Excellent knowledge of Excel and PowerPoint. Key Responsibilities: Handling the tendering process for all specialized works in residential & commercial projects. Preparation of contract documents as per the nature of the job. Providing an updated tracker on a daily basis. Vendor management and introduction of new agencies to obtain the best commercials. Settlement of extra items and claims in respect of the project. Acting as a bridge between management & vendors. Following a value engineering approach. Site visits as required. Attending project review meetings. Negotiating rates for the work package for which the tender is floated. Preparing the comparison statement that defines rates. Evaluating vendor performance at pre-decided time intervals. Preparing evaluation reports for senior management. Work order preparation in ERP Farvision. Providing MIS inputs to senior management on a monthly basis. Coordination with the design department and site for fine-tuning the tender document.
Key Performance Measures: Contract documents and work orders. Rate analysis. Award of work (issue of work order). Homekraft values its highly talented employees and offers the ideal climate for innovative, motivated, and proactive individuals with diverse backgrounds. About ATS Homekraft ATS Homekraft is a modern take on real estate. We do not just build houses but deliver homes that are value for money and encapsulate the needs of a modern family. Being an ATS company, we take pride in the quality that we deliver. One of our primary focus areas is quality of construction, backed by the vision of creating value for the prospective homeowner. The team comprises highly experienced professionals, and our systems and SOPs are designed to create complete transparency in all transactions. Please find us at www.homekraft.in https://www.facebook.com/HomeKraftInfra/, https://www.linkedin.com/company/homekraft-infra/ Email at: careers@homekraft.in

Posted 6 days ago

Apply

1.0 years

7 Lacs

Noida

On-site

WAF (Radware) L1 and L2 Analyst Location: Noida, India Experience Required: L1: 1–3 years L2: 3–5 years Job Type: Full-Time / On-site / Hybrid Key Responsibilities: L1 Responsibilities: Monitor Web Application Firewall (Radware) alerts and logs. Perform initial triage and basic troubleshooting of security incidents. Escalate complex issues to the L2 team with detailed incident documentation. Regularly check policy violations and suggest tuning recommendations. Perform health checks of WAF systems and ensure uptime. Maintain shift handover logs and ensure seamless communication. L2 Responsibilities: Manage WAF policy configurations and rule tuning for Radware WAF. Analyze web traffic and logs to detect and mitigate application-layer attacks (OWASP Top 10). Collaborate with application and network teams to implement protection strategies. Conduct RCA (Root Cause Analysis) of incidents and fine-tune policies to reduce false positives. Lead WAF upgrades, patching, and performance tuning. Provide mentorship and support to L1 analysts. Skills & Qualifications: Strong understanding of Radware AppWall or equivalent WAF platforms. Familiarity with HTTP/HTTPS, SSL certificates, DNS, load balancers, and web servers. Experience with threat analysis and mitigation of SQLi, XSS, CSRF, etc. Working knowledge of ITIL processes and incident management tools (like ServiceNow). Hands-on experience with packet analysis tools (e.g., Wireshark) is a plus. Security certifications such as CEH, CompTIA Security+, or vendor-specific Radware certifications (preferred). Job Type: Full-time Pay: Up to ₹700,000.00 per year Schedule: Day shift Supplemental Pay: Performance bonus Application Question(s): How many years of experience in WAF (Radware)? Are you an Immediate Joiner? Experience: minimum: 1 year (Required) Location: Noida, Uttar Pradesh (Required) Work Location: In person

Posted 6 days ago

Apply

0 years

3 Lacs

Calcutta

Remote

What You'll Do Build AI/ML technology stacks from concept to production, including data pipelines, model training, and deployment. Develop and optimize Generative AI workflows, including prompt engineering, fine-tuning (LoRA, QLoRA), retrieval-augmented generation (RAG), and LLM-based applications. Work with Large Language Models (LLMs) such as Llama, Mistral, and GPT, ensuring efficient adaptation for various use cases. Design and implement AI-driven automation using agentic AI systems and orchestration frameworks like Autogen, LangGraph, and CrewAI. Leverage cloud AI infrastructure (AWS, Azure, GCP) for scalable deployment and performance tuning. Collaborate with cross-functional teams to deliver AI-driven solutions. Job Types: Part-time, Contractual / Temporary Contract length: 2 months Pay: From ₹25,000.00 per month Expected hours: 40 per week Schedule: Day shift Work Location: Remote

Posted 6 days ago

Apply

2.0 - 7.0 years

8 Lacs

Jaipur

On-site

JPLoft is offering a Senior Blockchain Developer job in Jaipur. You'll oversee all Blockchain project development and implementation, ensuring top-quality solutions through rigorous testing. This role also involves market research on new Blockchain technologies and trends. You'll coordinate with cross-functional teams, define technical specifications, and mentor junior developers, driving our innovative Blockchain initiatives. Key Responsibilities: Leadership and Team Management: Lead and mentor a team of mobile application developers, fostering a collaborative and innovative environment to drive excellence in development practices. Mobile Application Development: Spearhead the design, development, and deployment of high-quality blockchain ecosystems utilizing best practices and the latest technologies. Strategic Planning and Execution: Develop and execute strategic plans for mobile application development, aligning with company objectives and market trends to maintain a competitive edge. Innovation and Continuous Improvement: Drive innovation in blockchain application development processes, staying abreast of emerging technologies and industry trends to incorporate new features and functionalities into our applications. Quality Assurance and Performance Optimization: Implement rigorous testing methodologies to ensure the reliability, security, and performance of blockchain applications, optimizing them for speed, scalability, and user satisfaction. Collaboration and Communication: Collaborate closely with cross-functional teams including product managers, designers, and QA engineers to ensure seamless integration of mobile applications with other systems and services. Research and Development: Conduct research and experimentation to explore new technologies and methodologies that can enhance our mobile applications, contributing to the evolution of our product offerings.
Documentation and Compliance: Ensure thorough documentation of mobile application development processes, adhering to industry standards, regulatory requirements, and best practices. Troubleshooting and Support: Provide technical support and troubleshooting expertise for mobile applications, addressing issues promptly to maintain optimal user experience and satisfaction. Performance Monitoring and Analysis: Monitor the performance and usage metrics of mobile applications, conducting data analysis to identify opportunities for optimization and enhancement. Requirements & Qualifications: Bachelor's/Master's degree in Computer Science, Information Technology, or a related field. Proficient in Blockchain technologies, including Ethereum, Hyperledger, Corda, and Solidity. In-depth understanding of Smart Contracts, Cryptography, and Consensus Algorithms. Ability to work independently and manage multiple tasks simultaneously. Excellent communication and teamwork skills. Experience & Exposure: Total Experience: 2 - 7 years of experience in Blockchain development, with a proven track record in developing and implementing Blockchain projects. Golang experience. Verifiable Credentials - DID, IPFS, Blockchain Fabric, Hyperledger Indy, Aries. Design and implement blockchain-based solutions using Hyperledger Fabric. Should have Hyperledger Fabric-based Blockchain implementation experience, chaincode creation, and NodeJS-based API creation. Deep knowledge of smart contracts including testing, implementation, and scaling. Must have experience with the development of RESTful web services. Knowledge & experience of Python. Must have experience with database design and development. Significant experience with MongoDB, PostgreSQL, MySQL and GraphQL. Deployment to production in AWS Managed Blockchain in a private cloud. Operating knowledge of AWS. Strong enthusiasm for technology, staying up to date on current tools and best practices around development and software management.
Experience using Docker to containerize applications. Knowledge & experience of Python. Knowledge & experience of Golang. Knowledge & experience of RabbitMQ (message queuing agent). Knowledge & experience of AWS Lambda functions. Software Development & Engineering Experience: Knowledge of architectural design patterns, performance tuning, database and functional designs. Hands-on experience in Service Oriented Architecture. Ability to lead solution development and delivery for the designed solutions. Experience in designing high-level and low-level documents is a plus. A good understanding of SDLC is a prerequisite. Awareness of the latest technologies and trends. Logical thinking and problem-solving skills along with an ability to collaborate. Should be able to use design patterns to make the application reliable, scalable, and highly available. Should be able to design Microservices and Serverless based architecture. Work with client architects and define top-notch solutions. Perks & Benefits You Can Count On We offer more than just a typical work experience: benefits and perks designed to support your job and life. Celebration Time-Off Play Zone & Cafeteria Open Culture Competitive Salary On-Site Medical Room Flexible Leave Policies Festival & Birthday Celebrations Dedicated Pantry Area Wellness Programs Training Sessions Learning & Development Performance Rewards Work-Life Balance Support Culture of Appreciation Welcoming Onboarding Friendly Work Environment Why You’ll Love Working at JPLoft? Innovation at Our Core We thrive on fresh ideas and bold thinking. Your creativity won’t just be welcomed—it’ll be celebrated, challenged, and transformed into real-world solutions that make a difference. Grow at Your Own Pace We’re all about leveling up. Whether it’s new skills, leadership opportunities, or exciting projects, you’ll find plenty of ways to push your boundaries and grow. A Team That Feels Like Family Collaboration is key and so is fun.
Join a supportive crew that cheers your wins, backs you through challenges, and makes every workday feel like a shared adventure. Work-Life Balance That Works We get it, life happens outside the office. That’s why we support a healthy work-life balance, so you can be your best self, both on and off the clock. Meaningful Work That Matters Your job here isn’t just a paycheck, it’s a chance to make an impact. Help us build innovative solutions that improve lives and shape the future. Perks That Put a Smile From team events to wellness programs and thoughtful benefits, we take care of our people. Because when you’re happy, great things happen.

Posted 6 days ago

Apply

2.0 years

5 - 7 Lacs

Visakhapatnam

On-site

Role Overview: We are seeking a talented and detail-oriented Database Developer with 2+ years of experience to design, develop, and maintain scalable database solutions. The ideal candidate should have a strong command of SQL and be experienced in writing efficient queries and stored procedures, and in working with data models to support application and reporting needs. Key Responsibilities: Write and optimize SQL queries, stored procedures, functions, views, and triggers. Design and maintain normalized and denormalized data models. Develop and maintain ERP processes. Analyze existing queries for performance improvements and suggest indexing strategies. Work closely with application developers and analysts to understand data requirements. Ensure data integrity and consistency across development, staging, and production environments. Create and maintain technical documentation related to database structures, processes, and queries. Generate and support custom reports and dashboards (using tools such as Superset). Participate in data migration and integration efforts between systems or platforms. Work with large datasets and ensure optimal data processing and storage. Required Skills: Strong hands-on experience with SQL Server, MySQL, or PostgreSQL. Proficiency in writing complex SQL queries, stored procedures, and data transformations. Understanding of relational database concepts, data modeling, and indexing. Knowledge of performance tuning techniques (joins, temp tables, query plans). Familiarity with ERP tools or scripting. Preferred Qualifications: Bachelor's degree in Computer Science, Information Systems, or a related field. MS SQL; .NET knowledge is good to have. Knowledge of WMS or MEW, or manufacturing ERP experience.
Knowledge of basic database security, transactions, and locking mechanisms. Exposure to cloud-based databases. Experience with version control (Git), Agile methodologies, or similar tools. Nice to Have: Experience working in domains like retail, supply chain, warehouse, healthcare, or e-commerce. Send resume to: sowmya.chintada@inventrax.com (or) janardhan.tanakala@inventrax.com Job Types: Full-time, Permanent Pay: ₹500,000.00 - ₹700,000.00 per year Benefits: Provident Fund Experience: total work: 3 years (Preferred) Location: Visakhapatnam, Andhra Pradesh (Preferred) Work Location: In person
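The indexing and query-plan skills this role asks for can be sketched with a small, self-contained example. SQLite (via Python's standard library) stands in here purely for illustration; the `orders` table and its columns are made up for the demo:

```python
# Shows how adding an index changes a query plan - a core tuning skill.
# SQLite is used only because it ships with Python; the schema is hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(1000)])

def plan(sql: str) -> str:
    # EXPLAIN QUERY PLAN rows are (id, parent, notused, detail); join the details
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
print(plan(query))  # without an index: a full table scan, e.g. "SCAN orders"

conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
print(plan(query))  # now the plan reports "... USING INDEX idx_orders_customer"
```

The same workflow applies to SQL Server, MySQL, or PostgreSQL with their respective `EXPLAIN` / execution-plan tools; only the plan syntax differs.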

Posted 6 days ago

Apply

2.0 years

0 Lacs

Andhra Pradesh

On-site

We are seeking an experienced and innovative Generative AI Developer to join our AWAC team. In this role, you will lead the design and development of GenAI and Agentic AI applications using state-of-the-art LLMs and AWS native services. You will work on both R&D-focused proofs of concept and production-grade implementations, collaborating with cross-functional teams to bring intelligent, scalable solutions to life. Key Responsibilities Design, develop, and deploy Generative AI and Agentic AI applications using LLMs such as Claude, Cohere, Titan, and others. Lead the development of proof-of-concept (PoC) solutions to explore new use cases and validate AI-driven innovations. Architect and implement retrieval-augmented generation (RAG) pipelines using LangChain and vector databases like OpenSearch. Integrate with AWS services including the Bedrock API, SageMaker, SageMaker JumpStart, Lambda, EKS/ECS, Amazon Connect, and Amazon Q. Apply few-shot, one-shot, and zero-shot learning techniques to fine-tune and prompt LLMs effectively. Collaborate with data scientists, ML engineers, and business stakeholders to translate complex requirements into scalable AI solutions. Implement CI/CD pipelines and infrastructure as code using Terraform, and follow DevOps best practices. Optimize performance, cost, and reliability of AI applications in production environments. Document architecture, workflows, and best practices to support knowledge sharing and onboarding. Required Skills & Technologies Experience in Python development, with at least 2 years in AI/ML or GenAI projects. Strong hands-on experience with LLMs and Generative AI frameworks. Proficiency in LangChain, vector DBs (e.g., OpenSearch), and prompt engineering. Deep understanding of the AWS AI/ML ecosystem: Bedrock, SageMaker, Lambda, EKS/ECS. Experience with serverless architectures, containerization, and cloud-native development. Familiarity with DevOps tools: Git, CI/CD, Terraform.
Strong debugging, performance tuning, and problem-solving skills. Preferred Qualifications Experience with Amazon Q, Amazon Connect, or Amazon Titan. Familiarity with Claude, Cohere, or other foundation models. Bachelor's or Master's degree in Computer Science, AI/ML, or a related field. Experience in building agentic workflows and multi-agent orchestration is a plus. About Virtusa Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
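The retrieval-augmented generation (RAG) pattern this role centers on can be reduced to a minimal sketch. The toy version below swaps the real components (an embedding model, OpenSearch as the vector store, a Bedrock-hosted LLM) for word-count vectors, a plain dict, and a prompt string; every document and name here is illustrative, not part of any real pipeline:

```python
# Minimal RAG flow: "embed" documents, retrieve the closest one to a question,
# and assemble an augmented prompt. Word-count Counters stand in for real
# embeddings; a vector database such as OpenSearch would replace the dict.
import math
from collections import Counter

def embed(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

docs = [
    "invoices are processed within three business days",
    "password resets require two factor verification",
]
index = {doc: embed(doc) for doc in docs}  # the "vector store"

def retrieve(question: str) -> str:
    q = embed(question)
    return max(index, key=lambda d: cosine(q, index[d]))

def build_prompt(question: str) -> str:
    # In a real pipeline this prompt would be sent to an LLM via the Bedrock API.
    return f"Answer using this context:\n{retrieve(question)}\n\nQuestion: {question}"

print(build_prompt("how long are invoices processed"))
```

A production version would replace `embed` with a real embedding model, `index` with OpenSearch k-NN queries, and the `print` with an LLM call, but the retrieve-then-prompt shape stays the same.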

Posted 6 days ago

Apply

3.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D&A) – SSIS – Senior We’re looking for Informatica or SSIS Engineers with a cloud background (AWS, Azure). Primary skills: Has played key roles in multiple large global transformation programs on business process management. Experience in database querying using SQL. Should have experience building/integrating data into a data warehouse. Experience in data profiling and reconciliation. Informatica PowerCenter / IBM DataStage / SSIS development. Strong proficiency in SQL/PLSQL. Good experience in performance tuning ETL workflows and suggesting improvements. Developed expertise in complex data management or application integration solutions and deployment in areas of data migration, data integration, application integration or data quality. Experience in data processing, orchestration, parallelization, transformations and ETL fundamentals. Leverages a variety of programming languages & data crawling/processing tools to ensure data reliability, quality & efficiency (optional). Experience with cloud data-related tools (Microsoft Azure, Amazon S3 or Data Lake). Knowledge of cloud infrastructure, and knowledge of Talend Cloud is an added advantage. Knowledge of data modelling principles. Knowledge of AutoSys scheduling. Good experience in database technologies. Good knowledge of Unix systems. Responsibilities: Need to work as a team member contributing to various technical streams of data integration projects.
Provide product- and design-level technical best practices. Interface and communicate with the onsite coordinators. Completion of assigned tasks on time and regular status reporting to the lead. Building a quality culture. Use an issue-based approach to deliver growth, market and portfolio strategy engagements for corporates. Strong communication, presentation and team-building skills, and experience in producing high-quality reports, papers, and presentations. Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint. Qualification: BE/BTech/MCA (must) with industry experience of 3-7 years. Experience in Talend jobs, joblets and custom components. Should have knowledge of error handling and performance tuning in Talend. Experience in big data technologies such as Sqoop, Impala, Hive, YARN, Spark etc. Informatica PowerCenter / IBM DataStage / SSIS development. Strong proficiency in SQL/PLSQL. Good experience in performance tuning ETL workflows and suggesting improvements. At least 3-4 clients on short-duration projects of 6-8+ months, or at least 2 clients on projects lasting 1-2 years or more. People with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 6 days ago

Apply

6.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Data Integration Specialist – Senior The opportunity We are seeking a talented and experienced Integration Specialist with 3–6 years of experience to join our growing Digital Integration team. The ideal candidate will play a pivotal role in designing, building, and deploying scalable and secure solutions that support business transformation, system integration, and automation initiatives across the enterprise. Your Key Responsibilities Work with clients to assess existing integration landscapes and recommend modernization strategies using MuleSoft. Translate business requirements into technical designs, reusable APIs, and integration patterns. Develop, deploy, and manage MuleSoft APIs and integrations on Anypoint Platform (CloudHub, Runtime Fabric, Hybrid). Collaborate with business and IT stakeholders to define integration standards, SLAs, and governance models. Implement error handling, logging, monitoring, and alerting using Anypoint Monitoring and third-party tools. Maintain integration artifacts and documentation, including RAML specifications, flow diagrams, and interface contracts. Ensure performance tuning, scalability, and security best practices are followed across integration solutions. Support CI/CD pipelines, version control, and DevOps processes for MuleSoft assets using platforms like Azure DevOps or GitLab. Collaborate with cross-functional teams (Salesforce, SAP, Data, Cloud, etc.) to deliver end-to-end connected solutions. Stay current with MuleSoft platform capabilities and industry integration trends to recommend improvements and innovations. 
Troubleshoot integration issues and perform root cause analysis in production and non-production environments. Contribute to internal knowledge-sharing, technical mentoring, and process optimization. Strong SQL, data integration and handling skills. Exposure to AI models and Python, and to using them in data cleaning/standardization. To qualify for the role, you must have 3–6 years of hands-on experience with MuleSoft Anypoint Platform and Anypoint Studio. Strong experience with API-led connectivity and reusable API design (System, Process, Experience layers). Proficient in DataWeave transformations, flow orchestration, and integration best practices. Experience with API lifecycle management including design, development, publishing, governance, and monitoring. Solid understanding of integration patterns (synchronous, asynchronous, event-driven, batch). Hands-on experience with security policies, OAuth, JWT, client ID enforcement, and TLS. Experience working with cloud platforms (Azure, AWS, or GCP) in the context of integration projects. Knowledge of performance tuning, capacity planning, and error handling in MuleSoft integrations. Experience in DevOps practices including CI/CD pipelines, Git branching strategies, and automated deployments. Experience with data intelligence cloud platforms like Snowflake, Azure, and Databricks. Ideally, you’ll also have MuleSoft Certified Developer or Integration Architect certification. Exposure to monitoring and logging tools (e.g., Splunk, Elastic, Anypoint Monitoring). Strong communication and interpersonal skills to work with technical and non-technical stakeholders. Ability to document integration requirements, user stories, and API contracts clearly and concisely. Experience in agile environments and comfort working across multiple concurrent projects. Ability to mentor junior developers and contribute to reusable component libraries and coding standards.
What Working At EY Offers At EY, we’re dedicated to helping our clients, from start–ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career. The freedom and flexibility to handle your role in a way that’s right for you. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 6 days ago

Apply

4.0 years

0 Lacs

Andhra Pradesh, India

On-site

Job Title: Data Engineer (4+ Years Experience) Location: Pan India Job Type: Full-Time Experience: 4+ Years Notice Period: Immediate to 30 days preferred Job Summary We are looking for a skilled and motivated Data Engineer with 4+ years of experience in building and maintaining scalable data pipelines. The ideal candidate will have strong expertise in AWS Redshift and Python/PySpark, with exposure to AWS Glue, Lambda, and ETL tools being a plus. You will play a key role in designing robust data solutions to support analytical and operational needs across the organization. Key Responsibilities Design, develop, and optimize large-scale ETL/ELT data pipelines using PySpark or Python. Implement and manage data models and workflows in AWS Redshift. Work closely with analysts, data scientists, and stakeholders to understand data requirements and deliver reliable solutions. Perform data validation, cleansing, and transformation to ensure high data quality. Build and maintain automation scripts and jobs using Lambda and Glue (if applicable). Ingest, transform, and manage data from various sources into cloud-based data lakes (e.g., S3). Participate in data architecture and platform design discussions. Monitor pipeline performance, troubleshoot issues, and ensure data reliability. Document data workflows, processes, and infrastructure components. Required Skills 4+ years of hands-on experience as a Data Engineer. Strong proficiency in AWS Redshift including schema design, performance tuning, and SQL development. Expertise in Python and PySpark for data manipulation and pipeline development. Experience working with structured and semi-structured data (JSON, Parquet, etc.). Deep knowledge of data warehouse design principles including star/snowflake schemas and dimensional modeling. Good To Have Working knowledge of AWS Glue and building serverless ETL pipelines. Experience with AWS Lambda for lightweight processing and orchestration.
Exposure to ETL tools like Informatica, Talend, or Apache NiFi. Familiarity with workflow orchestrators (e.g., Airflow, Step Functions). Knowledge of DevOps practices, version control (Git), and CI/CD pipelines. Preferred Qualifications Bachelor's degree in Computer Science, Engineering, or a related field. AWS certifications (e.g., AWS Certified Data Analytics, Developer Associate) are a plus.
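The validation-and-cleansing responsibility above can be sketched in a few lines. Plain Python stands in for PySpark here, and the record shape (`order_id`, `amount`, `ts`) is invented for the demo; in a real pipeline this logic would live in a PySpark job reading from S3 and loading into Redshift:

```python
# Toy cleansing pass over semi-structured records: normalise types, reject
# rows that fail validation, and keep the rejects for a data-quality report.
# The schema is hypothetical; PySpark/Redshift would replace this in practice.
from datetime import datetime

def clean(records):
    good, bad = [], []
    for r in records:
        try:
            cleaned = {
                "order_id": int(r["order_id"]),
                "amount": round(float(r["amount"]), 2),
                "ts": datetime.strptime(r["ts"], "%Y-%m-%d"),
            }
            if cleaned["amount"] < 0:
                raise ValueError("negative amount")
            good.append(cleaned)
        except (KeyError, ValueError) as exc:
            bad.append((r, str(exc)))  # quarantine rejects instead of dropping
    return good, bad

rows = [
    {"order_id": "1", "amount": "19.999", "ts": "2024-05-01"},
    {"order_id": "2", "amount": "-5", "ts": "2024-05-02"},   # rejected: negative
    {"order_id": "x", "amount": "3.0", "ts": "2024-05-03"},  # rejected: bad id
]
good, bad = clean(rows)
print(len(good), "clean,", len(bad), "rejected")  # 1 clean, 2 rejected
```

Keeping the rejected rows (rather than silently dropping them) is what lets a pipeline report data-quality metrics and replay fixes later.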

Posted 6 days ago

Apply

2.0 years

0 Lacs

Andhra Pradesh, India

On-site

We are seeking an experienced and innovative Generative AI Developer to join our AWAC team. In this role, you will lead the design and development of GenAI and Agentic AI applications using state-of-the-art LLMs and AWS-native services. You will work on both R&D-focused proof-of-concepts and production-grade implementations, collaborating with cross-functional teams to bring intelligent, scalable solutions to life.

Key Responsibilities:
Design, develop, and deploy Generative AI and Agentic AI applications using LLMs such as Claude, Cohere, Titan, and others.
Lead the development of proof-of-concept (PoC) solutions to explore new use cases and validate AI-driven innovations.
Architect and implement retrieval-augmented generation (RAG) pipelines using LangChain and vector databases like OpenSearch.
Integrate with AWS services including the Bedrock API, SageMaker, SageMaker JumpStart, Lambda, EKS/ECS, Amazon Connect, and Amazon Q.
Apply few-shot, one-shot, and zero-shot learning techniques to fine-tune and prompt LLMs effectively.
Collaborate with data scientists, ML engineers, and business stakeholders to translate complex requirements into scalable AI solutions.
Implement CI/CD pipelines and infrastructure as code using Terraform, and follow DevOps best practices.
Optimize performance, cost, and reliability of AI applications in production environments.
Document architecture, workflows, and best practices to support knowledge sharing and onboarding.

Required Skills & Technologies:
Experience in Python development, with at least 2 years in AI/ML or GenAI projects.
Strong hands-on experience with LLMs and Generative AI frameworks.
Proficiency in LangChain, vector DBs (e.g., OpenSearch), and prompt engineering.
Deep understanding of the AWS AI/ML ecosystem: Bedrock, SageMaker, Lambda, EKS/ECS.
Experience with serverless architectures, containerization, and cloud-native development.
Familiarity with DevOps tools: Git, CI/CD, Terraform.
Strong debugging, performance-tuning, and problem-solving skills.

Preferred Qualifications:
Experience with Amazon Q, Amazon Connect, or Amazon Titan.
Familiarity with Claude, Cohere, or other foundation models.
Bachelor's or Master's degree in Computer Science, AI/ML, or a related field.
Experience in building agentic workflows and multi-agent orchestration is a plus.
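The zero-/one-/few-shot prompting techniques named in the posting above amount to including zero, one, or several worked examples in the prompt. A minimal sketch (illustrative only; a real pipeline would send the resulting prompt to a model endpoint such as Bedrock):

```python
# Build a prompt with k in-context examples: k=0 is zero-shot,
# k=1 is one-shot, k>1 is few-shot.
def build_prompt(task, examples, query, k=0):
    parts = [f"Task: {task}"]
    for inp, out in examples[:k]:
        parts.append(f"Input: {inp}\nOutput: {out}")
    parts.append(f"Input: {query}\nOutput:")
    return "\n\n".join(parts)

examples = [
    ("I love this phone", "positive"),
    ("Battery died in a day", "negative"),
]
zero_shot = build_prompt("Classify sentiment", examples, "Great camera", k=0)
few_shot = build_prompt("Classify sentiment", examples, "Great camera", k=2)
print(few_shot)
```

The trade-off is context length versus accuracy: each added example consumes tokens but usually steers the model's output format and labels more reliably.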

Posted 6 days ago

Apply

6.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Data Integration Specialist – Senior

The opportunity:
We are seeking a talented and experienced Integration Specialist with 3–6 years of experience to join our growing Digital Integration team. The ideal candidate will play a pivotal role in designing, building, and deploying scalable and secure solutions that support business transformation, system integration, and automation initiatives across the enterprise.

Your Key Responsibilities:
Work with clients to assess existing integration landscapes and recommend modernization strategies using MuleSoft.
Translate business requirements into technical designs, reusable APIs, and integration patterns.
Develop, deploy, and manage MuleSoft APIs and integrations on Anypoint Platform (CloudHub, Runtime Fabric, Hybrid).
Collaborate with business and IT stakeholders to define integration standards, SLAs, and governance models.
Implement error handling, logging, monitoring, and alerting using Anypoint Monitoring and third-party tools.
Maintain integration artifacts and documentation, including RAML specifications, flow diagrams, and interface contracts.
Ensure performance tuning, scalability, and security best practices are followed across integration solutions.
Support CI/CD pipelines, version control, and DevOps processes for MuleSoft assets using platforms like Azure DevOps or GitLab.
Collaborate with cross-functional teams (Salesforce, SAP, Data, Cloud, etc.) to deliver end-to-end connected solutions.
Stay current with MuleSoft platform capabilities and industry integration trends to recommend improvements and innovations.
Troubleshoot integration issues and perform root-cause analysis in production and non-production environments.
Contribute to internal knowledge sharing, technical mentoring, and process optimization.
Strong SQL, data integration, and data-handling skills.
Exposure to AI models and Python, and to using them in data cleaning/standardization.

To qualify for the role, you must have:
3–6 years of hands-on experience with MuleSoft Anypoint Platform and Anypoint Studio.
Strong experience with API-led connectivity and reusable API design (System, Process, Experience layers).
Proficiency in DataWeave transformations, flow orchestration, and integration best practices.
Experience with API lifecycle management, including design, development, publishing, governance, and monitoring.
Solid understanding of integration patterns (synchronous, asynchronous, event-driven, batch).
Hands-on experience with security policies, OAuth, JWT, client ID enforcement, and TLS.
Experience working with cloud platforms (Azure, AWS, or GCP) in the context of integration projects.
Knowledge of performance tuning, capacity planning, and error handling in MuleSoft integrations.
Experience in DevOps practices, including CI/CD pipelines, Git branching strategies, and automated deployments.
Experience with data intelligence cloud platforms like Snowflake, Azure, and Databricks.

Ideally, you’ll also have:
MuleSoft Certified Developer or Integration Architect certification.
Exposure to monitoring and logging tools (e.g., Splunk, Elastic, Anypoint Monitoring).
Strong communication and interpersonal skills to work with technical and non-technical stakeholders.
Ability to document integration requirements, user stories, and API contracts clearly and concisely.
Experience in agile environments and comfort working across multiple concurrent projects.
Ability to mentor junior developers and contribute to reusable component libraries and coding standards.
What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around.
Opportunities to develop new skills and progress your career.
The freedom and flexibility to handle your role in a way that’s right for you.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 6 days ago

Apply

5.0 years

0 Lacs

Maharashtra, India

On-site

Namaskaram! We are seeking a highly skilled and experienced RF Hardware Engineer to join our innovative hardware team. As an RF expert, you will be responsible for the end-to-end design, development, testing, and validation of RF systems and components for our next-generation products. You will work closely with cross-functional teams including antenna, digital, and mechanical engineers to ensure high-performance, robust RF solutions.

We're also proud to share that Lenskart is now our strategic investor, a milestone that reflects the impact, potential, and purpose of the path we're walking. Join us as we co-create the future of conscious technology. Read more here: The smartphone era is peaking. The next computing revolution is here.

Key Responsibilities:
Design and develop RF circuits including LNAs, PAs, mixers, filters, baluns, and matching networks.
Perform RF simulations using tools like ADS, HFSS, CST, and EMPro.
Capture schematics and review PCB layout for RF modules with attention to high-frequency signal integrity and EMC.
Conduct RF performance tuning, calibration, and optimization for Wi-Fi, Bluetooth, LTE/5G, GNSS, and custom wireless systems.
Execute lab testing, characterization, and validation using VNAs, spectrum analyzers, signal generators, and network analyzers.
Collaborate with antenna and mechanical teams on RF integration, co-design, and mitigation of interference and desense issues.
Ensure compliance with regulatory standards (FCC, CE, ETSI, etc.) and support certification testing.
Drive root-cause analysis and resolution of RF-related issues during design, NPI, and field deployment stages.
Document design specifications, test plans, reports, and design reviews.

Qualifications:
Bachelor's or Master's degree in Electrical Engineering, Electronics, or a related field.
5+ years of hands-on experience in RF hardware design and development.
Strong knowledge of RF fundamentals, transmission line theory, impedance matching, and wireless communication systems.
Proficient in simulation tools (e.g., Keysight ADS, Ansys HFSS, CST) and PCB design tools (e.g., Altium, Cadence Allegro).
Experienced in lab instrumentation and RF measurements.
Solid understanding of wireless protocols: Bluetooth, Wi-Fi, LTE, 5G, GNSS, etc.
Strong debugging skills and experience with EMC/EMI mitigation.
Excellent communication and documentation skills.
Experience with high-volume consumer electronics is a plus.

Nice to Have:
Experience with Wi-Fi, BLE, mmWave, and phased-array systems.
Familiarity with MIMO, beamforming, and RF front-end module integration.
Prior work in wearable, IoT, or AR/VR products.
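Impedance matching, one of the fundamentals listed in the qualifications above, is commonly quantified with the reflection coefficient and VSWR. The standard formulas, shown here as a small Python calculation for a purely resistive load on a 50-ohm line:

```python
# Reflection coefficient for a load Z_L on a line with characteristic
# impedance Z0: Gamma = (Z_L - Z0) / (Z_L + Z0).
def reflection_coefficient(z_load, z0=50):
    return (z_load - z0) / (z_load + z0)

# Voltage standing wave ratio: VSWR = (1 + |Gamma|) / (1 - |Gamma|).
def vswr(z_load, z0=50):
    g = abs(reflection_coefficient(z_load, z0))
    return (1 + g) / (1 - g)

print(reflection_coefficient(50))  # perfect match: 0.0
print(vswr(100))                   # 100-ohm load on a 50-ohm line: 2.0
```

A matching network's job is to transform the load so the reflection coefficient seen by the source approaches zero; complex loads work the same way with complex arithmetic.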

Posted 6 days ago

Apply

6.0 years

0 Lacs

Itanagar, Arunachal Pradesh, India

Remote

Senior Full-Stack Developer
Location: Remote (India)
Job Type: Regular Full-time
Division: Precision AQ
Business Unit: Product Solutions
Requisition Number: 5896

Position Summary:
We are seeking a highly skilled and experienced Senior Full-Stack Developer to join our cross-functional product development team. This role is central to building scalable, high-performance AI-powered social and digital listening applications. You will work closely with solution architects, product managers, and data scientists to bring innovative analytics platforms to life using modern technologies on the Azure cloud.

Key Responsibilities:
Design, develop, and maintain full-stack applications using a modern front-end framework (preferably Vue or React) and an API framework (preferably Python FastAPI/Flask or .NET).
Collaborate with architects and product teams to translate business requirements into scalable and performant technical solutions.
Build and optimize APIs and data pipelines to support real-time and batch processing of social and digital data.
Ensure application performance, scalability, and security across the stack.
Implement CI/CD pipelines and automated testing strategies using Azure DevOps or similar tools.
Participate in code reviews, mentor junior developers, and contribute to best practices and architectural decisions.
Monitor and troubleshoot production systems, ensuring high availability and reliability.
Stay current with emerging technologies and propose innovative solutions to enhance product capabilities.

Required Qualifications:
Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent experience.
6+ years of professional experience in full-stack development.
Proficiency in JavaScript/TypeScript and modern front-end frameworks, especially Vue.js.
Experience integrating and deploying LLMs and other AI/ML models into production applications.
Solid understanding of SQL Server and data modeling for high-volume applications.
Experience with RESTful APIs, microservices architecture, and asynchronous processing.
Familiarity with DevOps practices, CI/CD pipelines, and infrastructure-as-code.
Strong problem-solving skills and ability to work in a fast-paced, agile environment.

Preferred Qualifications:
Experience with social media APIs and digital listening platforms.
Exposure to Generative AI and prompt engineering.
Familiarity with performance tuning and cost optimization in Azure.
Background in healthcare, life sciences, or analytics-driven product development.

It has come to our attention that some individuals or organizations are reaching out to job seekers, posing as potential employers and presenting enticing employment offers. We want to emphasize that these offers are not associated with our company and may be fraudulent in nature. Please note that our organization will not extend a job offer without prior communication with our recruiting team and hiring managers and a formal interview process.

Apply Now
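The asynchronous processing called out in the qualifications above is idiomatic in Python with asyncio. A simplified sketch with simulated I/O (the "sources" and sleep are stand-ins, not any specific API from the posting):

```python
import asyncio

# Fetch several sources concurrently instead of sequentially;
# asyncio.gather schedules the coroutines on one event loop and
# returns their results in the order the coroutines were passed.
async def fetch_mentions(source):
    await asyncio.sleep(0.01)  # stand-in for a real network call
    return {"source": source, "mentions": len(source)}

async def collect(sources):
    return await asyncio.gather(*(fetch_mentions(s) for s in sources))

results = asyncio.run(collect(["twitter", "reddit", "news"]))
print(results)
```

With real network calls, the total latency approaches that of the slowest request rather than the sum of all of them, which is the main payoff of the pattern for data ingestion.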

Posted 6 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Must have: Strong PostgreSQL knowledge, including writing procedures and functions, writing dynamic code, performance tuning, and complex queries; UNIX.
Good to have: IDMC or any other ETL tool knowledge, Airflow DAGs, Python, MS calls.
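"Dynamic code" in PostgreSQL usually means SQL assembled at runtime (e.g., with EXECUTE ... USING in PL/pgSQL). A minimal Python sketch of the same idea, building a parameterized query from whichever filters are present so values stay as bind parameters rather than being interpolated into the SQL string (illustrative; the table and column names are hypothetical):

```python
# Build "SELECT ... WHERE ..." dynamically from optional filters,
# keeping values as %s placeholders (the convention used by drivers
# like psycopg2), mirroring PL/pgSQL's EXECUTE ... USING pattern.
def build_query(table, filters):
    sql = f"SELECT * FROM {table}"
    clauses, params = [], []
    for column, value in filters.items():
        clauses.append(f"{column} = %s")
        params.append(value)
    if clauses:
        sql += " WHERE " + " AND ".join(clauses)
    return sql, params

sql, params = build_query("orders", {"status": "shipped", "region": "APAC"})
print(sql)     # SELECT * FROM orders WHERE status = %s AND region = %s
print(params)  # ['shipped', 'APAC']
```

Identifiers (table and column names) cannot be bound as parameters, so in real code they must come from a trusted allow-list, exactly as PL/pgSQL uses quote_ident for the same reason.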

Posted 6 days ago

Apply

2.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Title: Architect – Gen AI
Experience Range: 2–5 Years
Location: Noida

Job Summary:
We are seeking a hands-on Technical Architect specializing in Generative AI to lead the design and implementation of AI-driven architectures across complex enterprise systems. This role combines deep coding expertise in Python, .NET Core, Angular, and Azure with a strong architectural mindset to modernize platforms, drive efficiency, and deliver intelligent automation through AI/ML solutions. This is a high-impact role that involves coding, PoC development, AI architecture leadership, and direct collaboration with cross-functional teams and stakeholders.

Key Responsibilities:
Hands-On Development:
Design and develop GenAI models for real-world use cases (e.g., predictive analytics, document summarization, intelligent workflows).
Build scalable .NET Core Web APIs, Angular/React frontends, and Azure-native applications (Functions, Logic Apps, Azure SQL).
Develop AI accelerators including agentic workflows, RAG pipelines, and LLM-integrated solutions.
Create and iterate on proof-of-concepts (PoCs) aligned with business use cases.

Architecture & Delivery:
Define and evolve the Gen AI roadmap for SDLC integration.
Architect event-driven, cloud-native systems using Azure tools and messaging services like Kafka/Event Grid.
Implement robust CI/CD, security, and code-quality practices (e.g., SonarQube, guardrails, IaC with Bicep/ARM).

Team & Stakeholder Collaboration:
Work alongside fellow architects to align technical strategy with enterprise goals.
Partner with business stakeholders to translate workflows into intelligent, automated solutions.
Conduct code reviews, mentor developers, and troubleshoot high-priority issues across the stack.
Present GenAI solutions and ROI in customer architecture review boards (ARBs).

Technical Skill Mandates:
Languages & Frameworks: Python (AI/LLM workflows, model integration); .NET Core (API development, cloud-native patterns); Angular (v13+) or React (TypeScript-based front-end development).
Cloud & AI: Microsoft Azure, Cognitive Services, AI Foundry, Logic Apps, Azure SQL, Power Automate, AI Builder.
LLM Techniques: Prompt engineering, fine-tuning, RAG architecture; MLOps practices for model deployment.
CI/CD & DevOps: Azure DevOps, GitHub Actions; Infrastructure-as-Code (ARM/Bicep).
Architecture Patterns: Event-driven systems (Kafka, Azure Event Grid); modular, secure, and scalable microservices.

Qualifications & Experience:
13–16 years of total experience in enterprise software, including 5+ years in AI/ML architecture and cloud-native system design.
Demonstrated ability to code, architect, and lead AI-powered solutions.
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Preferred certifications: Azure Solutions Architect Expert, AWS/GCP Cloud Architect.

Preferred Domain Experience:
Exposure to domains like insurance, finance, or regulated industries, with workflows such as underwriting, claims processing, and compliance automation.

What You’ll Gain:
Opportunity to lead GenAI strategy in a tech-forward enterprise.
A collaborative, agile environment focused on innovation and delivery.
Career development and learning opportunities in AI and cloud technologies.
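A RAG pipeline, mentioned in both the responsibilities and the LLM techniques above, retrieves the documents most relevant to a query and prepends them to the model prompt so the answer is grounded in them. A toy retrieval step using bag-of-words cosine similarity (purely illustrative; production systems would use embeddings and a vector store):

```python
import math
from collections import Counter

# Toy RAG retrieval: rank documents by cosine similarity of word
# counts, then build a grounded prompt from the best match.
def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs):
    q = Counter(query.lower().split())
    return max(docs, key=lambda d: cosine(q, Counter(d.lower().split())))

docs = [
    "claims processing requires a policy number and incident date",
    "underwriting evaluates applicant risk before issuing a policy",
]
question = "how are claims processed"
context = retrieve(question, docs)
prompt = f"Context: {context}\n\nQuestion: {question}\nAnswer:"
print(context)
```

Swapping the Counter vectors for embedding vectors and the max() over a list for a vector-database query (e.g., Azure AI Search or OpenSearch) turns this sketch into the architecture the posting describes.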

Posted 6 days ago

Apply

5.0 years

0 Lacs

Sonipat, Haryana, India

On-site

Job Description:
We are looking for an Assistant Manager – IT for our client, based at Sonipat, Haryana. As an Assistant Manager – Information Systems, you will be responsible for overseeing the daily operation and management of IT systems within the organisation. You will support system improvements, ensure the integrity of data, manage upgrades, and collaborate with various teams to solve technical problems.

Roles & Responsibilities:
Oversee the maintenance and operation of IT systems.
Oversee the installation, configuration, maintenance, and support of IT infrastructure, including servers, networks, and applications.
Support system upgrades and enhancements.
Stay up to date with the latest SAP technologies and best practices.
Optimize and enhance existing SAP applications.
Assist in the planning and execution of SAP upgrades and migrations.
Work closely with functional teams to understand their requirements.
Ensure systems are running efficiently and effectively.
Monitor and analyse system performance, identifying and addressing any potential issues proactively.
Provide technical support and guidance to other team members.
Ensure data integrity and security across all platforms.
Collaborate with different departments to address and solve system-related issues.
Manage and mentor junior IT staff.
Implement and uphold IT policies and procedures.
Monitor system performance and generate reports for senior management.

Skills & Other Qualifications:
Bachelor’s degree in Computer Science, Information Technology, or a related field.
Experience with performance tuning and optimisation of SAP programs.
5+ years of experience in IT management or a similar role.
Strong understanding of information systems and data management.
Experience with system analysis and improving IT infrastructure.
Excellent problem-solving and critical-thinking skills.
Strong leadership abilities and team management experience.

Posted 6 days ago

Apply