
6872 Performance Tuning Jobs - Page 5


3.0 - 7.0 years

0 Lacs

indore, madhya pradesh

On-site

Are you interested in developing cutting-edge software? Are you entrepreneurial and passionate about your work? We are looking for a self-motivated .NET developer to help grow our business and brand by building software that pushes technology to new heights. A US-market-focused software development company is looking for an experienced MVC .NET developer to join our Indore office. You will have the chance to push your development skills to the limit, expand your abilities, and work on multiple projects as part of one of the best development teams in Indore.

Strong hands-on experience with the .NET MVC framework is a must for this position. You should also have strong knowledge of database design and of writing stored procedures and functions, and of performing query optimization and performance tuning in SQL Server 2012/2014/2017/2019 databases. Good knowledge of jQuery is also required. Experience with the Agile Scrum methodology is preferred, along with knowledge of TFS, Git, or Bitbucket. You should have solid experience building Web APIs and a history of collaborating with team members at all levels on performance improvements and suggestions.

This is a full-time, office-based opportunity with an immediate or 15-day joining period. If you demonstrate dedication and value as a team member, you will have the opportunity to join our team permanently with a competitive salary and benefits.
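The query-optimization skill this listing asks for can be sketched with a small, hedged example. SQLite (from Python's standard library) stands in for SQL Server here, and the table and index names are invented for illustration; the idea — check the plan, add an index, confirm a scan became a seek — carries over:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, total REAL)")
cur.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                [(i % 100, i * 1.5) for i in range(1000)])

# Without an index, filtering by customer_id scans the whole table.
plan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42").fetchall()
print(plan[0][-1])  # SCAN ...

# Adding an index lets the engine seek instead of scan.
cur.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42").fetchall()
print(plan[0][-1])  # SEARCH ... USING INDEX idx_orders_customer ...
```

On SQL Server the equivalent workflow uses the actual execution plan and `CREATE INDEX`, but the scan-versus-seek reasoning is the same.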

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

The Databricks Data Engineering Lead role requires a highly skilled individual who will architect and lead the implementation of scalable, high-performance data pipelines and platforms using the Databricks Lakehouse ecosystem. As a Data Engineering Lead, you will manage a team of data engineers, establish best practices, and collaborate with cross-functional stakeholders to unlock advanced analytics, AI/ML, and real-time decision-making capabilities.

Your key responsibilities will include:
- Leading the design and development of modern data pipelines, data lakes, and lakehouse architectures using Databricks and Apache Spark.
- Managing and mentoring a team of data engineers, providing technical leadership and fostering a culture of excellence.
- Architecting scalable ETL/ELT workflows to process structured and unstructured data from various sources (cloud, on-prem, streaming).
- Building and maintaining Delta Lake tables and optimizing performance for analytics, machine learning, and BI use cases.
- Collaborating with data scientists, analysts, and business teams to deliver high-quality, trusted, and timely data products.
- Ensuring best practices in data quality, governance, lineage, and security, including the use of Unity Catalog and access controls.
- Integrating Databricks with cloud platforms (AWS, Azure, or GCP) and data tools (Snowflake, Kafka, Tableau, Power BI, etc.).
- Implementing CI/CD pipelines for data workflows using tools such as GitHub, Azure DevOps, or Jenkins.
- Staying current with Databricks innovations and recommending platform strategy and architecture improvements.

Qualifications for this role include:
- A Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- 7+ years of experience in data engineering, including 3+ years working with Databricks and Apache Spark.
- Proven leadership experience managing and mentoring data engineering teams.
- Proficiency in PySpark and SQL, and experience with Delta Lake, Databricks Workflows, and MLflow.
- A strong understanding of data modeling, distributed computing, and performance tuning.
- Familiarity with one or more major cloud platforms (Azure, AWS, GCP) and cloud-native services.
- Experience implementing data governance and security in large-scale environments.
- Familiarity with real-time data processing using Structured Streaming or Kafka.
- Knowledge of data privacy, security frameworks, and compliance standards (e.g., PCI DSS, GDPR).
- Exposure to machine learning pipelines, notebooks, and MLOps practices.
- A Databricks Certified Data Engineer or equivalent certification is preferred.
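The Delta Lake table maintenance this role involves centers on upsert (MERGE) semantics. As a rough conceptual sketch only — pure Python, no Databricks runtime, with invented row data — this mimics what a `MERGE INTO` keyed on an `id` column does:

```python
def merge_upsert(target, updates, key="id"):
    """Conceptual MERGE: update rows whose key matches, insert the rest."""
    by_key = {row[key]: dict(row) for row in target}
    for row in updates:
        if row[key] in by_key:
            by_key[row[key]].update(row)   # WHEN MATCHED THEN UPDATE
        else:
            by_key[row[key]] = dict(row)   # WHEN NOT MATCHED THEN INSERT
    return sorted(by_key.values(), key=lambda r: r[key])

target = [{"id": 1, "total": 10.0}, {"id": 2, "total": 5.0}]
updates = [{"id": 2, "total": 7.5}, {"id": 3, "total": 1.0}]
print(merge_upsert(target, updates))
# [{'id': 1, 'total': 10.0}, {'id': 2, 'total': 7.5}, {'id': 3, 'total': 1.0}]
```

In a real lakehouse the same intent is expressed declaratively (Delta's MERGE), with the engine handling concurrency and file rewrites.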

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

karnataka

On-site

You will be joining 66degrees, a leading consulting and professional services company that specializes in developing AI-focused, data-led solutions leveraging the latest advancements in cloud technology. With our unmatched engineering capabilities and vast industry experience, we help the world's leading brands transform their business challenges into opportunities and shape the future of work. At 66degrees, we believe in embracing challenges and winning together. These values guide us not only in achieving our goals as a company but also in supporting our people. We are dedicated to creating a significant impact for our employees by fostering a culture that sparks innovation and supports professional and personal growth along the way.

As a Lead Software Developer, you will need 10-12 years of experience in backend, middleware, and front-end development and with cloud platforms such as Azure and GCP. The ideal candidate has strong hands-on experience designing scalable RESTful APIs with modern Python frameworks, developing responsive front-end applications using React and JavaScript, and deploying applications in cloud environments. Experience integrating generative AI APIs (e.g., OpenAI, Vertex AI) is an added advantage.

Your responsibilities will include:
- Designing, developing, and maintaining scalable backend and middleware solutions using Django, Flask, and FastAPI.
- Building and exposing RESTful APIs for internal and external application integrations.
- Developing responsive front-end applications using React, JavaScript (ES6+), HTML5, and CSS.
- Working with ORMs such as Django ORM and SQLAlchemy for efficient data handling.
- Deploying and managing applications on Azure and Google Cloud Platform (GCP).
- Collaborating with cross-functional teams to implement robust, secure, and scalable solutions.
- Participating in architectural discussions and contributing to technical decision-making.
- Reviewing code, enforcing best practices, and mentoring junior developers.
- Ensuring optimal performance, reliability, and security across the entire tech stack.

Qualifications required for this role:
- 10-12 years of experience in backend development with Python.
- Strong proficiency in Django, Flask, and FastAPI.
- Solid front-end development experience using React, JavaScript, HTML5, and CSS.
- Experience deploying and managing applications on Azure and/or GCP.
- A strong understanding of RESTful API development and integration.
- Experience with ORMs such as Django ORM and SQLAlchemy.
- A solid understanding of microservices architecture, modular design, and CI/CD pipelines.
- Strong debugging, performance tuning, and code optimization skills.
- Excellent communication and team leadership skills.

Additionally, experience integrating generative AI APIs such as OpenAI or Vertex AI, experience with Docker, Kubernetes, and modern CI/CD tools, and knowledge of cloud-native design principles and infrastructure-as-code (Terraform, ARM templates, etc.) are considered added advantages.
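The RESTful-API work this listing describes can be sketched with the standard library alone. This minimal WSGI app (stdlib `wsgiref`; no Flask or FastAPI assumed, and the `/health` endpoint is an invented example) returns JSON from a GET route and is exercised in-process the way a WSGI server would call it:

```python
import json
from wsgiref.util import setup_testing_defaults

def app(environ, start_response):
    """Tiny REST-style handler: GET /health -> 200 JSON, anything else -> 404."""
    if environ["PATH_INFO"] == "/health" and environ["REQUEST_METHOD"] == "GET":
        body = json.dumps({"status": "ok"}).encode()
        start_response("200 OK", [("Content-Type", "application/json")])
    else:
        body = b'{"error": "not found"}'
        start_response("404 Not Found", [("Content-Type", "application/json")])
    return [body]

# Drive the app directly with a synthetic request environment.
environ = {}
setup_testing_defaults(environ)          # sets REQUEST_METHOD="GET", etc.
environ["PATH_INFO"] = "/health"
captured = {}
def start_response(status, headers):
    captured["status"] = status
result = app(environ, start_response)
print(captured["status"], result[0])  # 200 OK b'{"status": "ok"}'
```

Frameworks such as Flask and FastAPI layer routing, validation, and serialization on top of essentially this request/response contract.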

Posted 1 week ago

Apply

3.0 - 23.0 years

0 Lacs

karnataka

On-site

We are seeking a creative and highly proficient AI Application Engineer to join our team and contribute to the development of cutting-edge, enterprise-scale, public-facing AI applications. In this role, you will have the chance to influence the next generation of autonomous agent-based systems, with a focus on performance, scalability, and innovative technologies such as LLMs, embedding techniques, and agentic frameworks. As an AI Application Engineer, your primary responsibility will be designing and implementing solutions that not only meet user expectations but also enhance system performance. This position demands a mix of deep technical expertise, ingenuity, and a dedication to delivering top-notch application engineering solutions.

Your key responsibilities will include:
- Designing, developing, and deploying enterprise-scale, public-facing AI applications.
- Implementing advanced Retrieval Augmented Generation (RAG) architectures, encompassing hybrid search and multi-vector retrieval.
- Building and enhancing systems to optimize token utilization, response caching, and performance tuning.
- Developing and managing autonomous agent frameworks using LangGraph or a similar framework.
- Leading innovation efforts in embedding techniques, contextual compression, and multi-agent systems.
- Collaborating with diverse teams such as product managers, designers, and DevOps to ensure the development of robust and user-centric solutions.
- Troubleshooting intricate technical issues and providing guidance to junior developers.
- Staying abreast of the latest advancements in LLMs, agentic frameworks, and AI-driven application development.

Minimum qualifications:
- 3 years of hands-on experience developing production-grade LLM applications.
- 3 to 5 years of overall software development experience.
- A minimum of 1 year of experience in autonomous agent development using the LangGraph framework or similar.

Must-have skills:
- Proficiency in production-grade LLM development.
- Hands-on experience in autonomous agent development.
- Expertise in advanced RAG architectures.
- Strong skills in prompt engineering and vector search optimization.
- Understanding of performance tuning and token optimization.

Nice-to-have skills:
- Experience with scalable architectures, such as Kubernetes, Docker, and cloud platforms like Azure.
- Familiarity with NoSQL databases like MongoDB and Elasticsearch.
- Knowledge of API-driven development.
- Experience with CI/CD pipelines and Agile methodologies.
- Strong analytical and problem-solving skills.

About Milestone: Milestone has been a prominent provider of digital marketing software and services for location-based businesses for over 20 years. With over 2,000 clients in the Hospitality, Retail, Financial Services, and Automotive industries, Milestone is trusted to drive their digital marketing strategies. The company has received numerous awards, including Silver and Best in Category awards for its Digital Experience Platform and SEO-first CMS, showcasing its blend of digital marketing expertise and technological capabilities. Milestone is recognized as one of Silicon Valley Business Journal's fastest-growing companies and an Inc. 5000 company.
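The retrieval step at the heart of the RAG architectures this listing names reduces to nearest-neighbour search over embeddings. This pure-Python sketch uses toy three-dimensional vectors and invented document names — a real system would use a trained embedding model and a vector store — to show the cosine-similarity ranking underneath:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy "embeddings": in a real pipeline these come from an embedding model.
docs = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.9, 0.1],
    "api reference": [0.0, 0.2, 0.9],
}

def retrieve(query_vec, k=2):
    """Rank documents by similarity to the query vector; return the top k."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
    return ranked[:k]

print(retrieve([0.8, 0.2, 0.1]))  # ['refund policy', 'shipping times']
```

The retrieved passages would then be packed into the LLM prompt, which is where the token-optimization concerns the listing mentions come in.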

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

noida, uttar pradesh

On-site

We are looking for an accomplished database Subject Matter Expert (SME) who is self-driven and combines strong design skills with performance tuning and troubleshooting expertise, specifically around Exadata and GoldenGate. You should have experience working within large enterprise-class software product teams and deploying services on popular cloud platforms such as OCI, AWS, or Azure. As the ideal candidate, you must have the skills required for automation involving scripting and DevOps, and for enhancing operational efficiency by automating key and routine operational activities. Your role will involve being the Oracle representative for the GBU product suite during customer calls and escalations, requiring SME-level technical skills as well as excellent written and verbal communication. You should also possess effective persuasion and negotiation skills.

The successful candidate will be a self-motivated software professional with a minimum of 10 years of industry experience and the ability to handle complex problems independently. Prior experience administering large-scale Oracle databases, particularly Exadata and GoldenGate deployments, is necessary. You will play a crucial role in technical problem-solving and problem avoidance, addressing complex and critical customer issues. The focus will be on enhancing customer interaction, support, and engagement experiences while deflecting or avoiding service tickets.

Preferred qualifications:
- Experience in a solutions architect role for the deployment, migration, or upgrade of large customers on OCI.

Responsibilities:
- Influence the design, capacity planning, tuning, and build-out of Exadata DB instances on Oracle Cloud Infrastructure.
- Automate key operational activities to improve efficiency.
- Administer middleware environments on any platform, such as Oracle, including installation, configuration, migrations, tuning, patching, administration, and monitoring.
- Ensure the smooth daily operation of production/middleware applications, offering administrative support for new implementations, upgrades, and migrations.
- Keep the DB system up to date, patched automatically to the latest version.
- Develop automated solutions for proactive DB health checks.
- Collaborate with customers and developers to address priority situations promptly.
- Contribute to the design of HA/DR solutions for the new Identity cloud service.
- Participate in performance, scalability, and reliability analysis, including database query tuning.
- Troubleshoot HA/DR and database performance issues on the existing cloud service.
- Assist architects and development managers in defining requirements and detailed engineering analysis from a data-requirements perspective.
- Document database/schema design at the component and product level.
- Engage in rapid delivery for Oracle Cloud projects, including planning, building, configuring, deploying, monitoring, and documenting.
- Architect and design service deployment infrastructure using automation and the latest cloud capabilities to enhance agility, reliability, and observability.
- Migrate existing on-prem services to Oracle Cloud Infrastructure securely, leveraging the latest cloud services.
- Troubleshoot and respond efficiently to major incidents, minimizing service disruptions and addressing escalated issues promptly.
- Provide expertise in products and technologies such as Real Application Clusters, GoldenGate, High Availability, Data Guard, corruption analysis, backup and recovery, RMAN, performance, memory management, parallel query, query tuning, storage, ASM, security, networking, and Enterprise Manager.

This position is at Career Level IC4.
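The proactive DB health-check automation in the responsibilities above can be sketched generically. In this Python example, SQLite stands in for an Oracle database and the probes are invented stand-ins; the pattern — run named probes, never let one failure abort the rest, report pass/fail — is what matters:

```python
import sqlite3

def check_connectivity(conn):
    """Probe: the database answers a trivial query."""
    return conn.execute("SELECT 1").fetchone() == (1,)

def check_integrity(conn):
    """Probe: the storage-level integrity check reports 'ok'."""
    return conn.execute("PRAGMA integrity_check").fetchone()[0] == "ok"

def run_health_checks(conn, checks):
    """Run each probe, collecting name -> pass/fail; isolate probe failures."""
    results = {}
    for name, probe in checks.items():
        try:
            results[name] = bool(probe(conn))
        except Exception:
            results[name] = False
    return results

conn = sqlite3.connect(":memory:")
report = run_health_checks(conn, {
    "connectivity": check_connectivity,
    "integrity": check_integrity,
})
print(report)  # {'connectivity': True, 'integrity': True}
```

An Oracle version would swap in probes against `v$` views (sessions, tablespace usage, Data Guard lag) and page or self-heal on failures.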

Posted 1 week ago

Apply

1.0 - 5.0 years

0 Lacs

ahmedabad, gujarat

On-site

As a .NET Core Developer at SpeedBot in Ahmedabad, India, you will be part of an innovative algo trading platform focused on automated strategies and real-time decision-making. With 1-5 years of experience, you will leverage your expertise in .NET Core/C#, multithreading, low-latency system design, and API development to contribute to our dynamic team. Experience with market data feeds, order execution, and risk management will be a valuable asset.

Your primary responsibilities will include developing and optimizing low-latency trading algorithms using .NET Core, integrating real-time market data feeds and order execution systems, implementing backtesting frameworks for strategy validation, ensuring high performance, scalability, and system reliability, and collaborating with quants and traders to refine trading logic.

To excel in this role, you should have familiarity with API integration, a strong grasp of core .NET and C# concepts, proficiency in multithreading, concurrency, and performance tuning, knowledge of trading systems, market data protocols, and risk management, and exposure to cloud platforms such as Azure and AWS as well as Docker/Kubernetes. While not mandatory, it would be advantageous to have a background in trading fundamentals, including market microstructure and execution strategies; experience with quantitative finance, machine learning, or data analysis; familiarity with Python for data analysis; and exposure to low-latency messaging systems like Kafka and RabbitMQ.

At SpeedBot, you will enjoy a 5-day work week, an employee-first approach, a positive work environment, skill-enhancement programs, growth prospects, monthly events, annual appraisals, and access to a game lounge. Join us to be part of our dynamic team and contribute to our cutting-edge trading platform.
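The multithreaded market-data pipeline this listing describes can be illustrated conceptually. The sketch below is in Python rather than the role's .NET stack, purely for illustration, with an invented tick feed: a producer thread publishes ticks to a queue, a consumer strategy drains them, and each message carries a timestamp so queueing latency can be measured — the metric low-latency work obsesses over:

```python
import queue
import threading
import time

ticks = queue.Queue()

def feed():
    """Producer: emit 5 fake market ticks, each stamped at enqueue time."""
    for i in range(5):
        ticks.put({"price": 100.0 + i, "t_enqueued": time.perf_counter()})
    ticks.put(None)  # sentinel: feed finished

def strategy(latencies):
    """Consumer: process ticks and record how long each sat in the queue."""
    while True:
        tick = ticks.get()
        if tick is None:
            break
        latencies.append(time.perf_counter() - tick["t_enqueued"])

latencies = []
consumer = threading.Thread(target=strategy, args=(latencies,))
consumer.start()
feed()
consumer.join()
print(len(latencies), all(l >= 0 for l in latencies))  # 5 True
```

In C#, the analogous building blocks would be `System.Threading.Channels` or a lock-free ring buffer, chosen to keep the hot path allocation-free.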

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

pune, maharashtra

On-site

As a Senior Android Developer specializing in OpenGL/Vulkan/GLSL, you will be responsible for creating cutting-edge Android applications that emphasize graphics and performance. Your expertise in GPU rendering, graphics APIs, and shader programming will be crucial in developing visually stunning, high-performance features, with a special focus on VR and on-device machine learning.

Your main responsibilities will include developing and enhancing Android apps with advanced graphics and rendering capabilities, utilizing OpenGL ES, Vulkan, or WebGL for real-time rendering, writing and optimizing shaders using GLSL, collaborating closely with designers and developers to ensure seamless visual experiences, and troubleshooting graphics-related issues to improve performance across various devices. It is essential to stay abreast of the latest Android graphics APIs and tools, integrate rendering with ML tools such as MediaPipe or TensorFlow Lite, explore graphics features for VR platforms like Oculus, and support testing to ensure smooth app operation on diverse devices.

To qualify for this role, you should hold a B.E./B.Tech/M.S./M.Tech degree in Computer Science, Engineering, or a related field, along with a minimum of 4 years of experience in Android development. A solid understanding of GPU architecture and rendering pipelines, proficiency in OpenGL ES, Vulkan, or WebGL, expertise in writing GLSL shaders, and proficiency in Kotlin, including Coroutines and Flow, are essential requirements. Familiarity with the Android SDK, performance optimization, background tasks, Firebase, Google SDKs, and push notifications would be advantageous. Knowledge of Jetpack Compose and Crashlytics is considered a bonus, while experience with VR or game engines, as well as ML integration, would be beneficial.

This position is based in Pune/Ahmedabad. If you are interested in joining our team, please submit your resume to careers@infocusp.com.

Posted 1 week ago

Apply

1.0 - 5.0 years

0 Lacs

ahmedabad, gujarat

On-site

You should have 1 to 3 years of experience in iOS native development with proficiency in the Swift programming language. You must possess a solid understanding of the full mobile development life cycle and have experience working with UIKit, Auto Layout, Core Data, Core Animation, Core Graphics, etc. Hands-on experience with RESTful APIs, JSON, networking, and asynchronous patterns (GCD, URLSession) is essential, as is experience with third-party libraries like Alamofire and Firebase. Staying updated with Apple's design principles and industry technologies is crucial, and you should be able to adapt to evolving frameworks and trends. You should also be familiar with architectural patterns such as MVC, MVVM, and VIPER, and have exposure to SwiftUI. Knowledge of version control tools like Git is necessary, along with familiarity with performance tuning, offline storage, threading, and memory management. Strong problem-solving and debugging skills are a must, paired with excellent written and verbal communication skills.

You will be expected to collaborate with cross-functional teams to define, design, and ship new features while ensuring the performance, quality, and responsiveness of applications. Your responsibilities will include working on UI/UX to create pixel-perfect user interfaces, integrating third-party libraries, APIs, and SDKs as required, and maintaining code quality, organization, and automation. Experience with Objective-C and with unit and UI testing frameworks is a plus, as is having published one or more iOS applications on the Apple App Store.

Posted 1 week ago

Apply

16.0 - 20.0 years

0 Lacs

hyderabad, telangana

On-site

The key responsibilities for this role include:
- Working closely with Module Leads and Product teams to understand product NFRs.
- Providing data points to measure system performance and scale.
- Identifying choke points in the system and proposing solutions.
- Collaborating with the technology team to implement advancements and architectural changes.
- Mentoring junior team members and enabling them to make informed decisions.
- Participating in initiatives to re-architect sub-systems to support business growth.
- Taking ownership of challenging areas and driving change in consultation with stakeholders.

Key skills required for this role:
- Extensive experience in building highly scalable systems.
- 16+ years of development experience in core Java programming and related backend Java frameworks like Spring.
- Proficiency in distributed programming concepts and familiarity with messaging platforms like RabbitMQ and Kafka.
- Strong problem-solving skills and knowledge of data structures.
- Advanced concurrency skills in Java, including an understanding of locking mechanisms.
- Proficiency in JDBC and relational database systems.
- Hands-on experience in Java performance tuning, garbage collectors, and profilers.
- Understanding of distributed design practices and a willingness to continuously learn.
- Good understanding of design patterns and their practical applications.
- An advocate of Test-Driven Development (TDD); self-driven, detail-oriented, and a strong team player.

Overall, the ideal candidate for this role should possess a strong technical background, excellent collaboration skills, and a proactive approach to problem-solving and system optimization.
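The locking mechanisms this listing emphasizes can be illustrated generically. The sketch below is in Python (illustrative only, since the role's stack is Java) and shows why a shared read-modify-write needs a mutex: without the lock, concurrent increments can interleave and lose updates.

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    """Guard each read-modify-write with the lock so no update is lost."""
    global counter
    for _ in range(n):
        with lock:           # analogous in spirit to a Java synchronized block
            counter += 1

threads = [threading.Thread(target=increment, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 40000 -- deterministic only because of the lock
```

The Java toolbox is richer (`ReentrantLock`, `AtomicLong`, `StampedLock`), but the underlying reasoning about critical sections is the same.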

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

delhi

On-site

You should have a B.E./B.Tech/MCA in Computer Science with at least 8 years of relevant experience in a core PostgreSQL DBA role. Your primary responsibility will be to manage PostgreSQL databases effectively.

Your mandatory skills should include excellent knowledge of PostgreSQL architecture across versions 10 through 15, the ability to fix issues related to backup and recovery, proficiency in performance tuning and query optimization using tools like pgAdmin, Pgpool, and repmgr, and the ability to troubleshoot database upgrades and migrations. You should also be able to install, configure, and troubleshoot replication tools and resolve issues stemming from Unix, storage, or network problems. Knowledge of VCS Cluster, Grafana, and Kibana will be an added advantage.

Moreover, you will be required to advise application and development teams on better design, design database solutions, advise on database architecture, and collaborate with other infrastructure teams to build consistent systems. You should also have advanced knowledge of cluster concepts and understand Xtrabackup failures in the context of database backup concepts. You are expected to be a mentor and team builder: reliable, hardworking, self-motivated, capable of working both independently and as part of a team to meet deadlines, and able to adapt your approach to the situation at hand.

Preferred skills include SME-level knowledge of database administration and architecture, database design knowledge, and experience in requirement analysis, component design, and resolving design issues. You should also be able to participate in project and application management teams to promote standards and best practices, provide estimates, and plan and manage database activities on deliverables. Expert knowledge of OS, network, and storage concepts is an added advantage.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

navi mumbai, maharashtra

On-site

You should have expertise in Elasticsearch, Logstash, Kibana, database management, performance tuning, monitoring, Filebeat, and BitSets. Your responsibilities will include end-to-end implementation of the ELK Stack, covering Elasticsearch, Logstash, and Kibana.

You should have a good understanding of:
- Elasticsearch clusters, shards, replicas, configuration, APIs, the local gateway, mappings, indexing, operations, and transaction logs.
- Lucene indexing, multiple indices, index aliases, and cross-index operations.
- Configuration options, mappings, APIs, available settings, the search query DSL, search components, aggregations, search types, and highlighting.
- Filebeat, BitSets, nested document relations, cluster state recovery, low-level replication and recovery, and shard allocation.
- Performance tuning with a focus on data flow and memory allocation.
- Kibana, cluster monitoring, the Hadoop environment, infrastructure and tuning, and troubleshooting of Elasticsearch and its stack.
- Operating systems, networks, and security.
- Version upgrades of Elasticsearch, Kibana, and Logstash.

You will collaborate with infrastructure, network, database, application, and business intelligence teams to ensure high data quality and availability. Your troubleshooting skills for Elasticsearch should be excellent.
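The search query DSL listed above is plain JSON. This Python sketch composes a representative query body — the index, field names, and time window are invented, and no cluster is contacted — combining a full-text match, a time-range filter, and a terms aggregation:

```python
import json

def build_search(term, field="message", agg_field="level"):
    """Compose an Elasticsearch-style query DSL body as a plain dict."""
    return {
        "query": {
            "bool": {
                "must": [{"match": {field: term}}],
                "filter": [{"range": {"@timestamp": {"gte": "now-1h"}}}],
            }
        },
        "aggs": {
            "by_level": {"terms": {"field": agg_field}},
        },
        "size": 10,
    }

body = build_search("timeout")
print(json.dumps(body, indent=2))
```

A body like this would be sent to the `_search` endpoint of an index; the `bool`/`must`/`filter` split matters for performance, since filter clauses are cacheable and not scored.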

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

chennai, tamil nadu

On-site

You should have at least 4 years of experience in Python programming, including work with Django and Flask. You should also have experience with FastAPI, Kubernetes, building APIs, and creating microservices in Python, along with the ability to think through and build API and SDK designs. Expertise in Agile development methodology is a must, as is strong knowledge of design patterns, security, and performance tuning. Experience with Pydantic and with linting (flake8 or a similar tool) in Python microservices is required, together with hands-on experience integrating PyMongo and retrieving from Mongo collections. Integration experience with GraphQL and a graph database is also expected, and a good understanding of, and the ability to set up, a Neo4j database will be beneficial.
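The Pydantic experience this listing asks for centers on validating request data at model construction. As a rough stand-in using only the standard library — the field names here are invented, and real Pydantic adds coercion, JSON schema, and much more — a dataclass can enforce the same kind of contract:

```python
from dataclasses import dataclass

@dataclass
class UserIn:
    """Request model: reject bad types and ranges at construction time."""
    name: str
    age: int

    def __post_init__(self):
        if not isinstance(self.name, str) or not self.name:
            raise ValueError("name must be a non-empty string")
        if not isinstance(self.age, int) or not (0 <= self.age < 150):
            raise ValueError("age must be an int in [0, 150)")

print(UserIn(name="asha", age=30))   # UserIn(name='asha', age=30)
try:
    UserIn(name="", age=30)
except ValueError as e:
    print(e)                          # name must be a non-empty string
```

In a FastAPI service the equivalent Pydantic model is declared once and validated automatically on every request.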

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

haryana

On-site

As an SDE-II, you will have the opportunity to own modules, mentor junior team members, and play a significant role in technical design and product evolution. Your primary responsibility will be to deliver robust, scalable features using the MERN stack. You will be actively involved in architectural decisions and system performance improvements, and in ensuring alignment with business outcomes. Taking full ownership of features from design to deployment will be a key aspect of the role.

You will build reusable, maintainable components using technologies like React.js, Node.js, MongoDB, and Express, collaborate with product managers, designers, and QA professionals to deliver business-critical features, and participate in sprint planning, estimations, and release cycles. Your responsibilities will also include code reviews, mentoring junior developers, and enforcing best practices within the team. Improving application performance, scalability, and security will be crucial, along with contributing to DevOps, CI/CD pipelines, and automated testing as needed. Evaluating technical debt and proposing refactoring strategies will be part of your regular tasks, as will staying updated with industry trends and integrating relevant innovations into projects.

The ideal candidate has experience in full-stack development, preferably with the MERN stack, and a deep understanding of JavaScript/TypeScript, async patterns, and API development. Hands-on experience with MongoDB design, indexing, and query optimization is required, as is proficiency with version control (Git), testing tools, and build pipelines. Experience with performance tuning and debugging in production environments is highly beneficial, as is a solid grasp of system design, data structures, and clean architecture.

It would be advantageous to have exposure to microservices or serverless architecture; experience with Docker, AWS, or CI/CD pipelines; and familiarity with product analytics, observability, and monitoring tools. Contributions to open-source projects or tech blogs are considered a positive attribute, and domain knowledge in AEC, construction, or document workflows is a bonus.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

noida, uttar pradesh

On-site

We are looking for Data Modelers to join our team. The preferred location for this role is Noida, with options also available in Bangalore and pan-India. As a Data Modeler, you should have 8+ years of experience in data modeling, data warehousing, and ETL processes. Strong knowledge of ER/Studio, data visualization, and SQL-based RDBMSs is essential for this role.

In this position, you will develop conceptual, logical, and physical data models for data lakes, warehouses, and analytics. You will translate business requirements into efficient data structures and collaborate with business, engineering, and analytics teams to gather requirements and ensure successful project delivery. Additionally, you will drive data governance initiatives, conduct data quality assessments, and manage metadata. Optimizing performance through indexing, partitioning, and tuning strategies will be a crucial part of your responsibilities, as will maintaining documentation, including entity-relationship diagrams (ERDs), data dictionaries, and flow diagrams.

Ideal candidates will also have familiarity with data governance practices, and experience with cloud platforms such as AWS, Azure, or GCP is considered a plus. Knowledge of modeling methodologies such as ER, dimensional, and relational models is required, and proficiency in tools like ERwin, PowerDesigner, Lucidchart, and Visio will be beneficial.
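The dimensional modeling this listing names can be made concrete with a minimal star schema. The SQLite sketch below uses invented table and column names: one fact table of measures keyed into a dimension of descriptive attributes, plus the aggregate-by-dimension query such a model exists to serve:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive attributes under a surrogate key.
cur.execute("""CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    name TEXT, category TEXT)""")
# Fact table: numeric measures plus foreign keys into the dimensions.
cur.execute("""CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    product_key INTEGER REFERENCES dim_product(product_key),
    qty INTEGER, amount REAL)""")

cur.execute("INSERT INTO dim_product VALUES (1, 'widget', 'hardware')")
cur.executemany("INSERT INTO fact_sales (product_key, qty, amount) VALUES (?, ?, ?)",
                [(1, 2, 20.0), (1, 1, 10.0)])

# A typical analytic query: aggregate facts grouped by a dimension attribute.
row = cur.execute("""SELECT p.category, SUM(f.amount)
                     FROM fact_sales f JOIN dim_product p USING (product_key)
                     GROUP BY p.category""").fetchone()
print(row)  # ('hardware', 30.0)
```

In a warehouse, the partitioning and indexing choices the listing mentions would be applied to exactly these fact and dimension tables.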

Posted 1 week ago

Apply

5.0 - 10.0 years

0 Lacs

hyderabad, telangana

On-site

You should have 7-10 years of overall work experience for this Database Tech Lead role, which is office-based in Hyderabad. Your responsibilities and required skills include:
- Over 5 years of professional experience as an MSSQL developer or database developer.
- Expertise in writing intricate SQL queries, stored procedures, and functions.
- A strong command of query optimization, performance tuning, and database indexing.
- Knowledge of or experience with Duck Creek Data Insights.
- Familiarity with ETL processes (SSIS) and data integration techniques.
- Experience with Power BI or SSRS.
- A good grounding in database design principles and normalization techniques.
- Hands-on experience with one or more relational database management systems (e.g., SQL Server, Oracle, PostgreSQL, MySQL) will be considered a plus.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork skills.
- Leadership experience and the ability to mentor junior developers will be an added advantage.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

As a Customer-Focused Senior Site Reliability Engineer (SRE) at Integral, you will be part of the Site Reliability Engineering organization that aims to deliver unparalleled service levels to customers. The SRE team is focused on automation, real-time monitoring, and self-healing systems to provide a flawless 24/7 experience. Within the SRE team, specialized roles such as Platform SREs, Application Automation Engineers, and Customer-Focused SREs work together to create scalable, efficient, and future-ready solutions. The team is dedicated to innovation, leveraging AI to develop autonomous agents and automate complex processes, shaping the future of operational engineering. Joining Integral means being part of a team that not only solves critical challenges in real-time but also pioneers AI-driven solutions that transform work processes. You will have the opportunity to work on cutting-edge projects, collaborate with top engineers, and contribute to shaping a new era of automation. In the role of a Customer-Focused Senior SRE, your main focus will be on customer engagement, ensuring smooth onboarding, seamless integrations, and optimal performance. You will utilize your operational expertise along with innovative AI-driven approaches to design and implement autonomous systems that streamline processes and enhance reliability. Collaboration with Account Managers, Sales Solution Engineers, Infrastructure SREs, and Application Automation Engineers will be crucial to delivering a seamless, scalable customer experience while driving technical agility and scalability. 
Your responsibilities will include creating detailed solution documents for customers before onboarding, automating onboarding and deployment processes, monitoring and analyzing performance metrics to ensure customer satisfaction, investigating and resolving technical escalations, designing and building autonomous agents for process management, leveraging AI tools for automation, partnering with various teams to align solutions with customer needs, implementing real-time monitoring and self-healing systems, and ensuring standardization and scalability across customer processes. To excel in this role, you should have proven experience in Site Reliability Engineering, Operational Engineering, or a related field; expertise in automated deployment tools, monitoring systems, and performance tuning; knowledge of scripting and AI tools; strong communication and collaboration skills; a customer-centric approach; a growth mindset; and a passion for innovative AI-driven projects. As a Customer-Focused Senior SRE at Integral, your mission is to redefine the customer experience by combining operational expertise with cutting-edge AI technologies. You will play a crucial role in ensuring smooth implementations and exceptional performance, and in building the future of autonomous engineering at Integral.
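As a loose illustration of the "real-time monitoring and self-healing systems" idea above (not Integral's actual tooling — the function names, thresholds, and simulated service here are invented), a monitor can probe a health check and trigger automated remediation with exponential backoff:

```python
import time

def monitor_and_heal(check, heal, max_attempts=3, base_delay=0.01):
    """Probe a health check; on failure, run a remediation action and retry
    with exponential backoff. Returns True once the check passes."""
    for attempt in range(max_attempts):
        if check():
            return True
        heal()                                   # self-healing action, e.g. restart a worker
        time.sleep(base_delay * (2 ** attempt))  # back off before re-probing
    return check()

# Simulated flaky service: reports healthy only after two "restarts".
state = {"restarts": 0}
healthy = lambda: state["restarts"] >= 2
restart = lambda: state.__setitem__("restarts", state["restarts"] + 1)

print(monitor_and_heal(healthy, restart))  # True after two remediation attempts
```

Production systems layer alerting and escalation on top of this loop, but the probe-remediate-backoff cycle is the core shape of self-healing automation.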

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

The ideal candidate has at least 3 years of hands-on experience in Oracle PL/SQL development, a strong understanding of relational database concepts, and a proven track record in performance tuning. Your primary responsibilities will involve designing, developing, and optimizing complex PL/SQL queries, stored procedures, functions, and packages, as well as performance tuning and troubleshooting of Oracle databases to ensure optimal functionality. You will collaborate closely with data architects and analysts to understand data requirements and translate them into effective technical solutions. Strong experience in reconciliation will be highly beneficial in this position, allowing you to contribute to the overall success of the team and organization.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

You are a highly skilled DBA Developer responsible for various IT development, analysis, information management (DBA), and QA activities that require expertise in specialized technologies. Your role is crucial in ensuring the performance, availability, and security of databases, as well as providing support in application development and system troubleshooting. Your responsibilities include: - Modifying and maintaining existing databases and DBMS. - Designing and implementing logical and physical database models. - Creating and maintaining database documentation. - Analyzing business requirements for database solutions. - Developing backup, recovery, and security procedures. - Writing code for database access using SQL, PL/SQL, T-SQL, etc. - Collaborating with technical and business teams, and estimating project implementation time and costs. - Monitoring database performance, troubleshooting issues, and supporting continuous improvement initiatives. You must possess strong communication skills, problem-solving abilities, and the capability to work independently while managing multiple priorities. Additionally, you should have in-depth knowledge of the SDLC, SQL development, performance tuning, database platforms such as Oracle and SQL Server, database monitoring tools, ETL processes, data warehousing, and troubleshooting of complex database and application issues. Familiarity with version control systems, CI/CD pipelines, NoSQL databases, scripting languages, cloud database platforms, and database security practices is advantageous. In summary, as a DBA Developer, you will play a vital role in database management, application development, and system optimization, leveraging your expertise in database technologies and problem-solving skills to meet project requirements and contribute to the continuous enhancement of database solutions.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

maharashtra

On-site

As a Manager / Senior Manager - Oracle DBA based in Mumbai, you will be responsible for managing, maintaining, and optimizing Oracle databases to ensure high availability, security, and performance. Your role will involve working closely with cross-functional teams to support mission-critical systems and implement best practices for database management. Your expertise should include a deep understanding of Oracle 19c database architecture, configuration, and administration. You should have hands-on experience with Oracle GRID Infrastructure, ExaData, and RAC, as well as proficiency in Oracle TDE, OKV, and Database Vault implementation. Knowledge of Oracle Exadata and FarSync configurations, along with strong backup and recovery skills using RMAN and Data Pump, will be essential for this role. In terms of responsibilities, your tasks will include installing, configuring, and maintaining Oracle 19c databases on standalone and clustered environments (Oracle GRID). You will be expected to monitor and optimize database performance, troubleshoot issues, and conduct regular health checks to ensure smooth operations. Furthermore, your role will involve setting up and managing Oracle Exadata, FarSync, and other high availability solutions, as well as developing and maintaining backup and recovery strategies using RMAN and other tools. You will also be responsible for implementing and managing Oracle Transparent Data Encryption (TDE), Oracle Key Vault (OKV), and Oracle Database Vault for enhanced security and compliance. Additionally, you will play a key role in planning and executing database patching, upgrades, and migrations to maintain system reliability and supportability. Performance tuning will be a crucial aspect of your responsibilities, involving advanced optimization for high-performance workloads by analyzing AWR, ADDM, and other diagnostic tools. 
Documenting detailed database configurations, processes, and best practices, as well as automating routine tasks to enhance efficiency and minimize downtime, will also be part of your duties. Strong analytical and problem-solving skills, excellent communication and documentation abilities, and the ability to work collaboratively in a team environment are essential soft skills for this role. If you meet these qualifications and are interested in this position, please email your resume to careers@cdslindia.com, mentioning the position applied for in the subject line.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

You will be joining as a Data Engineer in Hyderabad (work from office). You should have a minimum of 5 years of experience in data engineering, ETL, and Snowflake development, with expertise in SQL scripting, performance tuning, and data warehousing concepts. Hands-on experience with Matillion ETL for creating and managing ETL jobs is a key requirement, as is a strong understanding of cloud platforms (AWS, Azure, or GCP) and cloud-based data architectures. Proficiency in SQL and in Python or other scripting languages for automation and data transformation is essential, and experience with API integrations and data ingestion frameworks will be advantageous. Knowledge of data governance, security policies, and access control within Snowflake environments is also expected. Snowflake certifications (SnowPro Core/Advanced) are preferred. Excellent communication skills are essential, as you will engage with both business and technical stakeholders, and you should be a self-motivated professional capable of working independently and delivering projects on time.
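The ETL work described above runs on Matillion and Snowflake, which can't be reproduced here, but the extract-transform-load shape is the same everywhere. A hedged, stdlib-only sketch with an invented source schema and an in-memory dict standing in for the warehouse table:

```python
from collections import defaultdict

# Extract: rows as they might arrive from a source system (invented schema).
raw_rows = [
    {"region": " east ", "amount": "125.50"},
    {"region": "WEST",   "amount": "200.00"},
    {"region": "east",   "amount": "74.50"},
]

def transform(row):
    """Cleanse one row: trim and casefold text fields, cast numeric strings."""
    return {"region": row["region"].strip().lower(),
            "amount": float(row["amount"])}

def load(rows, warehouse):
    """Aggregate transformed rows into the target table (here, a dict)."""
    for row in rows:
        warehouse[row["region"]] += row["amount"]
    return warehouse

warehouse = load((transform(r) for r in raw_rows), defaultdict(float))
print(dict(warehouse))  # {'east': 200.0, 'west': 200.0}
```

In a Matillion/Snowflake pipeline the same three stages map onto a source connector, transformation components, and a load into a Snowflake table; the cleansing rules here are purely illustrative.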

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

hyderabad, telangana

On-site

As a Senior EBS Developer/Technical Consultant at NetAnalytiks, you will be responsible for designing, developing, and integrating software applications, managing databases, and handling various programming tasks. Your role will involve close collaboration with clients and internal teams to deliver timely technical solutions. You must have 7+ years of experience in Oracle EBS, strong skills in SQL and PL/SQL, and a good command of analytical functions. The ability to comprehend report code and its logic, along with experience in performance tuning, will be essential. An excellent understanding of Oracle EBS tables and relationships, particularly in supply chain and financials, is required. Good communication skills and the capability to work with minimal supervision are key attributes for this role. This is a full-time, on-site position based in Bengaluru, with the option to work from either Hyderabad or Pune. If you meet the qualifications and are interested in this opportunity, we encourage you to apply for the position.

Posted 1 week ago

Apply

5.0 - 12.0 years

0 Lacs

coimbatore, tamil nadu

On-site

As a Data Software Engineer with 5-12 years of experience in Big Data and data-related technologies, you will contribute to projects in Chennai and Coimbatore in a hybrid work mode. You should possess an expert-level understanding of distributed computing principles and strong knowledge of Apache Spark, with hands-on programming skills in Python. Your role will involve working with technologies such as Hadoop v2, MapReduce, HDFS, Sqoop, Apache Storm, and Spark Streaming to build stream-processing systems. You should have a good grasp of Big Data querying tools like Hive and Impala, as well as experience integrating data from various sources, including RDBMS, ERP systems, and files. Experience with NoSQL databases such as HBase, Cassandra, and MongoDB, along with knowledge of ETL techniques and frameworks, will be essential for this role. You will be tasked with performance tuning of Spark jobs, working with Azure Databricks, and leading a team efficiently. Expertise in designing and implementing Big Data solutions, along with a strong understanding of SQL queries, joins, stored procedures, and relational schemas, will be crucial. As a practitioner of Agile methodology, you will play a key role in the successful delivery of data-driven projects.
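Among the technologies listed, MapReduce is the one whose core idea fits in a few lines. A hedged, pure-Python sketch of the map, shuffle, and reduce phases — no Hadoop cluster involved, and word count is just the conventional teaching example:

```python
from collections import defaultdict

def map_phase(records):
    """Map: emit (key, 1) pairs for every word in every input record."""
    for line in records:
        for word in line.split():
            yield word, 1

def shuffle(pairs):
    """Shuffle: group all emitted values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: combine each key's list of values into a final result."""
    return {key: sum(values) for key, values in groups.items()}

lines = ["spark streams data", "spark tunes jobs", "data moves fast"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts["spark"], counts["data"])  # 2 2
```

In Hadoop or Spark the same three phases run distributed across partitions; performance tuning there largely comes down to controlling how much data the shuffle step moves between nodes.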

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

MongoDB's mission is to empower innovators to create, transform, and disrupt industries by unleashing the power of software and data. We enable organizations of all sizes to easily build, scale, and run modern applications by helping them modernize legacy workloads, embrace innovation, and unleash AI. Our industry-leading developer data platform, MongoDB Atlas, is the only globally distributed, multi-cloud database and is available in more than 115 regions across AWS, Google Cloud, and Microsoft Azure. Atlas allows customers to build and run applications anywhere - on premises, or across cloud providers. With offices worldwide and over 175,000 new developers signing up to use MongoDB every month, it's no wonder that leading organizations, like Samsung and Toyota, trust MongoDB to build next-generation, AI-powered applications. MongoDB Technical Services Engineers use their exceptional problem-solving and customer service skills, along with their deep technical experience, to advise customers and to solve their complex MongoDB problems. Technical Service Engineers are experts in the entire MongoDB ecosystem - database server, drivers, cloud, and infrastructure. This also includes services such as Atlas (database as a service), or Cloud Manager (which helps customers with automation, backup, and monitoring of their MongoDB systems). Our engineers combine their MongoDB expertise with passion, initiative, teamwork, and a great sense of humor to help our customers to be successful with MongoDB. We are looking to speak to candidates who are based in Bangalore for our hybrid working model. Cool things you'll do: You'll be working alongside our largest customers, solving their complex challenges - resolving questions on architecture, performance, recovery, security, and everything in between. You'll be an expert resource on best practices in running MongoDB at scale, whatever that scale may be. 
You'll be an advocate for customers' needs - interfacing with our product management and development teams on their behalf. And you'll contribute to internal projects, including software development of support tools for performance, benchmarking, and diagnostics. This role specifically follows a weekend support model (Sunday to Thursday, with Friday and Saturday as the week-off) and requires adherence to EMEA Hours (2pm to 10pm IST). If you're passionate about being a Technical Services Engineer - Core and are open to flexible, weekend-oriented scheduling, we encourage you to apply! As an ideal candidate, you will have: We consider all candidates with an eye for those who are self-taught, curious, and multi-faceted. Our ideal TSE candidate should also have: - 5+ years of relevant experience - Strong technical experience in one (or more) of the following areas: Systems administration, Scalable and Highly available distributed systems, Network Administration, Database architecture and administration, Application Architecture, Data architecture and design, Performance tuning and benchmarking - A B.Tech / B.S.
or equivalent work experience Nice to have: - Basic understanding of AI, including ML, LLMs, and RAG principles - Experience in one or more of: Java, Python, Ruby, C, C++, C#, Javascript, node.js, Go, PHP, or Perl It's crucial for every candidate that they can check off all of these boxes: - Excellent communication skills, both written and verbal - Genuine desire to help people - Uncontrollable urge to investigate and solve problems, with advanced diagnostic and troubleshooting skills - Ability to think on your feet, remain calm under pressure, and solve problems in real-time - Desire and ability to rapidly learn a wide variety of new technical skills - Strong teamwork: willingness and ability to get help from team members when required, and the good judgment to know when to seek help Success Measures: - In 3 months, you'll have gained a deep understanding of MongoDB and its ecosystem. You will complete New Hire Training. - In 6 months, you will be comfortable working frontline with our customers. You will also complete the MongoDB Certified DBA Associate exam. - In 12 months, you will work on gaining expertise to be a part of a technical experts group within the MongoDB ecosystem and will be helping your peer engineers in advanced diagnostics. Also, you will be encouraged to handle technical escalations independently. To drive the personal growth and business impact of our employees, we're committed to developing a supportive and enriching culture for everyone. From employee affinity groups to fertility assistance and a generous parental leave policy, we value our employees' wellbeing and want to support them along every step of their professional and personal journeys. Learn more about what it's like to work at MongoDB, and help us make an impact on the world! MongoDB is committed to providing any necessary accommodations for individuals with disabilities within our application and interview process.
To request an accommodation due to a disability, please inform your recruiter. MongoDB is an equal opportunities employer.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

Job Description: As a leader at a Fortune Global 500 organization, you will be responsible for advocating various data science teams on best practices surrounding the development and implementation of advanced analytic systems and predictive and prescriptive models. Working closely with a team of data scientists, data analysts, data engineers, machine learning engineers, business and data domain owners, application developers, and architects, you will play a crucial role in creating and delivering insights from large and disparate data to enable confident business decisions. Your role will involve evaluating and adopting emerging technologies that support statistical modeling, machine learning, distributed computing, and run-time performance tuning to optimize processes and introduce new products and services to the market. Additionally, you will support senior leadership by planning and executing broad advanced analytics initiatives aimed at delivering value to both internal and external stakeholders. Furthermore, you may also be involved in managing people within the department. Responsibilities: - Lead and oversee the data analysts, data scientist team, machine learning engineers, and big data specialists to implement models and systems that provide optimal results and are scalable to meet future business needs. - Serve as a subject matter expert on UPS business processes, data, and advanced analytics capabilities to define problems, data, and model requirements using proven predictive and prescriptive techniques. - Maintain a broad understanding of implementation, integration, and inter-connectivity issues with emerging technologies to develop strategies that support the creation, development, and delivery of analytic solutions aligned with business needs. - Prototype algorithms to ensure analytic results address problem statements and business requirements effectively. 
- Interpret and analyze large-scale datasets to uncover insights supporting the building of analytic systems and predictive models while experimenting with new and emerging techniques. - Identify and evaluate cutting-edge open-source, data science/machine learning libraries, data platforms, and vendor solutions to prioritize and plan data projects across the enterprise. - Provide thought leadership, technical guidance, and counsel for data science project teams to evaluate strategic alternatives, recommend courses of action, and design and implement solutions. - Advocate for best practices in the adoption of Cloud-AI technologies, open-source software, machine learning libraries/packages, and data science platforms to derive actionable insights empowering business decisions. - Communicate effectively with business customers and the senior leadership team with varying levels of technical knowledge, educate them about systems, and share insights and recommendations to inform business strategies. - Manage analytics projects/teams and act as a point of contact to ensure alignment of team actions and communication with stakeholders to keep projects on track with goals. Qualifications: - Ability to engage key business and executive-level stakeholders to translate business problems into high-level analytics solutions. - Extensive experience working with large-scale, complex datasets to develop machine learning, predictive, forecasting, and optimization models. - Proven track record of handling ambiguity, prioritizing needs, and delivering results in a dynamic environment. - Expertise in data management pipelines involving data extraction, analysis, and transformation using data querying languages (e.g. SQL, NoSQL, BQ) or scripting languages (e.g. Python, R) and/or statistical/mathematical software (e.g. R, Matlab, SAS). 
- Hands-on experience in launching moderate to large-scale advanced analytics projects in production at scale, utilizing Cloud-AI technologies, machine learning frameworks, and enterprise data science platforms. - Ability to explain technical concepts to non-experts, strong analytical skills, and attention to detail. - Direct experience in developing analytical solutions empowering business decisions and product creation using various techniques (e.g. supervised and unsupervised learning, deep learning, NLP). - Excellent verbal and written communication skills with the ability to convey data through a story framework, present data-driven results to technical and non-technical audiences, and advocate technical solutions to diverse groups. - Master's degree in a quantitative field such as mathematics, computer science, physics, economics, engineering, or statistics, or equivalent job experience. Employee Type: Permanent UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

noida, uttar pradesh

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. The opportunity We're looking for candidates with Syniti and other programming skills to join the EY GDS SAP BI & Data. This is a fantastic opportunity to be part of a leading firm while being instrumental in the growth. Your key responsibilities include: - Providing expert level business analysis on SAP modules FI, CO, MM, SD, PM, PP, PS - Implementing and developing customer deliverables that meet or exceed customer requirements - Developing and demonstrating a good understanding of business processes for the assigned functional area/data objects - Demonstrating a strong knowledge of underlying technical data structures and definitions for the assigned functional process area/data objects - Contributing to an integrated data solution through data analysis, reporting, and collaboration with on-site colleagues and clients - Expertise in SAP BW7.5, SAP BW on HANA/BW4 HANA - Working closely with other consultants on customer site as part of small to large size project teams - Conducting requirements analysis, data analysis, and creating reports - Maintaining responsibility for completion and accuracy of the deliverables - Actively expanding consulting skills and professional development through training courses, mentoring, and daily interaction with clients Skills and attributes for success: - Hands-on experience of SAP BW7.5, HANA implementation, and support - Building an understanding of Standard and custom SAP BW Extractors functionality with ABAP Debugging skills - Prior experience in Supporting ETL and Incident management/Bug-Fix - Hands-on Experience in Understanding and Applying transformations using ABAP and 
AMDP, advanced DSOs, Composite Providers using LSA ++ and performance optimization concepts - Prior experience with traditional non-HANA BW data modeling, Multi-Cubes, ODS Objects, Info Cubes, Transfer Rules, Start Routines, End Routines, Info Set Queries, Info Objects, and User Exits - Hands-on experience with SAP HANA data modeling views (Attribute, Analytics, and Calculation views) - Proficiency in development and understanding of SAP Analysis for Microsoft Office to perform custom calculations, filtering, and sorts to support complex business planning and reporting scenarios - Hands-on experience in the collection of Transport Requests through the landscape - Experience in performance tuning and troubleshooting/monthly release activities as necessary - Knowledge of SAP ECC business processes and functional aspects in Sales, Billing, Finance, Controlling, and Project Systems To qualify for the role, you must have: - Minimum 7+ years of SAP Analytics/Business Intelligence/Business Warehouse (BI/BW/HANA) related experience with a professional services advisory firm or publicly traded company, and experience leading and delivering full lifecycle implementations - At least one end-to-end implementation with SAP HANA 1.0 and 2.0, including at least one full lifecycle project implementation with SAP HANA SQL and/or SAP S/4 HANA Ideally, you should also have: - Bachelor's degree from an accredited college/university - Hands-on experience with SAP HANA modeling: table creation (row store, column store), ABAP procedures, data modeling, modeling views (Calculation, Attribute views), decision tables, and analytical privileges will be an added advantage - Knowledge of roles and authorizations What we look for: - A team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment with consulting skills - An opportunity to be a part of a market-leading, multi-disciplinary team of 1400+ professionals, in the
only integrated global transaction business worldwide - Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies