1.0 - 5.0 years
2 - 6 Lacs
Ahmedabad
Work from Office
Responsible for testing of raw material, in-process (IP), finished product (FP), and stability samples as per the method of analysis. Should have knowledge of method verification/validation and method transfer analysis. Exposure to the operation, calibration, qualification, and maintenance of laboratory instruments/equipment. Exposure to the preparation of calibration and preventive maintenance (PM) schedules for laboratory instruments/equipment. Should have knowledge of the preparation, handling, and management of working/reference standards. Testing of in-process, stability, and finished product samples. Should be familiar with Caliber LIMS system operation.
Posted 3 weeks ago
1.0 - 5.0 years
2 - 6 Lacs
Ahmedabad
Work from Office
Responsible for testing of raw material, in-process (IP), finished product (FP), and stability samples as per the method of analysis. Should have knowledge of method verification/validation and method transfer analysis. Exposure to the operation, calibration, qualification, and maintenance of laboratory instruments/equipment. Should have exposure to sophisticated instruments, i.e. HPLC, dissolution apparatus, FTIR, UV, GC, autotitrator, Karl Fischer, PSD, etc. Exposure to the preparation of calibration and preventive maintenance (PM) schedules for laboratory instruments/equipment. Should have knowledge of the preparation, handling, and management of working/reference standards. Testing of in-process, stability, finished product, and raw material samples. Should be familiar with Caliber LIMS system operation.
Posted 3 weeks ago
4.0 - 8.0 years
9 - 10 Lacs
Pune
Work from Office
Our industry-leading segmentation and AI-driven matching technologies help consumers find better solutions and brands faster. They allow brands to target and reach in-market customer prospects with pinpoint segment-by-segment accuracy, and to pay only for performance results. Our campaign-results-driven matching decision engines and optimization algorithms are built from over 20 years and billions of dollars of online media experience. We believe in: the direct measurability of digital media; performance marketing (we pioneered it); the advantages of technology. We bring all this together to deliver truly great results for consumers and brands in the world's biggest channel.

Team Name: Finance - Insurance
Location: Pune, India (Hybrid)
Experience: 4+ years
Employment Type: Full-Time

Key Responsibilities
- Design, develop, and maintain scalable, high-performance Java-based applications.
- Build and integrate RESTful APIs with third-party systems.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Write clean, maintainable, and efficient code following best practices.
- Work closely with QA engineers, architects, and tech leads on several projects at a time.
- Participate in code reviews and provide constructive feedback to peers.
- Develop unit and integration tests to ensure software quality.

Technical Requirements
- 4+ years of hands-on software development experience in Java (version 8 or above) with knowledge of object-oriented design.
- 3+ years of experience with Spring Boot, web services, microservices, REST APIs, and JSON and XML parsers.
- Good understanding of how web applications work, including knowledge of the HTTP protocol, request/response cycles, and session management.
- Knowledge of SQL and experience working with an RDBMS such as Oracle or MySQL.
- Experience working in a UNIX and/or Linux environment.
- Good communication skills and the ability to work directly with project managers and engineering teams.

Please see QuinStreet's Employee Privacy Notice here.
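The request/response and session-management fundamentals this role asks about can be sketched in a few lines (Python stdlib used for brevity rather than the role's Java/Spring Boot stack; all names here are illustrative):

```python
import secrets

# Minimal in-memory session store illustrating cookie-based session management:
# the server issues an opaque session ID, the client echoes it back with each
# request, and state lives server-side keyed by that ID.
SESSIONS = {}

def login(username):
    """Create a session and return the ID the server would set in a cookie."""
    session_id = secrets.token_hex(16)
    SESSIONS[session_id] = {"user": username}
    return session_id

def handle_request(session_id):
    """Resolve the session ID the client sent back (e.g. in a Cookie header)."""
    session = SESSIONS.get(session_id)
    if session is None:
        return "401 Unauthorized"
    return f"200 OK: hello {session['user']}"

sid = login("alice")
print(handle_request(sid))      # authenticated request
print(handle_request("bogus"))  # unknown session ID is rejected
```

In a real framework the session ID travels in a `Set-Cookie`/`Cookie` header pair; the dictionary here stands in for the server-side session store.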
Posted 3 weeks ago
5.0 - 7.0 years
14 - 19 Lacs
Hyderabad
Work from Office
Do you want to work on innovative projects, collaborate with a dynamic and supportive team, and receive investment in your professional development? At DTCC, we are at the forefront of innovation in the financial markets. We are committed to helping our employees grow and succeed. We believe that you have the skills and drive to make a real impact. We foster a thriving internal community and are committed to creating a workplace that looks like the world that we serve.

Pay and Benefits:
- Competitive compensation, including base pay and annual incentive
- Comprehensive health and life insurance and well-being benefits, based on location
- Pension/retirement benefits
- Paid time off and personal/family care, and other leaves of absence when needed to support your physical, financial, and emotional well-being

DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays, and a third day unique to each team or employee).

The Impact you will have in this role: We are seeking a skilled database expert to join our dynamic team. The ideal candidate will have expertise in developing and resolving technical issues around the deployment and use of SQL Server. The candidate should be eager to learn, able to prioritize tasks, manage multiple projects, and effectively communicate results with a globally located team.

What You'll Do:
- Collaborate with architects, business analysts, developers, and stakeholders to analyze requirements for our SQL Server database
- Ensure data integrity, stability, and security across databases and related web applications
- Establish and enforce data modeling standards and best practices for database-centric processes such as ETL and batch jobs
- Assist in troubleshooting and resolving technical issues related to databases and web applications, ensuring minimal downtime and optimal performance
- Collaborate with cross-functional teams to gather requirements, design solutions, and implement troubleshooting strategies
- Document and track issues, resolutions, and best practices to improve the overall delivery process
- Provide technical support during production releases and maintenance windows, working closely with the Troubleshooting and Operations team
- Stay up to date with the latest industry trends and best practices in delivery and technical support

Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field.

Talents Needed for Success:
- Expertise in a modern RDBMS such as SQL Server (preferred) or Oracle, including database management and query writing
- Minimum of 5-7 years of experience in delivery, troubleshooting, and support
- The use of databases and integration patterns in web applications; familiarity with ETL applications such as Talend
- Experience with job scheduling and automation
- Strong analytical and problem-solving skills, with the ability to work independently and as part of a team
- Excellent communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at all levels

Additional Skills Needed for Success:
- Familiarity with cloud platforms and services (e.g., AWS, Azure, Google Cloud) is beneficial
- Experience with version control systems (e.g., Git) and collaboration tools (e.g., Jira, Confluence) is desired
- Knowledge of scripting languages such as Python and shell/batch programming is a plus
- Understanding of Agile processes and methodologies, with experience working in an Agile framework using Scrum

Please contact us to request accommodation.
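The query-writing and data-integrity side of this role can be illustrated with a small sketch (Python's stdlib sqlite3 stands in for SQL Server/Oracle here, and the schema is invented):

```python
import sqlite3

# In-memory database standing in for the production RDBMS.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # enforce referential integrity
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, name TEXT NOT NULL)")
conn.execute("""CREATE TABLE trades (
    id INTEGER PRIMARY KEY,
    account_id INTEGER NOT NULL REFERENCES accounts(id),
    amount REAL NOT NULL)""")

conn.execute("INSERT INTO accounts VALUES (1, 'ACME')")
conn.executemany("INSERT INTO trades (account_id, amount) VALUES (?, ?)",
                 [(1, 100.0), (1, 250.5)])

# A typical aggregate query of the kind the role calls for.
total, = conn.execute(
    "SELECT SUM(amount) FROM trades WHERE account_id = ?", (1,)).fetchone()
print(total)  # 350.5

# Integrity is enforced: a trade referencing a missing account is rejected.
try:
    conn.execute("INSERT INTO trades (account_id, amount) VALUES (99, 10.0)")
except sqlite3.IntegrityError as exc:
    print("rejected:", exc)
```

The same pattern, constraints declared in the schema plus parameterized queries, is what keeps "data integrity across databases and web applications" enforceable rather than aspirational.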
Posted 3 weeks ago
5.0 - 8.0 years
7 - 11 Lacs
Pune
Work from Office
Role Purpose
- Installation and configuration of Oracle GoldenGate; troubleshoot GoldenGate sync issues
- Experience in Sybase-to-Oracle database migration using GoldenGate
- Experience with Oracle 19c database versions and features
- Experience in database migration activities
- Install, set up, and maintain Oracle Real Application Clusters (RAC); troubleshoot RAC and ASM issues
- Database build, upgrade, and patching activities
- Experience in performance tuning on Oracle 19c
- Process/performance improvement and database stability improvement
- Set up Oracle Data Guard and troubleshoot data sync issues
- Strong backup and restore strategies for disaster scenarios; RMAN optimization
- Implementation of database standards and best practices
- Basic knowledge of Sybase RDBMS
- Take ownership of databases and provide SME-level DBA support for designated database servers

Do
- Provide adequate support in architecture planning, migration, and installation for new projects in own tower (platform/database/middleware/backup)
- Lead the structural/architectural design of a platform/middleware/database/backup solution according to the various system requirements to ensure a highly scalable and extensible solution
- Conduct technology capacity planning by reviewing current and future requirements
- Utilize and leverage the new features of all underlying technologies to ensure smooth functioning of the installed databases and applications/platforms, as applicable
- Strategize and implement disaster recovery plans, and create and implement backup and recovery plans
- Manage the day-to-day operations of the tower by troubleshooting issues, conducting root cause analysis (RCA), and developing fixes to avoid recurrence
- Plan for and manage upgrade, migration, maintenance, backup, installation, and configuration functions for own tower
- Review the technical performance of own tower and deploy ways to improve efficiency, fine-tune performance, and reduce performance challenges
- Develop a shift roster for the team to ensure no disruption in the tower
- Create and update SOPs, data responsibility matrices, operations manuals, daily test plans, data architecture guidance, etc.
- Provide weekly status reports to the client leadership team and internal stakeholders on database activities: progress, updates, status, and next steps
- Leverage technology to develop a Service Improvement Plan (SIP) through automation and other initiatives for higher efficiency and effectiveness

Team Management
Resourcing
- Forecast talent requirements as per current and future business needs
- Hire adequate and right resources for the team
- Train direct reportees to make right recruitment and selection decisions
Talent Management
- Ensure 100% compliance with Wipro's standards of adequate onboarding and training for team members to enhance capability and effectiveness
- Build an internal talent pool of high-potential employees and ensure their career progression within the organization
- Promote diversity in leadership positions
Performance Management
- Set goals for direct reportees, conduct timely performance reviews and appraisals, and give constructive feedback to direct reports
- Ensure that organizational programs like Performance Nxt are well understood and that the team is taking the opportunities presented by such programs, for themselves and the levels below them
Employee Satisfaction and Engagement
- Lead and drive engagement initiatives for the team
- Track team satisfaction scores and identify initiatives to build engagement within the team
- Proactively challenge the team with larger and enriching projects/initiatives for the organization or team
- Exercise employee recognition and appreciation

Deliver
No. | Performance Parameter | Measure
1 | Operations of the tower | SLA adherence; knowledge management; CSAT/customer experience; identification of risk issues and mitigation plans
2 | New projects | Timely delivery; avoid unauthorised changes; no formal escalations

Mandatory Skills: Oracle Database Admin. Experience: 5-8 years.
Posted 3 weeks ago
4.0 - 7.0 years
15 - 19 Lacs
Hyderabad
Work from Office
Overview
The purpose of the MuleSoft Senior Developer/Tech Lead role is to design and articulate MuleSoft integration solutions based on strategic business and technical requirements, and to lead their implementation. The candidate will work with a variety of people to ensure that maximum value is delivered to the business, and must demonstrate progressive leadership across the full life cycle of the software development environment. Additionally, the role is critical to building internal PepsiCo knowledge of the technical solution being built and maintaining continuity as SI resources roll off.

Responsibilities
- Sound knowledge of MuleSoft integration technology and experience in developing and delivering solutions
- Collaborate with business stakeholders to understand requirements and translate them into technical solutions
- Develop and maintain complex integration flows using MuleSoft's Anypoint Platform
- Troubleshoot, debug, and optimize MuleSoft applications for performance and stability
- Mentor and guide junior developers on MuleSoft best practices and technologies
- Participate in code reviews and ensure adherence to coding standards and PepsiCo best practices
- Contribute to the development of MuleSoft integration roadmaps and strategies
- Configure and customize MuleSoft Anypoint Platform components such as API Manager, Anypoint Studio, and Runtime Manager to build scalable and reliable integration solutions
- Use deep business knowledge of MuleSoft products to assist with estimation for major new initiatives
- Troubleshoot key implementation issues and demonstrate the ability to drive them to successful resolution
- Develop MuleSoft POCs based on customer and project needs
- Contribute to Salesforce team-building activities by creating reference architectures, common patterns, data models, and re-usable assets that empower our stakeholders to maximize business value using the breadth of the MuleSoft solutions available, harvesting knowledge from existing implementations
- Create documentation of operational tasks, procedures, and automated processes
- Evangelize and educate internal stakeholders about MuleSoft trends and technology enhancements (e.g. CoPs, workshops, or conferences)

Qualifications
- Bachelor's degree in Information Technology, Computer Science, MIS, Business, or a similar field
- 9-12 years of overall IT experience with a strong focus on integration technologies and related contributions in a large enterprise
- At least 5+ years of experience and a strong understanding of API-led connectivity and API management principles, with prior experience in designing and implementing RESTful and SOAP web services
- Java, middleware, Mule ESB; strong knowledge of SOA and experience in designing Mule interfaces; fluency in web service standards such as XML, SOAP, and REST; strong understanding of RDBMS

Mandatory Technical Skills
- Mule ESB, SOA, JSON, XSD, XML
- Hands-on experience in the design and development of complex use cases using MuleSoft
- Strong governance and drive towards promotion of Mule best practices, guardrail adherence, and the API delivery model
- Good knowledge of MuleSoft administration and monitoring
- Experience with CI/CD pipelines and deployment strategies
- Agile delivery experience
- Strong oral and written communication skills
- MCD Level 1 and Level 2
Posted 3 weeks ago
6.0 - 8.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Job Title: ETL SME
Experience: 6-8 years
Location: Bangalore

Technical Skills:
- Sound knowledge of data warehousing concepts, terminology, and architecture
- Excellent ETL (including XML) design and development experience
- Good understanding of commonly used transformations such as Source Qualifier, Joiner, Union, Aggregator, Sorter, Expression, Transaction Control, XML transformations, and Lookups
- Ability to understand a mapping document and translate it into an Informatica mapping
- Ability to create CDC, SCD, and fact loads and complex mappings with Informatica
- Sound RDBMS concepts, with hands-on performance tuning of ETL/DB/SQL queries
- Good knowledge of UNIX, with the ability to create Unix/Linux shell scripts to run ETL jobs and trap any failures
- Skill in gathering and documenting user requirements and writing technical specifications
- Good knowledge of the SDLC for projects using DW concepts and client/server technologies
- Create detailed mapping documents, module specs, and unit test cases
- Hands-on Informatica development, configuration, and performance tuning

Non-Technical Skills:
- Well-developed analytical and problem-solving skills
- Strong oral and written communication skills
- Excellent team player, able to work with virtual teams
- Ability to learn quickly in a dynamic start-up environment
- Able to talk to clients directly and report to client/onsite teams
- Flexibility to work different shifts and stretch when needed
- Flexible to travel and relocate in India and abroad
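The SCD loads this role mentions can be sketched in outline (Python rather than Informatica, purely to illustrate the Type 2 slowly-changing-dimension logic; the keys and the tracked `city` attribute are invented):

```python
# Sketch of SCD Type 2 logic: when a tracked attribute changes, expire the
# current dimension row and insert a new current version, preserving history.
def scd2_merge(dimension, incoming, batch_date):
    """dimension: list of dicts with a business key, attributes,
    effective/end dates, and a current-row flag."""
    current = {row["key"]: row for row in dimension if row["is_current"]}
    for rec in incoming:
        row = current.get(rec["key"])
        if row is None:                      # brand-new key: insert as current
            dimension.append({**rec, "eff_date": batch_date,
                              "end_date": None, "is_current": True})
        elif row["city"] != rec["city"]:     # tracked attribute changed
            row["end_date"] = batch_date     # expire the old version
            row["is_current"] = False
            dimension.append({**rec, "eff_date": batch_date,
                              "end_date": None, "is_current": True})
    return dimension

dim = [{"key": 1, "city": "Pune", "eff_date": "2024-01-01",
        "end_date": None, "is_current": True}]
scd2_merge(dim, [{"key": 1, "city": "Mumbai"}], "2024-06-01")
print(len(dim))  # 2: the expired Pune row plus the current Mumbai row
```

An Informatica mapping expresses the same decision (new key vs. changed attribute vs. unchanged) with a lookup, an expression, and a router rather than Python control flow.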
Posted 3 weeks ago
5.0 - 10.0 years
15 - 25 Lacs
South Goa, Panaji, Pune
Hybrid
Deliver product features end-to-end on AWS/Azure cloud. Collaborate directly with the US client on product/feature requirements. Independently design end-to-end features/modules. Leverage Copilot/Cursor to maximize automation in the dev cycle. Oversee the QA cycle on features.

Required candidate profile: 5+ years of experience building cloud apps on AWS or Azure; Python with Django, Flask, or FastAPI; RDBMS and NoSQL databases; system architecture and design; experience across programming languages.
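The API-module design work described above follows a request-to-handler-to-JSON shape that frameworks like Flask or FastAPI formalise; a stdlib-only sketch of that shape (all route names invented):

```python
import json

# Tiny route dispatcher: maps (method, path) to a handler returning a dict,
# which is then serialised to a JSON response body.
ROUTES = {}

def route(method, path):
    def register(fn):
        ROUTES[(method, path)] = fn
        return fn
    return register

@route("GET", "/health")
def health():
    return {"status": "ok"}

@route("GET", "/items")
def list_items():
    return {"items": ["a", "b"]}

def dispatch(method, path):
    handler = ROUTES.get((method, path))
    if handler is None:
        return 404, json.dumps({"error": "not found"})
    return 200, json.dumps(handler())

print(dispatch("GET", "/health"))  # (200, '{"status": "ok"}')
```

A real framework adds path parameters, request bodies, and middleware on top of exactly this registration-and-dispatch core.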
Posted 3 weeks ago
8.0 - 10.0 years
11 - 18 Lacs
Kolkata
Work from Office
Role Responsibilities : - Design and implement data pipelines using MS Fabric. - Develop data models to support business intelligence and analytics. - Manage and optimize ETL processes for data extraction, transformation, and loading. - Collaborate with cross-functional teams to gather and define data requirements. - Ensure data quality and integrity in all data processes. - Implement best practices for data management, storage, and processing. - Conduct performance tuning for data storage and retrieval for enhanced efficiency. - Generate and maintain documentation for data architecture and data flow. - Participate in troubleshooting data-related issues and implement solutions. - Monitor and optimize cloud-based solutions for scalability and resource efficiency. - Evaluate emerging technologies and tools for potential incorporation in projects. - Assist in designing data governance frameworks and policies. - Provide technical guidance and support to junior data engineers. - Participate in code reviews and ensure adherence to coding standards. - Stay updated with industry trends and best practices in data engineering. Qualifications : - 8+ years of experience in data engineering roles. - Strong expertise in MS Fabric and related technologies. - Proficiency in SQL and relational database management systems. - Experience with data warehousing solutions and data modeling. - Hands-on experience in ETL tools and processes. - Knowledge of cloud computing platforms (Azure, AWS, GCP). - Familiarity with Python or similar programming languages. - Ability to communicate complex concepts clearly to non-technical stakeholders. - Experience in implementing data quality measures and data governance. - Strong problem-solving skills and attention to detail. - Ability to work independently in a remote environment. - Experience with data visualization tools is a plus. - Excellent analytical and organizational skills. 
- Bachelor's degree in Computer Science, Engineering, or related field. - Experience in Agile methodologies and project management.
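The extract-transform-load cycle at the heart of this role can be sketched generically (Python with stdlib sqlite3 as the target store; a Fabric pipeline expresses the same stages declaratively, and the records and schema here are invented):

```python
import sqlite3

# Extract: raw source records (stand-in for files, APIs, or source tables).
raw = [
    {"id": "1", "amount": " 100.50 ", "region": "west"},
    {"id": "2", "amount": "75", "region": "EAST"},
    {"id": "2", "amount": "75", "region": "EAST"},  # duplicate to be dropped
]

# Transform: type-cast, normalise, deduplicate -- typical data-quality steps.
seen, clean = set(), []
for rec in raw:
    if rec["id"] in seen:
        continue
    seen.add(rec["id"])
    clean.append((int(rec["id"]), float(rec["amount"]),
                  rec["region"].strip().upper()))

# Load: write the cleaned rows into the warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL, region TEXT)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", clean)
rows = conn.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone()
print(rows)  # (2, 175.5)
```

The data-quality checks the posting asks for (deduplication, type enforcement, normalisation) live in the transform stage, before anything reaches the target table.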
Posted 3 weeks ago
8.0 - 10.0 years
11 - 18 Lacs
Bengaluru
Work from Office
Company Overview : Zorba Consulting India is a leading consultancy firm focused on delivering innovative solutions and strategies to enhance business performance. With a commitment to excellence, we prioritize collaboration, integrity, and customer-centric values in our operations. Our mission is to empower organizations by transforming data into actionable insights and enabling data-driven decision-making. We are dedicated to fostering a culture of continuous improvement and supporting our team members' professional development. Role Responsibilities : - Design and implement data pipelines using MS Fabric. - Develop data models to support business intelligence and analytics. - Manage and optimize ETL processes for data extraction, transformation, and loading. - Collaborate with cross-functional teams to gather and define data requirements. - Ensure data quality and integrity in all data processes. - Implement best practices for data management, storage, and processing. - Conduct performance tuning for data storage and retrieval for enhanced efficiency. - Generate and maintain documentation for data architecture and data flow. - Participate in troubleshooting data-related issues and implement solutions. - Monitor and optimize cloud-based solutions for scalability and resource efficiency. - Evaluate emerging technologies and tools for potential incorporation in projects. - Assist in designing data governance frameworks and policies. - Provide technical guidance and support to junior data engineers. - Participate in code reviews and ensure adherence to coding standards. - Stay updated with industry trends and best practices in data engineering. Qualifications : - 8+ years of experience in data engineering roles. - Strong expertise in MS Fabric and related technologies. - Proficiency in SQL and relational database management systems. - Experience with data warehousing solutions and data modeling. - Hands-on experience in ETL tools and processes. 
- Knowledge of cloud computing platforms (Azure, AWS, GCP). - Familiarity with Python or similar programming languages. - Ability to communicate complex concepts clearly to non-technical stakeholders. - Experience in implementing data quality measures and data governance. - Strong problem-solving skills and attention to detail. - Ability to work independently in a remote environment. - Experience with data visualization tools is a plus. - Excellent analytical and organizational skills. - Bachelor's degree in Computer Science, Engineering, or related field. - Experience in Agile methodologies and project management.
Posted 3 weeks ago
8.0 - 10.0 years
11 - 18 Lacs
Ahmedabad
Work from Office
Company Overview : Zorba Consulting India is a leading consultancy firm focused on delivering innovative solutions and strategies to enhance business performance. With a commitment to excellence, we prioritize collaboration, integrity, and customer-centric values in our operations. Our mission is to empower organizations by transforming data into actionable insights and enabling data-driven decision-making. We are dedicated to fostering a culture of continuous improvement and supporting our team members' professional development. Role Responsibilities : - Design and implement data pipelines using MS Fabric. - Develop data models to support business intelligence and analytics. - Manage and optimize ETL processes for data extraction, transformation, and loading. - Collaborate with cross-functional teams to gather and define data requirements. - Ensure data quality and integrity in all data processes. - Implement best practices for data management, storage, and processing. - Conduct performance tuning for data storage and retrieval for enhanced efficiency. - Generate and maintain documentation for data architecture and data flow. - Participate in troubleshooting data-related issues and implement solutions. - Monitor and optimize cloud-based solutions for scalability and resource efficiency. - Evaluate emerging technologies and tools for potential incorporation in projects. - Assist in designing data governance frameworks and policies. - Provide technical guidance and support to junior data engineers. - Participate in code reviews and ensure adherence to coding standards. - Stay updated with industry trends and best practices in data engineering. Qualifications : - 8+ years of experience in data engineering roles. - Strong expertise in MS Fabric and related technologies. - Proficiency in SQL and relational database management systems. - Experience with data warehousing solutions and data modeling. - Hands-on experience in ETL tools and processes. 
- Knowledge of cloud computing platforms (Azure, AWS, GCP). - Familiarity with Python or similar programming languages. - Ability to communicate complex concepts clearly to non-technical stakeholders. - Experience in implementing data quality measures and data governance. - Strong problem-solving skills and attention to detail. - Ability to work independently in a remote environment. - Experience with data visualization tools is a plus. - Excellent analytical and organizational skills. - Bachelor's degree in Computer Science, Engineering, or related field. - Experience in Agile methodologies and project management.
Posted 3 weeks ago
5.0 - 7.0 years
10 - 15 Lacs
Mumbai, Ahmedabad, Bengaluru
Work from Office
Development, deployment, maintenance, and troubleshooting of backend service systems, including APIs, databases, access control, event logs, etc. Responsible for workflow automation, process optimization, and performance tuning. Able to work independently and in a team environment. Experience with backend technologies such as Docker, sockets, API services, and RDBMS. Experience with frontend technologies (HTML, JavaScript, and jQuery) is a plus. Can communicate fluently in English. Proficient in Java with some knowledge of HTML, JavaScript, and jQuery. Location: Remote - Delhi/NCR, Bangalore/Bengaluru, Hyderabad/Secunderabad, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
Posted 3 weeks ago
4.0 - 6.0 years
15 - 18 Lacs
Chennai, Jaipur, Bengaluru
Work from Office
Contract role. Proficient in Golang and NoSQL. Expert in Golang API development. Experienced in cloud platforms (AWS/Azure) and building microservices using REST APIs. Strong in ORMs (e.g., Entity Framework), RDBMS, and PL/SQL. Skilled in Angular, with effective team and project management abilities. Location: Bengaluru, Chennai, Jaipur, Pune
Posted 3 weeks ago
8.0 - 10.0 years
11 - 18 Lacs
Chennai
Work from Office
Role Responsibilities : - Design and implement data pipelines using MS Fabric. - Develop data models to support business intelligence and analytics. - Manage and optimize ETL processes for data extraction, transformation, and loading. - Collaborate with cross-functional teams to gather and define data requirements. - Ensure data quality and integrity in all data processes. - Implement best practices for data management, storage, and processing. - Conduct performance tuning for data storage and retrieval for enhanced efficiency. - Generate and maintain documentation for data architecture and data flow. - Participate in troubleshooting data-related issues and implement solutions. - Monitor and optimize cloud-based solutions for scalability and resource efficiency. - Evaluate emerging technologies and tools for potential incorporation in projects. - Assist in designing data governance frameworks and policies. - Provide technical guidance and support to junior data engineers. - Participate in code reviews and ensure adherence to coding standards. - Stay updated with industry trends and best practices in data engineering. Qualifications : - 8+ years of experience in data engineering roles. - Strong expertise in MS Fabric and related technologies. - Proficiency in SQL and relational database management systems. - Experience with data warehousing solutions and data modeling. - Hands-on experience in ETL tools and processes. - Knowledge of cloud computing platforms (Azure, AWS, GCP). - Familiarity with Python or similar programming languages. - Ability to communicate complex concepts clearly to non-technical stakeholders. - Experience in implementing data quality measures and data governance. - Strong problem-solving skills and attention to detail. - Ability to work independently in a remote environment. - Experience with data visualization tools is a plus. - Excellent analytical and organizational skills. 
- Bachelor's degree in Computer Science, Engineering, or related field. - Experience in Agile methodologies and project management.
Posted 3 weeks ago
3.0 - 5.0 years
16 - 20 Lacs
Pune
Work from Office
Key Areas of Responsibility
- Develop and maintain the Order Execution Management System by designing, developing, and supporting business requirements, following the software development lifecycle
- Individual contributor who will develop quality code
- Proactively engage and communicate with other teams such as Development, QA, and Support as required
- Ensure full compliance with all CLSA required training and regulatory requirements for the team

Requirements
- 3-5 years of hands-on experience in core Java application design and development
- Basic knowledge of RDBMS and middleware/messaging systems like Solace and Tibco EMS
- Skilled in building efficient, reusable, and reliable Java code and translating use cases into functional apps
- Proficient in optimizing performance, identifying issues, and implementing solutions
- Willing to self-learn, familiar with agile practices, and able to work independently in a fast-paced environment

Stay informed on CITIC CLSA job opportunities
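The middleware/messaging systems this role names (Solace, Tibco EMS) follow a publish/subscribe shape that can be sketched with the stdlib queue module (illustrative only; the real brokers add persistence, wildcard topics, and delivery guarantees):

```python
import queue

# Minimal topic-based pub/sub: each subscriber owns a queue, and publishing
# to a topic fans the message out to every subscriber of that topic.
class Broker:
    def __init__(self):
        self.topics = {}

    def subscribe(self, topic):
        q = queue.Queue()
        self.topics.setdefault(topic, []).append(q)
        return q

    def publish(self, topic, message):
        for q in self.topics.get(topic, []):
            q.put(message)

broker = Broker()
orders = broker.subscribe("orders")
audit = broker.subscribe("orders")
broker.publish("orders", {"symbol": "INFY", "qty": 100})
print(orders.get_nowait())  # both subscribers receive the same event
print(audit.get_nowait())
```

Decoupling the order-execution producer from its consumers through a broker like this is what makes the downstream components independently deployable.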
Posted 3 weeks ago
2.0 - 4.0 years
4 - 6 Lacs
Mumbai, Hyderabad
Work from Office
Job Responsibilities
- Hands-on (full-stack) SaaS application design and development, concentrating on artificial intelligence using big data technologies
- Facilitate solution efficiency, scalability, and technology stack leadership
- Ensure foolproof and robust applications through unit tests and other quality control measures
- Follow an agile development process and enable rapid solutions to business challenges
- Take inputs from internal and external clients and constantly strive to improve solutions
- Set up and follow software design, development, testing, and documentation best practices
- Extract and parse data from online and local data sources; clean up data and audit it for accuracy, consistency, and completeness
- Apply analytics and artificial intelligence techniques to extract valuable and actionable insights
- Data processing and visualization: summarize insights in simple yet powerful charts, reports, slides, etc.
- For internal and external client projects, use our proprietary tools for performing data engineering, analytics, and visualization activities
- Responsible for project deliveries, escalation, continuous improvement, and customer success

Candidate Profile
- Strong knowledge of Python, Django, React, Flask, Flutter, Bootstrap, HTML, CSS, JavaScript, SQL
- Expertise in full-stack web application development is required
- Proficiency in software development: conceptualizing, designing, coding, debugging
- Prior background in AI, ML, deep learning, image processing, and computer vision is preferred
- Strong in algorithm design, database programming (RDBMS), and text analytics
- Knowledge of MongoDB, Redis, Docker, containerization, Kubernetes, AWS, and Kafka is preferred
- High problem-solving skills: able to logically break down problems into incremental milestones, prioritize high-impact deliverables first, identify bottlenecks, and work around them
- Self-learner: highly curious, a self-starter, able to work with minimum supervision and guidance
- Entrepreneurial mindset with a positive attitude is a must
- Track record of excellence in academic or non-academic areas, with significant accomplishments

Qualifications: E / B. 3 to 8 years of relevant experience. (ref:hirist.tech)
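The "clean up data, audit data for accuracy, consistency and completeness" step in the responsibilities above can be sketched as follows (field names and validation rules are invented for illustration):

```python
# Sketch of a clean-up and audit pass: normalise each record's fields, then
# report how many records fail each completeness/validity rule.
def clean(record):
    return {
        "name": record.get("name", "").strip().title(),
        "email": record.get("email", "").strip().lower(),
        "age": int(record["age"]) if str(record.get("age", "")).isdigit() else None,
    }

def audit(records):
    issues = {"missing_email": 0, "bad_age": 0}
    for rec in records:
        if not rec["email"]:
            issues["missing_email"] += 1
        if rec["age"] is None:
            issues["bad_age"] += 1
    return issues

raw = [
    {"name": "  asha rao ", "email": "Asha@Example.COM", "age": "34"},
    {"name": "ben", "email": "", "age": "n/a"},
]
cleaned = [clean(r) for r in raw]
print(cleaned[0])      # {'name': 'Asha Rao', 'email': 'asha@example.com', 'age': 34}
print(audit(cleaned))  # {'missing_email': 1, 'bad_age': 1}
```

Separating normalisation (`clean`) from measurement (`audit`) keeps the audit numbers reproducible: the same rules run over every batch, so data-quality trends are comparable over time.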
Posted 3 weeks ago
2.0 - 5.0 years
4 - 7 Lacs
Chennai
Work from Office
Experience: 2+ yrs of experience in IT, with At least 1+ years of experience with cloud and system administration. At least 2 years of experience with and strong understanding of 'big data' technologies in Hadoop ecosystem Hive, HDFS, Map/Reduce, Flume, Pig, Cloudera, HBase Sqoop, Spark etc. Job Overview: Smartavya Analytica Private Limited is seeking an experienced Hadoop Administrator to manage and support our Hadoop ecosystem. The ideal candidate will have strong expertise in Hadoop cluster administration, excellent troubleshooting skills, and a proven track record of maintaining and optimizing Hadoop environments. Key Responsibilities: • Install, configure, and manage Hadoop clusters, including HDFS, YARN, Hive, HBase, and other ecosystem components. Monitor and manage Hadoop cluster performance, capacity, and security. Perform routine maintenance tasks such as upgrades, patching, and backups. Implement and maintain data ingestion processes using tools like Sqoop, Flume, and Kafka. Ensure high availability and disaster recovery of Hadoop clusters. Collaborate with development teams to understand requirements and provide appropriate Hadoop solutions. Troubleshoot and resolve issues related to the Hadoop ecosystem. Maintain documentation of Hadoop environment configurations, processes, and procedures. Requirement: • Experience in Installing, configuring and tuning Hadoop distributions. Hands on experience in Cloudera. Understanding of Hadoop design principals and factors that affect distributed system performance, including hardware and network considerations. Provide Infrastructure Recommendations, Capacity Planning, work load management. Develop utilities to monitor cluster better Ganglia, Nagios etc. 
• Manage large clusters with huge volumes of data; perform cluster maintenance tasks such as creation and removal of nodes, cluster monitoring, and troubleshooting.
• Manage and review Hadoop log files.
• Install and implement security for Hadoop clusters.
• Install Hadoop updates, patches, and version upgrades; automate these through scripts.
• Act as the point of contact for vendor escalation; work with Hortonworks to resolve issues.
• Conceptual/working knowledge of basic data management concepts such as ETL, reference/master data, data quality, and RDBMS.
• Working knowledge of a scripting language such as Shell, Python, or Perl.
• Experience with orchestration and deployment tools.
Academic Qualification:
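A monitoring utility of the kind the posting mentions might, for example, parse capacity figures out of `hdfs dfsadmin -report` output and alert past a usage threshold. A minimal sketch: the report fields matched and the 80% threshold are illustrative assumptions, and real report layouts vary by Hadoop version.

```python
import re

def parse_dfsadmin_report(report: str) -> dict:
    """Extract capacity figures from `hdfs dfsadmin -report`-style text.

    The field names matched here are illustrative; check the exact
    layout emitted by your Hadoop version.
    """
    stats = {}
    for key, label in [("configured", "Configured Capacity"),
                       ("used", "DFS Used"),
                       ("remaining", "DFS Remaining")]:
        m = re.search(rf"{label}:\s*(\d+)", report)
        if m:
            stats[key] = int(m.group(1))
    return stats

def capacity_alert(stats: dict, threshold: float = 0.80) -> bool:
    """Return True when DFS usage exceeds the given fraction of capacity."""
    if not stats.get("configured"):
        return False
    return stats["used"] / stats["configured"] > threshold

# Sample report text (invented numbers, in bytes).
sample = """Configured Capacity: 1000000
DFS Used: 850000
DFS Remaining: 150000"""
stats = parse_dfsadmin_report(sample)
print(capacity_alert(stats))  # usage is 85%, above the 80% threshold
```

In practice a script like this would be wired into Nagios or a cron job rather than printing to stdout.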
Posted 3 weeks ago
8.0 - 10.0 years
11 - 18 Lacs
Hyderabad
Work from Office
Role Responsibilities:
- Design and implement data pipelines using MS Fabric.
- Develop data models to support business intelligence and analytics.
- Manage and optimize ETL processes for data extraction, transformation, and loading.
- Collaborate with cross-functional teams to gather and define data requirements.
- Ensure data quality and integrity in all data processes.
- Implement best practices for data management, storage, and processing.
- Conduct performance tuning for data storage and retrieval for enhanced efficiency.
- Generate and maintain documentation for data architecture and data flow.
- Participate in troubleshooting data-related issues and implement solutions.
- Monitor and optimize cloud-based solutions for scalability and resource efficiency.
- Evaluate emerging technologies and tools for potential incorporation in projects.
- Assist in designing data governance frameworks and policies.
- Provide technical guidance and support to junior data engineers.
- Participate in code reviews and ensure adherence to coding standards.
- Stay updated with industry trends and best practices in data engineering.

Qualifications:
- 8+ years of experience in data engineering roles.
- Strong expertise in MS Fabric and related technologies.
- Proficiency in SQL and relational database management systems.
- Experience with data warehousing solutions and data modeling.
- Hands-on experience in ETL tools and processes.
- Knowledge of cloud computing platforms (Azure, AWS, GCP).
- Familiarity with Python or similar programming languages.
- Ability to communicate complex concepts clearly to non-technical stakeholders.
- Experience in implementing data quality measures and data governance.
- Strong problem-solving skills and attention to detail.
- Ability to work independently in a remote environment.
- Experience with data visualization tools is a plus.
- Excellent analytical and organizational skills.
- Bachelor's degree in Computer Science, Engineering, or related field. - Experience in Agile methodologies and project management.
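The pipeline and data-quality responsibilities above can be sketched minimally in Python. An in-memory SQLite database stands in for the source and target stores (MS Fabric exposes its own APIs; the `raw_orders`/`orders` tables, columns, and the quality rule are invented for illustration).

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Extract raw rows, apply a data-quality rule, load the clean rows."""
    cur = conn.cursor()
    # Extract: read raw order rows from the staging table.
    rows = cur.execute("SELECT id, amount, currency FROM raw_orders").fetchall()
    # Transform: normalise currency codes and drop rows that fail a
    # simple quality rule (non-positive amounts).
    clean = [(i, a, c.upper()) for i, a, c in rows if a > 0]
    # Load: write into the curated table.
    cur.executemany("INSERT INTO orders VALUES (?, ?, ?)", clean)
    conn.commit()
    return len(clean)

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (id INTEGER, amount REAL, currency TEXT);
    CREATE TABLE orders (id INTEGER, amount REAL, currency TEXT);
    INSERT INTO raw_orders VALUES
        (1, 120.0, 'usd'), (2, -5.0, 'eur'), (3, 40.0, 'inr');
""")
print(run_etl(conn))  # 2 of the 3 rows pass the quality rule
```

A production pipeline would add logging, rejected-row capture, and idempotent loads, but the extract/transform/load split is the same.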
Posted 3 weeks ago
7.0 - 12.0 years
8 - 14 Lacs
Hyderabad
Work from Office
The Role: We're looking for a hands-on Engineering Lead who combines technical depth with product intuition. You'll guide a team of junior engineers (currently India-based), set the standard for modern software practices, and deliver robust tools that enable brokers to submit deals, SMEs to provide data, underwriters to assess risk, and internal teams to manage loans across their lifecycle. You'll also drive adoption of automation and AI where it meaningfully enhances speed, accuracy, or user experience.

You Will:
- Set technical direction and long-term vision for the platform, aligning engineering efforts with business objectives.
- Lead design and development of front-end and back-end features across the full SME lending platform, including onboarding, underwriting, funding, and loan servicing modules.
- Work closely with product leadership to shape features, scope timelines, and deliver iteratively.
- Mentor junior developers, review code, and enforce high standards across the team.
- Drive mobile-friendly UI development using React (or similar frameworks).
- Own integration of external APIs for Open Banking, KYC/KYB, credit scoring, and more.
- Establish CI/CD pipelines, testing frameworks, and modern release practices.
- Implement robust monitoring, alerting, and incident response processes to maintain system reliability.
- Push innovation, whether it's using AI tools to accelerate dev cycles or improving data pipelines for underwriters.
- Act as technical point of contact for both Albatross and Recognise Bank, collaborating across teams.

You Should Have:
- 7-12 years of experience in full-stack or front-end development roles.
- 3+ years leading engineering teams or mentoring junior developers.
- Proven ability to uplift inexperienced teams and introduce structured engineering practices from scratch.
- Deep familiarity with React and JavaScript/TypeScript (Node.js, Python, or Java on the back-end a plus, but not required).
- A track record of shipping responsive, mobile-first web applications.
- Experience designing and working with relational databases (e.g. PostgreSQL, MySQL) and an understanding of data modeling for lending workflows.
- Experience integrating third-party APIs (Open Banking, KYC/KYB, credit bureaus).
- Solid grasp of Git, CI/CD, DevOps, and testing frameworks.
- Familiarity with banking or fintech, especially onboarding flows, CRMs, or SME lending.
- Experience building platforms with multiple user types (e.g. brokers, SMEs, underwriters).
- Understanding of secure development practices and sensitivity to data privacy in financial applications.
- A strong sense of product: able to work without rigid specs and propose better solutions.
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
- Bonus: experience leveraging AI/automation tools (e.g. Claude, internal LLMs) to improve development velocity or reduce manual effort.
Posted 3 weeks ago
6.0 - 8.0 years
30 - 32 Lacs
Hyderabad
Work from Office
Requirements:
- Minimum 5+ years of hands-on experience with Snowflake and a total of 9+ years of industry experience.
- Strong understanding of database concepts and data warehouse architectures.
- Expertise in SQL querying, ensuring efficient data manipulation and retrieval.
- Skilled in designing and implementing Snowflake cloud data warehouse architectures and data modeling.
- Experience in migration projects, transitioning on-premise systems to Snowflake.
- Comprehensive knowledge of Snowflake features, including Snowpipe, Stages, SnowSQL, Streams, and Tasks.
- Proficiency in advanced Snowflake concepts, such as resource monitoring, RBAC controls, virtual warehouse sizing, and Zero Copy Clone.
- In-depth understanding of data migration from RDBMS to Snowflake cloud environments.
- Ability to deploy Snowflake capabilities, such as data sharing, event-driven architectures, and lakehouse patterns.
- Well-versed in incremental extraction loads, handling both batch processing and real-time streaming.
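Incremental extraction loads like those named above are typically driven by a high-water mark. A minimal sketch, with an in-memory SQLite table standing in for the Snowflake source (a real job would go through the Snowflake connector, and the `events` table and `loaded_at` watermark column are invented for illustration):

```python
import sqlite3

def extract_incremental(conn, last_watermark: int):
    """Fetch only rows newer than the stored watermark, and advance it."""
    rows = conn.execute(
        "SELECT id, loaded_at FROM events WHERE loaded_at > ? ORDER BY loaded_at",
        (last_watermark,),
    ).fetchall()
    # Advance the watermark to the newest row seen; unchanged if no rows.
    new_watermark = rows[-1][1] if rows else last_watermark
    return rows, new_watermark

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (id INTEGER, loaded_at INTEGER);
    INSERT INTO events VALUES (1, 100), (2, 150), (3, 200);
""")
batch, wm = extract_incremental(conn, 100)  # picks up only rows after t=100
print(batch, wm)
```

Snowflake's Streams feature offers a managed alternative to hand-rolled watermarks by tracking change records on a table directly.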
Posted 3 weeks ago
8.0 - 10.0 years
11 - 18 Lacs
Surat
Work from Office
Role Responsibilities:
- Design and implement data pipelines using MS Fabric.
- Develop data models to support business intelligence and analytics.
- Manage and optimize ETL processes for data extraction, transformation, and loading.
- Collaborate with cross-functional teams to gather and define data requirements.
- Ensure data quality and integrity in all data processes.
- Implement best practices for data management, storage, and processing.
- Conduct performance tuning for data storage and retrieval for enhanced efficiency.
- Generate and maintain documentation for data architecture and data flow.
- Participate in troubleshooting data-related issues and implement solutions.
- Monitor and optimize cloud-based solutions for scalability and resource efficiency.
- Evaluate emerging technologies and tools for potential incorporation in projects.
- Assist in designing data governance frameworks and policies.
- Provide technical guidance and support to junior data engineers.
- Participate in code reviews and ensure adherence to coding standards.
- Stay updated with industry trends and best practices in data engineering.

Qualifications:
- 8+ years of experience in data engineering roles.
- Strong expertise in MS Fabric and related technologies.
- Proficiency in SQL and relational database management systems.
- Experience with data warehousing solutions and data modeling.
- Hands-on experience in ETL tools and processes.
- Knowledge of cloud computing platforms (Azure, AWS, GCP).
- Familiarity with Python or similar programming languages.
- Ability to communicate complex concepts clearly to non-technical stakeholders.
- Experience in implementing data quality measures and data governance.
- Strong problem-solving skills and attention to detail.
- Ability to work independently in a remote environment.
- Experience with data visualization tools is a plus.
- Excellent analytical and organizational skills.
- Bachelor's degree in Computer Science, Engineering, or related field. - Experience in Agile methodologies and project management.
Posted 3 weeks ago
13.0 - 17.0 years
22 - 30 Lacs
Pune
Hybrid
Primary Skills: SQL (Data Analysis and Development)
Alternate Skills: Python, SharePoint, AWS, ETL, Telecom (especially the Fixed Network domain)
Location: Pune
Working Persona: Hybrid
Experience: 13 to 18 years

Core competencies, knowledge and experience:
Essential:
- Strong SQL experience, at an advanced level
- Excellent data interpretation skills
- Good knowledge of ETL and business intelligence; good understanding of a range of data manipulation and analysis techniques
- Working knowledge of large information technology development projects using methodologies and standards
- Excellent verbal, written and interpersonal communication skills, demonstrating the ability to communicate information technology concepts to non-technology personnel; able to interact with the business team and share ideas
- Strong analytical, problem-solving and decision-making skills; able to plan and organize work to deliver as agreed
- Ability to work under pressure to tight deadlines
- Hands-on experience working with large datasets
- Able to manage different stakeholders

Good to Have / Alternate Skills:
- Strong coding experience in Python

Experience:
- In-depth working experience in ETL
- Fixing problems in cooperation with internal and external partners (e.g. service owner, tech support team, IT Ops)
- Designing and implementing changes to the existing components of the data flow
- Developing and maintaining the end-to-end data flow
- Maintaining data quality, resolving data consistency issues, and supporting business-critical processes
- Conducting preventative maintenance of the systems
- Driving system optimization and simplification
- Responsible for the performance of the data flow and optimisation of the data preparation in conjunction with the other technical teams
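As a small illustration of the SQL data-analysis work this role centres on, here is an aggregation query run through Python's built-in sqlite3 module; the schema and figures are invented for the example.

```python
import sqlite3

# Invented usage table: data volume (GB) per telecom circle per month.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE usage (circle TEXT, month TEXT, gb REAL);
    INSERT INTO usage VALUES
        ('Pune', '2024-01', 10.0), ('Pune', '2024-02', 14.0),
        ('Mumbai', '2024-01', 8.0), ('Mumbai', '2024-02', 6.0);
""")
# Aggregate total volume per circle, largest first.
rows = conn.execute("""
    SELECT circle, SUM(gb) AS total_gb
    FROM usage
    GROUP BY circle
    ORDER BY total_gb DESC
""").fetchall()
print(rows)  # [('Pune', 24.0), ('Mumbai', 14.0)]
```

The same GROUP BY / ORDER BY pattern carries over directly to the larger warehouse SQL dialects the posting implies.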
Posted 3 weeks ago
6.0 - 10.0 years
30 - 35 Lacs
Pune
Work from Office
Work mode: currently remote, but this is not permanent WFH; once needed, it will be work from office. Mandatory: data development & engineering, Azure data tools, DE, Python & PySpark, Architect (2+ years), data lake, relational databases (RDBMS), Oracle & AWS. Required Candidate Profile: Provide support & technical governance, offering expertise related to cloud architectures, deployment, & operations. Apply Agile & DevOps methodologies & implementation approaches in project delivery.
Posted 3 weeks ago
6.0 - 10.0 years
15 - 18 Lacs
Hyderabad, Bengaluru
Work from Office
Client is looking for a strong Java candidate with the following skills. Spring Webflux and streaming knowledge is a must and is the key thing they are looking for. Here is the overall JD for the position.

- RDBMS: at least 1 year (required)
- CI/CD: 2-5 years (required)
- Cloud Computing: 2-5 years (required)
- Core Java: 5-10 years (required)
- Kubernetes: 2-5 years (required)
- Microservices: 2-5 years (required)
- MongoDB: at least 1 year (nice to have)
- NoSQL: at least 1 year (nice to have)
- Python: at least 1 year (required)
- Spring Boot: 5-10 years (required)
- Spring Data: 2-5 years (required)
- Spring Security: 2-5 years (required)
- Spring Webflux: at least 1 year (required)
- Stream processing: at least 1 year (required)
- Java 17: 2-5 years (required)
- Apache Kafka: at least 1 year (required)
- Apache SOLR: at least 1 year (required)

Expertise with solution design and large-scale enterprise application development. In-depth knowledge of integration patterns, integration technologies and integration platforms. Experience with queuing-related technologies like Kafka. Good hands-on experience designing and building cloud-ready applications. Good programming skills in Java, Python, etc. Proficiency with dev/build tools: git, maven, gradle. Experience with modern NoSQL/Graph DB/data streaming technologies is a plus. Good understanding of Agile software development methodology.

Location: Hyderabad, Mangalore, Bhubaneswar, Trivandrum
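The posting's stack is Java (Spring Webflux, Kafka), but the stream-processing style it asks about can be sketched language-neutrally. A minimal, purely illustrative Python generator pipeline showing lazy stages composed over a source:

```python
# Each stage consumes a lazy stream and yields a transformed stream,
# the same shape as a reactive pipeline in Webflux or a Kafka Streams
# topology (which the real role would use instead).
def source(events):
    yield from events            # stand-in for a topic/publisher

def parse(stream):
    for raw in stream:
        yield raw.strip().lower()  # normalise each event

def dedupe(stream):
    seen = set()
    for item in stream:
        if item not in seen:     # stateful stage: drop repeats
            seen.add(item)
            yield item

pipeline = dedupe(parse(source([" Alpha", "beta", "ALPHA ", "gamma"])))
print(list(pipeline))  # ['alpha', 'beta', 'gamma']
```

Nothing is computed until the pipeline is consumed, which is the key property shared with reactive streams: backpressure-friendly, element-at-a-time processing.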
Posted 3 weeks ago
1.0 - 5.0 years
9 - 13 Lacs
Chennai
Work from Office
Dear Aspirant! We empower our people to stay resilient and relevant in a constantly changing world. We're looking for people who are always searching for creative ways to grow and learn. People who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you'd make a great addition to our vibrant international team. We are looking for a Solution Engineer.

You'll make an impact by:
- Being adept at driving resolution of customer issues and bug fixes, acting as a customer advocate while adhering to support service standards.
- Providing work estimates and regular status updates.
- Actively monitoring and improving the environment to ensure 100% application availability.
- Deploying, configuring, and testing Smart Grid Application solutions.
- Being on-call on a weekly rotation basis throughout the year to help enable 24x7 global support.
- Escalating and logging defects for more advanced customer issues to Product and Project Management teams when necessary.
- Ensuring that SLAs of assigned customer-reported issues are met within TAT.
- Actively participating in team and departmental meetings by providing feedback on current day-to-day activity and recommendations for improvement.

Use your skills to move the world forward!

Education: BE degree in Computer/Electronics/Electrical Engineering or a relevant field.
Experience: 3-7 years.
- Strong problem-solving and software-troubleshooting skills.
- Experience in core Java in application support.
- Understanding of ITIL concepts; ITIL certification preferred.
- Working knowledge of the UNIX environment with an understanding of Shell/Python scripting.
- Technical troubleshooting skills for large RDBMS like Oracle, with sound knowledge of SQL.
- Knowledge and troubleshooting experience of core Java, Kafka and XML standards to support complex enterprise software.
- Knowledge of basic networking concepts like WWW, FTP, TELNET, SSH, etc.
- Knowledge of web-enabled support tools like SOAP UI.
Ability to effectively communicate technical issues, prioritize them and enjoy working with customers. Apache Tomcat Administration skills or Administration of other web applications. Excellent people and communication skills. Impeccable English speaking and writing skills. Excellent interpersonal skills and should be an active listener and learner. Create a better #TomorrowWithUs! This role is based in Chennai, where you'll get the chance to work with teams impacting entire cities, countries - and the shape of things to come. We're Siemens. A collection of over 312,000 minds building the future, one day at a time in over 200 countries. We're dedicated to equality, and we encourage applications that reflect the diversity of the communities we work in. All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and imagination and help us shape tomorrow. Find out more about Siemens careers at www.siemens.com/careers Find out more about the Digital world of Siemens here www.siemens.com/careers/digitalminds
Posted 3 weeks ago
The job market for RDBMS (Relational Database Management System) professionals in India is thriving, with a high demand for skilled individuals who can design, implement, and manage relational databases. Companies across various industries are actively seeking RDBMS experts to maintain their data infrastructure and ensure efficient data management.
These cities are known for their vibrant tech scenes and offer numerous opportunities for RDBMS professionals.
The average salary range for RDBMS professionals in India varies based on experience and expertise. Entry-level positions can expect to earn around INR 3-5 lakhs per annum, while experienced professionals can command salaries ranging from INR 8-15 lakhs per annum.
A typical career path in RDBMS starts with roles such as Junior Database Administrator or Database Developer. As professionals gain experience and expertise, they can progress to roles like Senior Database Administrator, Data Architect, or Database Manager. Eventually, experienced individuals may advance to positions such as Tech Lead or Database Architect.
In addition to RDBMS expertise, professionals in this field are often expected to have knowledge of: - SQL programming - Database design principles - Data modeling - Performance tuning - Database security
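On the database-security point in the list above, the baseline skill is parameter binding rather than string concatenation. A minimal sketch with Python's built-in sqlite3 module (table and data invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('asha', 'admin'), ('ravi', 'dev')")

# A classic injection payload. Because it is bound as a parameter, the
# driver treats it as a literal name, never as SQL.
malicious = "asha' OR '1'='1"
rows = conn.execute(
    "SELECT role FROM users WHERE name = ?", (malicious,)
).fetchall()
print(rows)  # [] -- no user is literally named "asha' OR '1'='1"
```

Had the query been built by string concatenation, the same payload would have returned every row; parameterised queries (plus least-privilege accounts) are the first line of defence interviewers tend to probe.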
As you explore opportunities in the RDBMS job market in India, remember to showcase your expertise in relational databases and related skills during interviews. Prepare thoroughly, demonstrate your knowledge confidently, and you'll be well-positioned to land a rewarding career in this field. Good luck!