
60 Partitioning Jobs - Page 2

JobPe aggregates job listings for convenient browsing, but you apply directly on the employer's own job portal.

5.0 - 8.0 years

7 - 10 Lacs

Chennai

Work from Office

Hands-on experience in data modelling for both OLTP and OLAP systems. In-depth knowledge of conceptual, logical, and physical data modelling. Strong understanding of, and practical experience with, indexing, partitioning, and data sharding. Experience identifying and addressing factors that affect database performance for near-real-time reporting and application interaction. Proficiency with at least one data modelling tool (preferably DbSchema). Functional knowledge of the mutual fund industry is a plus. Familiarity with GCP databases such as AlloyDB, Cloud SQL, and BigQuery. Willingness to work from the Chennai customer site; office presence is mandatory, five days on-site each week. Mandatory Skills: Cloud-PaaS-GCP-Google Cloud Platform. Experience: 5-8 Years.
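The sharding requirement above is easiest to reason about with a concrete routing rule. Below is a minimal, hypothetical sketch of hash-based sharding in plain Python (stdlib only; the key format and shard count are invented for illustration): a stable hash of the record key picks the shard.

```python
import hashlib

def shard_for(key: str, num_shards: int = 4) -> int:
    """Map a record key to a shard deterministically.

    Uses a stable digest (SHA-256) rather than Python's built-in hash(),
    which is randomized per process and unsuitable for routing.
    """
    digest = hashlib.sha256(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_shards

# Route a few hypothetical customer IDs to shards.
keys = ["CUST-1001", "CUST-1002", "CUST-1003"]
placement = {k: shard_for(k) for k in keys}

# The same key always lands on the same shard, within range.
assert shard_for("CUST-1001") == shard_for("CUST-1001")
assert all(0 <= s < 4 for s in placement.values())
```

Real systems often prefer consistent hashing over a plain modulo, so that adding a shard relocates only a fraction of the keys rather than nearly all of them.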

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Maharashtra

On-site

As a Data Engineer at Blis, you will be part of a globally recognized and award-winning team that specializes in big data analytics and advertising. We collaborate with iconic brands like McDonald's, Samsung, and Mercedes Benz, providing precise audience insights to help them target their ideal customers effectively. Upholding ethical data practices and privacy rights is at the core of our operations, and we are committed to ensuring outstanding performance and reliability in all our systems. Working at Blis means being part of an international company with a diverse culture, spanning across four continents and comprising over 300 team members. Headquartered in the UK, we are financially successful and poised for continued growth, offering you an exciting opportunity to contribute to our journey. Your primary responsibility as a Data Engineer will involve designing and implementing high-performance data pipelines on Google Cloud Platform (GCP) to handle massive amounts of data efficiently. With a focus on scalability and automation, you will play a crucial role in building secure pipelines that can process over 350GB of data per hour and respond to 400,000 decision requests each second. Your expertise will be instrumental in driving improvements in data architecture, optimizing resource utilization, and delivering fast, accurate insights to stakeholders. Collaboration is key at Blis, and you will work closely with product and engineering teams to ensure that our data infrastructure evolves to support new initiatives seamlessly. Additionally, you will mentor and support team members, fostering a collaborative environment that encourages knowledge sharing, innovation, and professional growth. To excel in this role, you should have at least 5 years of hands-on experience with large-scale data systems, with a strong focus on designing and maintaining efficient data pipelines. 
Proficiency in Apache Druid and Imply platforms, along with expertise in cloud-based services like GCP, is essential. You should also have a solid understanding of Python for building and optimizing data flows, as well as experience with data governance and quality assurance practices. Furthermore, familiarity with event-driven architectures, tools like Apache Airflow, and distributed processing frameworks such as Spark will be beneficial. Your ability to apply complex algorithms and statistical techniques to large datasets, along with experience in working with relational databases and non-interactive reporting solutions, will be valuable assets in this role. Joining the Blis team means engaging in high-impact work in a data-intensive environment, collaborating with brilliant engineers, and being part of an innovative culture that prioritizes client obsession and agility. With a global reach and a commitment to diversity and inclusion, Blis offers a dynamic work environment where your contributions can make a tangible difference in the world of advertising technology.

Posted 3 weeks ago

Apply

4.0 - 9.0 years

5 - 9 Lacs

Noida

Work from Office

Job Title: Senior Draftsman (Interior Detailing)
Location: Head Office, Sector 127, Noida
Company: Innovatiview
Experience Required: 4-7 Years
Type: Full-time

Role & responsibilities: Prepare precise and detailed working drawings for interiors, including partitions, furniture layouts, ceiling designs, wall paneling, and flooring. Convert concept designs into clear, construction-ready drawings with minimal supervision. Develop detailed drawings for specialized interior spaces such as control rooms, biometric zones, server rooms, and support areas. Coordinate with architectural, MEP, and site teams to ensure technical alignment and clarity. Ensure drawing accuracy, proper layering, annotation, and adherence to drafting standards.

Preferred candidate profile: Diploma or Degree in Architecture, Interior Design, or a related discipline. 4-7 years of relevant experience in architectural/interior drafting, preferably in institutional or technical infrastructure projects. Strong proficiency in AutoCAD; knowledge of Revit, SketchUp, or other 3D software is a plus. Excellent knowledge of interior detailing, modular furniture, custom joinery, and service coordination. Self-motivated and reliable, capable of delivering high-quality outputs without close supervision. Familiarity with on-site conditions and construction practices is essential.

We are looking for a technically sound and detail-oriented individual who can plug into ongoing projects from day one. Prior experience in control rooms, secure exam centers, or similar institutional interiors will be a strong advantage.

Posted 1 month ago

Apply

1.0 - 3.0 years

4 - 9 Lacs

Hyderabad

Work from Office

Key Responsibilities: Create and maintain optimal data pipeline architecture. Assemble large, complex data sets that meet functional and non-functional business requirements. Identify, design, and implement internal process improvements using Python and open-source technologies: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud database technologies. Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs. Keep our data separated and secure across national boundaries through multiple data centers and regions. Work with data and analytics experts to strive for greater functionality in our data systems. Test databases and perform bug fixes. Develop best practices for database design and development activities. Quickly analyze existing SQL code and make improvements to enhance performance, take advantage of new SQL features, close security gaps, and increase robustness and maintainability of the code. Take on technical leadership of database projects across various scrum teams. Manage exploratory data analysis to support database and dashboard development.

Required Skills:
- Expert knowledge of databases like PostgreSQL (preferably cloud-hosted on one or more of AWS, Azure, or GCP) and any cloud-based data warehouse (like Snowflake or Azure Synapse), with strong programming experience in SQL
- Competence in data preparation and/or ETL tools like SnapLogic, Matillion, Azure Data Factory, AWS Glue, and SSIS (preferably strong working experience in one or more) to build and maintain data pipelines and flows
- Understanding of data modeling techniques and working knowledge of OLTP and OLAP systems
- Deep knowledge of databases, stored procedures, and optimization over very large data sets
- In-depth knowledge of ingestion techniques, data cleaning, de-duplication, and partitioning
- Experience building the infrastructure required for data ingestion and analytics
- Ability to fine-tune report-generating queries
- Solid understanding of normalization and denormalization, database exception handling, transactions, query profiling, performance counters, debugging, and database and query optimization techniques
- Understanding of index design and performance-tuning techniques
- Familiarity with SQL security techniques such as column-level data encryption, Transparent Data Encryption (TDE), signed stored procedures, and assignment of user permissions
- Experience understanding source data from various platforms and mapping it into Entity Relationship (ER) models for data integration and reporting
- Adherence to standards for all databases, e.g., data models, data architecture, and naming conventions
- Exposure to source control such as Git and Azure DevOps
- Understanding of Agile methodologies (Scrum, Kanban)
- Preferably, experience with NoSQL databases and migrating data into other types of databases with real-time replication
- Experience with automated testing and coverage tools
- Experience with CI/CD automation tools (desirable)
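Several of the skills listed above (ingestion, data cleaning, de-duplication) boil down to a keep-the-latest-record pattern. A minimal sketch in plain Python, with invented record fields, assuming each record carries a business key and an update timestamp:

```python
from datetime import datetime

# Hypothetical ingested rows: id 1 arrives twice with different timestamps.
rows = [
    {"id": 1, "updated_at": datetime(2024, 1, 1), "value": "old"},
    {"id": 1, "updated_at": datetime(2024, 3, 1), "value": "new"},
    {"id": 2, "updated_at": datetime(2024, 2, 1), "value": "only"},
]

def dedupe_latest(records):
    """Keep one record per id: the one with the greatest updated_at."""
    latest = {}
    for r in records:
        kept = latest.get(r["id"])
        if kept is None or r["updated_at"] > kept["updated_at"]:
            latest[r["id"]] = r
    return list(latest.values())

deduped = dedupe_latest(rows)
assert len(deduped) == 2
assert next(r for r in deduped if r["id"] == 1)["value"] == "new"
```

In SQL the same pattern is typically expressed with ROW_NUMBER() OVER (PARTITION BY id ORDER BY updated_at DESC) and a filter on row number 1.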

Posted 1 month ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Pune

Work from Office

5+ years of overall experience, with a minimum of 2 years in Regulatory Reporting and exposure to Axiom. Implementation experience, preferably in the Banking domain. Transformation of business requirements into technical needs and implementation procedures. Ability to work independently. Good communication skills. Experience in software design and development, primarily in the Data Warehousing and Regulatory Reporting area of the Banking and Finance domain. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Good experience writing simple and complex SQL queries using all types of joins, analytical functions, WITH clauses, subqueries, and set operators. Well versed in, and with good exposure to, Oracle SQL*Loader, partitioning, and performance tuning. Strong hands-on experience with Oracle procedural language (PL/SQL): procedures, functions, packages, views, temporary tables, collections, etc. Knowledge of UNIX, shell scripting, and AutoSys jobs preferred. Preferred technical and professional experience: Experience with the Axiom Controller View regulatory reporting tool, version 9x/10x. Experience with Axiom implementations for different regulations.

Posted 1 month ago

Apply

6.0 - 9.0 years

27 - 42 Lacs

Pune

Work from Office

About the role: As a Big Data Engineer, you will make an impact by identifying and closing consulting services in the major UK banks. You will be a valued member of the BFSI team and work collaboratively with your manager, primary team, and other stakeholders in the unit. In this role, you will: Collaborate with cross-functional teams to improve data ingestion, transformation, and validation workflows. Work closely with Data Engineers, Architects, and Analysts to understand data reconciliation requirements. Develop and implement PySpark programs to process large datasets on big data platforms. Analyze and comprehend existing data ingestion and reconciliation frameworks. Perform complex transformations, including reconciliation and advanced data manipulations. Fine-tune Spark jobs for performance optimization, ensuring efficient data processing at scale.

Work model: We believe hybrid work is the way forward as we strive to provide flexibility wherever possible. Based on this role's business requirements, this is a hybrid position requiring 3 days a week in a client or Cognizant office in Pune or Hyderabad. Regardless of your working arrangement, we are here to support a healthy work-life balance through our various wellbeing programs.

What you must have to be considered: Design and implement data pipelines, ETL processes, and data storage solutions that support data-intensive applications. Extensive hands-on experience with Python and PySpark. Strong grasp of data warehousing concepts, and well versed in processing structured and semi-structured data (JSON, XML, Avro, Parquet) with Spark/PySpark data pipelines. Experience working with large-scale distributed data processing and a solid understanding of Big Data architecture and distributed computing frameworks. Proficiency in Python and the Spark DataFrame API, and strong experience in complex data transformations using PySpark.

These will help you stand out: Able to leverage Python libraries such as cryptography or pycryptodome along with PySpark's User Defined Functions (UDFs) to encrypt and decrypt data within your Spark workflows. Experience with data risk metrics in PySpark; excellent at data partitioning, Z-value generation, query optimization, and spatial data processing and optimization. Experience with CI/CD for data pipelines. Working experience in at least one cloud environment (AWS, Azure, or GCP) is a must. Proven experience in an Agile/Scrum team environment. Experience developing loosely coupled, API-based systems.

We're excited to meet people who share our mission and can make an impact in a variety of ways. Don't hesitate to apply, even if you only meet the minimum requirements listed. Think about your transferable experiences and unique skills that make you stand out as someone who can bring new and exciting things to this role.
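The reconciliation work described above largely reduces to comparing key sets between a source and a target. A hedged, pure-Python sketch of the core logic (the keys are invented; in a real pipeline this would typically be PySpark DataFrames compared with left-anti joins):

```python
def reconcile(source_keys, target_keys):
    """Compare source and target key sets and report discrepancies."""
    source, target = set(source_keys), set(target_keys)
    return {
        "missing_in_target": sorted(source - target),     # dropped rows
        "unexpected_in_target": sorted(target - source),  # phantom rows
        "matched": len(source & target),
    }

report = reconcile(
    source_keys=["a", "b", "c", "d"],
    target_keys=["a", "b", "e"],
)
assert report["missing_in_target"] == ["c", "d"]
assert report["unexpected_in_target"] == ["e"]
assert report["matched"] == 2
```

At scale, the same two discrepancy sets fall out of `source_df.join(target_df, on="key", how="left_anti")` and the join in the opposite direction, which keeps the comparison distributed instead of materializing full key sets on one machine.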

Posted 1 month ago

Apply

3.0 - 8.0 years

6 - 10 Lacs

Noida, Hyderabad, Bengaluru

Work from Office

Skills: SoC Physical Design. Experience: 3 to 15 years. Job Location: Bangalore, Hyderabad, Noida, and Coimbatore. Job Description: SoC-level floorplanning, partitioning, timing budget generation, power planning, SoC PnR, CTS, and block integration. Handling timing closure of high-frequency blocks. Expertise in signoff closure: timing with SI and OCV, power, IR, and physical verification at both block and chip level. Understanding constraints and fixing techniques. Experience in physical verification. Understanding of SI prevention, fixing methodology, and implementation. Proficient in the Synopsys ICC, Cadence, or Mentor Olympus and ATopTech tool sets. Experience in design automation and UNIX systems. Experience in Tcl/Perl is a plus. Primary Skills: Able to handle SoC PnR activities, SoC timing closure, and SoC physical verification. Secondary Skills: Able to handle SoC synthesis, SoC IR drop, SoC LEC, and SoC CLP.

Posted 1 month ago

Apply

4.0 - 6.0 years

4 - 8 Lacs

Pune

Work from Office

Immediate job opening for #SFI Vlocity_C2H_Pan India. Skill: MS SQL DB, Unix. Exp: 4 to 7 Years. Location: Pune. Job description: Must Have: Expert in MS SQL database (SQL queries, tables, indexes, stored procedures, partitioning, replication, failover, etc.), Unix shell scripting, Windows Server. Good to Have: Monitoring tools ELK/AppDynamics, SSM. JD:
- Expert in alerting and monitoring tools like ELK, AppDynamics, SSM, etc.
- Hands-on automation using PowerShell scripts, UNIX scripts, etc.
- Expert in MS SQL database (SQL queries, tables, indexes, stored procedures, partitioning, replication, failover, etc.)
- Good understanding of network concepts like load balancers, VIPs, pool members, etc.
- Good understanding of certificates and how they should be used to protect customer data and meet the bank's regulatory requirements.
- Quick self-learner; should be flexible and adaptable to learning new technologies.
- Should have an engineering mindset and a good understanding of the retail loan business that runs on the Windows/Java/MSSQL platform.

Posted 1 month ago

Apply

5.0 - 7.0 years

5 - 7 Lacs

Mumbai, Maharashtra, India

On-site

Key Responsibilities: Administer and maintain production Postgres databases hosted on AWS RDS. Set up and manage database replication, partitioning, and synchronization between primary and secondary databases. Write and optimize complex SQL queries for performance and scalability. Automate database maintenance tasks and monitoring. Ensure high availability, data integrity, and performance across all environments. Collaborate with cross-functional teams while contributing as an individual performer. Candidate Requirements: Hands-on experience with Postgres and AWS RDS in production environments. Proficient in SQL query writing, debugging, and optimization. Practical knowledge of replication, partitioning, and automation tools. Strong ownership mindset with the ability to handle ambiguity and drive clarity. Agile, self-driven, and committed to continuous learning. Immediate joiners preferred.
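Partitioning, as mentioned above, is ultimately a routing rule from a row's timestamp to a partition. In production Postgres this is handled declaratively with PARTITION BY RANGE, but the naming and routing logic looks like the small illustrative sketch below (plain Python; the table and column names are invented):

```python
from collections import defaultdict
from datetime import date

def partition_name(table: str, day: date) -> str:
    """Monthly partition a row belongs to, e.g. 'events_2024_03'."""
    return f"{table}_{day.year:04d}_{day.month:02d}"

# Route rows to partitions, which PARTITION BY RANGE does natively
# once the monthly child tables have been created.
rows = [
    {"id": 1, "created": date(2024, 3, 15)},
    {"id": 2, "created": date(2024, 3, 20)},
    {"id": 3, "created": date(2024, 4, 2)},
]
buckets = defaultdict(list)
for row in rows:
    buckets[partition_name("events", row["created"])].append(row)

assert sorted(buckets) == ["events_2024_03", "events_2024_04"]
assert len(buckets["events_2024_03"]) == 2
```

The payoff in Postgres is partition pruning: a query filtered on the partition key only touches the child tables whose ranges match, and old months can be detached and archived without a long-running DELETE.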

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Bengaluru

Work from Office

We are seeking a skilled Informatica BDM Engineer with a strong background in Big Data Management to join our team. The ideal candidate will be responsible for designing, developing, and implementing Informatica solutions to manage and analyze large volumes of data effectively. You will play a crucial role in ensuring data integrity, security, and compliance within our organization.

Overall Responsibilities:
- Design and Development: Design, develop, and implement solutions using Informatica Big Data Management. Ensure quality and performance of technical and application architecture and design across the organization.
- Data Management: Work extensively with Oozie scheduling, HQL, Hive, HDFS, and data partitioning to manage large datasets. Collaborate with teams to ensure effective data integration and transformation processes using SQL and NoSQL databases.
- Security Implementation: Design and implement security systems, identifying gaps in existing architectures and recommending enhancements. Adhere to established policies and best practices regarding data security and compliance.
- Monitoring and Troubleshooting: Actively monitor distributed services and troubleshoot issues in production environments. Implement resiliency and monitoring solutions to ensure continuous service availability.
- Agile and DevOps Practices: Participate in Agile methodology, ensuring timely delivery of projects while adhering to CI/CD principles using tools like GitHub and Jenkins.
- Collaboration and Influence: Work collaboratively with multiple teams to share knowledge and improve productivity. Effectively research and benchmark technologies against best-in-class solutions.

Technical Skills:
- Informatica BDM: Minimum 5 years of development and design experience.
- Data Technologies: Extensive knowledge of Oozie, HQL, Hive, HDFS, and data partitioning.
- Databases: Proficient in SQL and NoSQL databases.
- Operating Systems: Strong Linux OS configuration skills, including shell scripting.
- Security and Compliance: Knowledge of designing security controls for data transfers and ETL processes. Understanding of compliance and regulatory requirements, including encryption and data integrity.
- Networking: Basic understanding of networking concepts including DNS, proxies, ACLs, and policy troubleshooting.
- DevOps & Agile: Familiarity with Agile methodologies, CI/CD practices, and tools (GitHub, Jenkins). Experience with distributed services resiliency and monitoring.

Experience: Minimum 5 years of experience in Informatica Big Data Management. Experience in the Banking, Financial, and Fintech sectors preferred. Proven ability to implement design patterns and security measures in large-scale infrastructures.

Qualifications: Bachelor's or Master's degree in Computer Science or a related field (or equivalent industry experience).

Soft Skills: Excellent interpersonal and communication skills to effectively present ideas to teams and stakeholders. Strong listening skills with the ability to speak clearly and confidently in front of management and peers. Positive attitude towards work, fostering a climate of trust and collaboration within the team. Enthusiastic and passionate about technology, creating a motivating environment for the team.

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT: Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative, "Same Difference," is committed to fostering an inclusive culture promoting equality, diversity, and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more. All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.

Posted 1 month ago

Apply

10.0 - 15.0 years

20 - 35 Lacs

Bengaluru

Hybrid

Greetings from Infogain! We have an immediate requirement for a DB Architect at Infogain India Pvt Ltd. We are looking for candidates with good experience in Oracle PL/SQL and PostgreSQL. Please find the details of the position below. Skills: Oracle PL/SQL and PostgreSQL. Experience: 10 to 15 years. Location: Bangalore. Notice period: less than 30 days. Mode of work: Hybrid (twice a week). Job Summary: The SQL architect is responsible for designing and implementing database structures that meet the needs of an organization. They have a deep understanding of database technologies and performance tuning, and can work with stakeholders to translate business requirements into technical solutions. Can share CV @ arti.sharma@infogain.com, mentioning: total experience; relevant experience in Oracle PL/SQL; relevant experience in PostgreSQL; relevant experience in stored procedures, cursors, functions, CTEs, and optimization; relevant experience in partitioning; relevant experience in solution designing; current location; preferred job location.

Posted 1 month ago

Apply

9.0 - 13.0 years

32 - 40 Lacs

Ahmedabad

Remote

About the Role: We are looking for a hands-on AWS Data Architect or Lead Engineer to design and implement scalable, secure, and high-performing data solutions. This is an individual contributor role where you will work closely with data engineers, analysts, and stakeholders to build modern, cloud-native data architectures across real-time and batch pipelines. Experience: 7-15 Years. Location: Fully Remote. Company: Armakuni India. Key Responsibilities: Data Architecture Design: Develop and maintain a comprehensive data architecture strategy that aligns with the business objectives and technology landscape. Data Modeling: Create and manage logical, physical, and conceptual data models to support various business applications and analytics. Database Design: Design and implement database solutions, including data warehouses, data lakes, and operational databases. Data Integration: Oversee the integration of data from disparate sources into unified, accessible systems using ETL/ELT processes. Data Governance: Implement and enforce data governance policies and procedures to ensure data quality, consistency, and security. Technology Evaluation: Evaluate and recommend data management tools, technologies, and best practices to improve data infrastructure and processes. Collaboration: Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand data requirements and deliver effective solutions. Documentation: Create and maintain documentation related to data architecture, data flows, data dictionaries, and system interfaces. Performance Tuning: Optimize database performance through tuning, indexing, and query optimization.
Security: Ensure data security and privacy by implementing best practices for data encryption, access controls, and compliance with relevant regulations (e.g., GDPR, CCPA). Required Skills: Helping project teams with solutions architecture, troubleshooting, and technical implementation assistance. Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, Oracle, SQL Server). Minimum 7 to 15 years of experience in data architecture or related roles. Experience with big data technologies (e.g., Hadoop, Spark, Kafka, Airflow). Expertise with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services. Knowledge of data integration tools (e.g., Informatica, Talend, Fivetran, Meltano). Understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, Synapse, BigQuery). Experience with data governance frameworks and tools.
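The ETL/ELT responsibilities above follow a standard extract-transform-load shape. Below is a self-contained sketch using Python's bundled sqlite3 as a stand-in warehouse; the table name, source rows, and the cents-normalisation rule are all invented for illustration:

```python
import sqlite3

# Extract: hypothetical source rows (in practice, an API or upstream DB).
source_rows = [("alice", "2024-01-05", 120.0), ("bob", "2024-01-06", 80.5)]

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE fact_sales (customer TEXT, sale_date TEXT, amount_cents INTEGER)"
)

# Transform: normalise amounts to integer cents to avoid float drift.
transformed = [(c, d, round(amount * 100)) for c, d, amount in source_rows]

# Load: bulk insert in a single transaction (the `with` block commits).
with conn:
    conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)", transformed)

total = conn.execute("SELECT SUM(amount_cents) FROM fact_sales").fetchone()[0]
assert total == 20050  # 12000 + 8050 cents
```

The same three stages map directly onto managed tooling (Glue or Airflow orchestrating the extract/transform, Snowflake or Redshift as the load target); only the scale and the connectors change, not the shape.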

Posted 1 month ago

Apply

4.0 - 6.0 years

0 Lacs

Hyderabad / Secunderabad, Telangana, India

Remote

Req ID: 328599. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a BODS Developer to join our team in Hyderabad, Telangana (IN-TG), India (IN).

Responsibilities:
- Understand and execute data migration blueprints (migration concepts, transformation rules, mappings, selection criteria)
- Understand and contribute to the documentation of data mapping specifications, conversion rules, and technical design specifications as required
- Build the conversion processes and associated programs that will migrate the data per the design and conversion rules signed off by the client
- Execute all data migration technical steps (extract, transform, and load) as well as defect management and issue resolution
- Perform data load activities for each mock load, cutover simulation, and production deployment identified in the L1 plan, into the environments identified
- Provide technical support, defect management, and issue resolution during all testing cycles, including mock data load cycles
- Complete all data migration documentation necessary to support system validation and compliance requirements
- Support the development of unit and end-to-end data migration test plans and test scripts (including testing for data extraction, transformation, data loading, and data validation)

Job Requirements:
- Work in a shift from 3 PM to 12 AM IST, with a work-from-home option
- 4-6 years of overall technical experience in SAP BODS across all the SAP BODS application modules (Extract, Transform, Load)
- 2-4 years of data migration experience with S/4HANA/ECC implementations
- Experience with BODS Designer components: Projects, Jobs, Workflows, Data Flows, Scripts, Data Stores, and Formats
- Experience in BODS performance tuning techniques using parallel processing (Degree of Parallelism), multithreading, partitioning, and database throughputs to improve job performance
- Experience in ETL using SAP BODS and SAP IS with respect to SAP master/transaction data objects in SAP FICO, SAP SD, SAP MM/WM, SAP Plant Maintenance, SAP Quality Management, etc. is desirable
- Experience with data migration using LSMW, IDocs, and LTMC; ability to debug LTMC errors is highly desirable

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at NTT DATA endeavors to make accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer.
Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click . If you'd like more information on your EEO rights under the law, please click . For Pay Transparency information, please click.

Posted 1 month ago

Apply

8.0 - 13.0 years

25 - 35 Lacs

Gurugram

Work from Office

Title: Senior Database Architect - Microsoft SQL Server Location: Gurgaon, India Type: Hybrid (work from office) Job Description Who We Are: Fareportal is a travel technology company powering a next-generation travel concierge service. Utilizing its innovative technology and company owned and operated global contact centers, Fareportal has built strong industry partnerships providing customers access to over 600 airlines, a million lodgings, and hundreds of car rental companies around the globe. With a portfolio of consumer travel brands including CheapOair and OneTravel, Fareportal enables consumers to book-online, on mobile apps for iOS and Android, by phone, or live chat. Fareportal provides its airline partners with access to a broad customer base that books high-yielding international travel and add-on ancillaries. Fareportal is one of the leading sellers of airline tickets in the United States. We are a progressive company that leverages technology and expertise to deliver optimal solutions for our suppliers, customers, and partners. FAREPORTAL HIGHLIGHTS: Fareportal is the number 1 privately held online travel company in flight volume. Fareportal partners with over 600 airlines, 1 million lodgings, and hundreds of car rental companies worldwide. 2019 annual sales exceeded $5 billion. Fareportal sees over 150 million unique visitors annually to our desktop and mobile sites. Fareportal, with its global workforce of over 2,600 employees, is strategically positioned with 9 offices in 6 countries and headquartered in New York City. Role Overview : We are looking for a highly skilled and enthusiastic Sr. Database Architect/Database Administrator. The responsibility of this individual will be focused on database design, data retention processes, data compliance, security, and automating related processes. We expect you to be proficient with automation and have expert-level knowledge of at least one database technology in our portfolio. 
This position requires a committed and self-motivated individual who can create and maintain data and schema designs for critical business processes, analytical projects and reporting infrastructure. You will help maintain our SQL reporting pipelines and procedures, their monitoring, as well as building out our data mart infrastructure. The right applicant will possess strong skills in latest technology and features used like SQL Server 2016+, Replication, Clustering, and Always-On. We are looking for individuals who are eager to learn and work in fast paced environment. Responsibilities: Manage and lead a team of DBAs in daily database operations. Design and develop scalable, efficient database architectures that align with business goals. Ensure the availability, reliability, and performance of SQL Server databases. Monitor and optimize database performance, proactively identifying and resolving issues. Implement backup and recovery procedures to ensure minimize data loss, with a strong focus on disaster recovery strategies. Develop and implement proactive monitoring and alerting systems to identify and address potential issues before they become critical. Configure and maintain database replication and high availability solutions, automating processes where possible. Manage database security, including user access, roles, and permissions. Develop and maintain comprehensive documentation for database systems, including standard operating procedures, policies, and standards. Collaborate with database engineers to ensure consistent and effective database design and implementation. Design and implement robust data models and schemas that support business requirements and facilitate data integrity. Implement automation for database deployment, maintenance, and monitoring tasks to enhance operational efficiency. Stay up-to-date with the latest SQL Server technologies and industry trends. 
Required Skills & Qualifications:
Bachelor's degree in Computer Science or a related field.
8-10 years of experience in Microsoft SQL Server DBA operations.
Experience with SQL Server 2016, 2017, 2019, and 2022 in Windows and Linux environments.
Proficiency in database architecture, including data modeling and schema design, indexing strategies, performance tuning, and partitioning.
Strong automation skills, including scripting for database deployment and maintenance, automating backup and recovery processes, and implementing CI/CD pipelines for database changes.
Strong understanding of SQL Server internals, including execution plans, Always On, replication, backup and recovery, and performance optimization.
Experience in team leadership and mentoring.
Excellent communication and collaboration skills.
Strong analytical and problem-solving skills.
Ability to work independently and in a team environment.

Disclaimer: This job description is not designed to cover or contain a comprehensive listing of the activities, duties, or responsibilities required of the employee. Fareportal reserves the right to change the job duties, responsibilities, expectations, or requirements posted here at any time at the Company's sole discretion, with or without notice.
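The partitioning skill this role asks for comes down to mapping rows onto partitions by a key. As an illustrative sketch only (the boundary dates and function name are made up, not from the posting), here is how SQL Server's RANGE RIGHT partition-function semantics can be modelled, where each boundary value belongs to the partition on its right:

```python
from bisect import bisect_right
from datetime import date

# Boundary points, as in a SQL Server RANGE RIGHT partition function:
# n boundaries define n + 1 partitions.
BOUNDARIES = [date(2024, 1, 1), date(2024, 4, 1), date(2024, 7, 1), date(2024, 10, 1)]

def partition_number(order_date: date) -> int:
    """Return the 1-based partition a row maps to."""
    return bisect_right(BOUNDARIES, order_date) + 1

# Rows before the first boundary land in partition 1; a boundary value
# itself starts the next partition (RANGE RIGHT semantics).
```

In a real system the engine does this routing transparently; sketching it makes it easy to sanity-check which partition a given date will hit before defining the partition scheme.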

Posted 1 month ago

Apply

8.0 - 13.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Your Role and Responsibilities: Understand and integrate new technologies such as Software-Defined Networking and Network Functions Virtualization into cloud platforms, with skills in the areas of virtual switches, network overlay technologies, physical NIC drivers, selective leveraging of offloads via software interfaces, and Linux-based driver ecosystem development and integration. Work with kernel networking subsystems such as iptables and sockets. Core network troubleshooting skills are required: respond with urgency to incidents, perform root cause analysis, look out for patterns, and build a knowledge base. You will be developing solutions for SDN infrastructure on the Z server platform. You will be expected to work with global teams and independently own your areas of responsibility. You are expected to participate in scrums, sprint planning, and retrospectives, and to be an active member of the team, providing feedback to improve as needed. You will be expected to work collaboratively with the team, learn new technologies, and apply the skills learned.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
8+ years of working experience with Linux distributions (Ubuntu/RHEL) in a production environment.
Programming languages: GoLang, shell scripting; REST API and backend application development.
Strong knowledge of containers (e.g., Docker/Podman) and related technologies such as container registries, Dockerfiles, creating container images, and deploying containers.
Strong working knowledge of container orchestration (Kubernetes).
Strong knowledge of Linux basics, including packaging, package managers, working with system services, distro-specific development, building source code into distro-specific packages, and package installation.
Knowledge of working with Linux distros such as Ubuntu.
Good knowledge of virtualization (hypervisors, virtual machines, bare metal, partitions, etc.) and infrastructure/system management.
Good to have: networking basics; storage basics (including a basic understanding of disks, volumes, SAN, fabric, storage subsystems, etc.); Continuous Integration and Deployment (CI/CD), i.e., a basic understanding of how code is built and deployed in a continuous manner.
Strong English communication skills, both written and verbal.

Preferred technical and professional experience:
General understanding of private/public/hybrid cloud concepts.
General understanding of hardware servers and server components.
General understanding of open source projects; experience with open source community contribution is an added advantage.
Security basics (a basic understanding of identity management/authentication, authorization, firewalls, auditing, secure communication, managing certificates, password management, etc.).
Logging/monitoring (basic understanding and application).

Posted 1 month ago

Apply

1.0 - 3.0 years

3 - 6 Lacs

Chennai

Work from Office

Skill Set Required: GCP, Data Modelling (OLTP, OLAP), indexing, DBSchema, CloudSQL, BigQuery. Data Modeller - hands-on data modelling for OLTP and OLAP systems. In-depth knowledge of conceptual, logical, and physical data modelling. Strong understanding of indexing, partitioning, and data sharding, with practical experience in each. Strong understanding of the variables impacting database performance for near-real-time reporting and application interaction. Should have working experience with at least one data modelling tool, preferably DBSchema. Functional knowledge of the mutual fund industry is a plus. Good understanding of GCP databases such as AlloyDB, CloudSQL, and BigQuery.
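Data sharding, one of the skills listed above, is commonly implemented by hashing a shard key so that all rows for one entity land on the same node. A minimal sketch (the shard count and key names are hypothetical, not from the posting):

```python
import hashlib

NUM_SHARDS = 8  # illustrative shard count

def shard_for(customer_id: str) -> int:
    """Route a row to a shard by hashing its shard key.

    md5 is used only as a stable, platform-independent hash; Python's
    built-in hash() is salted per process and would break routing.
    """
    digest = hashlib.md5(customer_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % NUM_SHARDS
```

All rows for one customer land on the same shard, so single-customer queries touch one node, while cross-customer reports must fan out to every shard. Note the trade-off: simple modulo sharding reshuffles most keys when NUM_SHARDS changes, which is why consistent hashing is often preferred at scale.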

Posted 1 month ago

Apply

6.0 - 10.0 years

3 - 6 Lacs

Chennai

Work from Office

Job Information: Job Opening ID ZR_2412_JOB; Date Opened 04/02/2025; Industry IT Services; Work Experience 6-10 years; Job Title Data Modeller; City Chennai; Province Tamil Nadu; Country India; Postal Code 600001; Number of Positions 1. Skill Set Required: GCP, Data Modelling (OLTP, OLAP), indexing, DBSchema, CloudSQL, BigQuery. Data Modeller - hands-on data modelling for OLTP and OLAP systems. In-depth knowledge of conceptual, logical, and physical data modelling. Strong understanding of indexing, partitioning, and data sharding, with practical experience in each. Strong understanding of the variables impacting database performance for near-real-time reporting and application interaction. Should have working experience with at least one data modelling tool, preferably DBSchema. Functional knowledge of the mutual fund industry is a plus. Good understanding of GCP databases such as AlloyDB, CloudSQL, and BigQuery.

Posted 1 month ago

Apply

10.0 - 12.0 years

9 - 12 Lacs

Bengaluru

Work from Office

Job Information: Job Opening ID ZR_2105_JOB; Date Opened 08/02/2024; Industry Technology; Work Experience 10-12 years; Job Title SW Architecture; City Bangalore North; Province Karnataka; Country India; Postal Code 560002; Number of Positions 4. Educational Qualifications: B.E/B.Tech/M.E/M.Tech in Electronics / Communication / Electrical / Power Electronics. Experience: 9-11 years of relevant work experience in automotive BSW software development. Major Skills and Experience: Strong hands-on experience in defining software architecture considering all layers. Demonstrated expertise in real-time embedded SW development. Strong hands-on experience in low-level driver SW development for 16- and 32-bit platforms. Excellent understanding of AUTOSAR configuration tools and methodology. Expertise in deriving SW architecture with safety partitioning and ASIL allocation to SW components. Strong development experience in serial communication protocols (CAN/CANFD). Expertise in cross compilers and debuggers such as Lauterbach. Expertise in configuration and change management tools such as Plastic SCM and JIRA; design tools such as EA/Visio; unit testing tools such as CANTATA/VectorCAST/RTRT; and serial comm testing tools such as CANoe/CANalyzer/NeoVI Fire.

Posted 1 month ago

Apply

1.0 - 6.0 years

4 - 6 Lacs

Hyderabad

Work from Office

SUMMARY Mason Job Description: We are seeking a skilled Mason to join our team in constructing, repairing, and maintaining various structures using masonry materials. The ideal candidate should possess precision, strength, and a deep understanding of building techniques and materials. Key Responsibilities: Interpret blueprints, drawings, and specifications. Lay bricks, concrete blocks, and other building blocks in mortar. Shape bricks and stones to fit specific spaces. Mix mortar or grout and apply it to surfaces. Construct and repair walls, partitions, arches, fireplaces, chimneys, and other structures. Utilize hand and power tools to cut and shape materials. Ensure structures are level, plumb, and square. Clean surfaces and remove excess mortar. Collaborate with other construction professionals to complete projects. Adhere to safety standards and regulations at all times. Requirements: Education Background: Open to all categories. Experience: Minimum 0.6 months of experience as a Mason. Age: No restrictions. Visa Type: Work Visa. Language: Basic English proficiency. Benefits: Relocation Support: Free visa. Free furnished shared accommodation will be provided. Daily travel to work will be covered. International Work Experience: Boost your resume with Dubai industry expertise. Limited openings! Apply now to meet an employer for an interview and migrate to Dubai!

Posted 2 months ago

Apply

10.0 - 20.0 years

15 - 25 Lacs

Hyderabad

Work from Office

Maddisoft has the following immediate opportunity; let us know if you or someone you know would be interested. Send in your resume ASAP, along with your LinkedIn profile, without which applications will not be considered. Job Title: Senior Oracle PL/SQL Developer. Location: Hyderabad, India. Job Description: Design, develop, test, maintain, and support batch applications using Oracle PL/SQL for Retail Commissions and Amortization. Essential Duties/Responsibilities: Develop Oracle PL/SQL applications. Code packages, stored procedures, functions, objects, tables, views, and synonyms based on requirements. Understand the functionality of existing applications and either incorporate new functionality to automate the calculations process or build new applications for it. Work with the IT Technical Lead to understand requirements and technical design, understand the source tables and relationships, adhere to consistent coding/SVN standards, do code reviews, and provide daily status updates. Work with the Commissions analyst to understand the existing calculations process for each workstream and build, test, and UAT according to their expectations. Use Oracle analytical functions, MERGE statements, the WITH clause, Oracle partitioning, job logging, exception handling, performance tuning, and so on in the Oracle PL/SQL code. Optimize code by following performance-tuning techniques, considering the limitations of the current environment. Unit test, system test, integration test, and parallel test the Oracle PL/SQL code. Use analytical skills to troubleshoot and resolve problems quickly. Use SVN to maintain all database object scripts/source code. Follow versioning standards established by the team. Have code reviewed with the Technical Lead prior to moving it to the QA environment. Create ControlM/Redwood job documents, create change requests, and work with the ControlM/Redwood team to get them created and scheduled.
Use the ServiceNow ticketing system for opening, handling, and closing tickets and change requests. Follow SOX procedures for code deployment. Perform production support: monitor batch processes daily/weekly/monthly and resolve job failures as quickly as possible. Work with the ControlM/Redwood group on re-run/re-start instructions. Be willing to perform this off hours during weekdays and weekends. Fulfill ad-hoc requests for data pulls, updates, or minor enhancements. Work effectively in a team environment. Develop strong working relationships with team members, the Manager, Commissions team analysts, the DBA team, the Project Manager, and other IT groups. Independently work on tasks without step-by-step supervision. Communicate progress or issues clearly, either verbally or through email. Escalate issues or delays that would impact a deadline as early in the process as possible. Education: Bachelor's degree in computer science, software engineering, or a relevant business discipline from an accredited four-year college or university, or equivalent work experience. Experience: 15+ years of experience in software development. 12+ years of strong development experience using Oracle PL/SQL, including features like analytical functions, MERGE statements, the WITH clause, table partitioning, job logging, exception handling, and performance tuning. 12+ years of experience in Oracle 12c or 19c. 5+ years of experience in a Data Warehouse environment. Strong analytical skills to analyze and resolve data discrepancies. Troubleshooting skills. Strong team player.
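The Oracle analytical functions this role leans on (e.g., SUM(amount) OVER (PARTITION BY agent ORDER BY sale_date)) compute per-group running values. A rough Python equivalent, with made-up column names purely for illustration, of a per-agent running commission total:

```python
from itertools import groupby
from operator import itemgetter

def running_totals(rows):
    """Per-agent running total, equivalent in spirit to
    SUM(amount) OVER (PARTITION BY agent ORDER BY sale_date)."""
    out = []
    rows = sorted(rows, key=itemgetter("agent", "sale_date"))
    for agent, group in groupby(rows, key=itemgetter("agent")):
        total = 0.0
        for r in group:  # within one partition, accumulate in date order
            total += r["amount"]
            out.append({**r, "running_total": total})
    return out

sales = [
    {"agent": "A", "sale_date": "2024-01-05", "amount": 100.0},
    {"agent": "A", "sale_date": "2024-01-20", "amount": 50.0},
    {"agent": "B", "sale_date": "2024-01-10", "amount": 75.0},
]
# Each agent's rows accumulate independently: A -> 100.0, 150.0; B restarts at 75.0.
```

Pushing this computation into the database with the window function is normally much faster than pulling rows out, but the sketch shows what the PARTITION BY / ORDER BY pair actually does.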

Posted 2 months ago

Apply

6.0 - 11.0 years

0 - 1 Lacs

Chennai, Bengaluru, Mumbai (All Areas)

Hybrid

Job Title: PLSQL Developer (Performance Tuning / Partitioning) Experience: 6 to 12 years Location: Mumbai, Chennai, Pune, Bangalore, Noida, Coimbatore Work Mode: Hybrid Job Description: We are hiring a PLSQL Developer with strong experience in performance tuning and partitioning. The role involves writing and optimizing PL/SQL code, handling large data sets, and improving database performance. Key Skills: Strong PL/SQL development skills; experience with performance tuning and SQL optimization; knowledge of table partitioning; understanding of the Oracle database; good problem-solving skills. Email your resume to dipesh.patil@prodcon.com Cheers & warm regards, Dipesh Patil

Posted 2 months ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

As a Database Administrator (DBA), you will be responsible for managing production servers, ensuring performance, automation, partitioning, replication, and sync setups across primary/secondary databases. You will work with technologies like Postgres and AWS RDS, bring strong SQL query-writing skills, and contribute as an individual performer while being a collaborative team player. The role requires immediate joining, with a focus on dynamic execution, ambiguity management, and continuous learning. Location: Chennai, Hyderabad, Kolkata, Pune, Ahmedabad, Delhi, Mumbai, Bengaluru, Pan India
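A primary/secondary Postgres setup like the one described usually pairs with a routing layer that sends writes to the primary and spreads reads across replicas. A toy sketch (node names are placeholder strings, not real connections; a production setup would hold driver connections or sit behind a pooler):

```python
class ConnectionRouter:
    """Toy read/write splitter for a primary plus read replicas."""

    def __init__(self, primary, replicas):
        self.primary = primary
        self.replicas = list(replicas)
        self._next = 0  # round-robin cursor over replicas

    def route(self, sql: str):
        # Read-only statements go round-robin to replicas.
        if sql.lstrip().upper().startswith(("SELECT", "WITH")):
            node = self.replicas[self._next % len(self.replicas)]
            self._next += 1
            return node
        # INSERT/UPDATE/DELETE/DDL must hit the primary.
        return self.primary

router = ConnectionRouter("primary", ["replica-1", "replica-2"])
```

One caveat this sketch glosses over: with asynchronous replication, a read routed to a replica immediately after a write may see stale data, so read-your-writes traffic is often pinned to the primary.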

Posted 2 months ago

Apply

4.0 - 6.0 years

7 - 8 Lacs

Gurugram

Work from Office

Job Title: Ab Initio Developer Location: Gurugram Experience: 4-5 years Employment Type: Full Time Job Summary: We are seeking an experienced Ab Initio Developer to design, develop, and maintain high-volume, enterprise-grade ETL solutions for our data warehouse environment. The ideal candidate will have strong technical expertise in Ab Initio components, SQL, UNIX scripting, and the ability to work collaboratively with both business and technical teams to deliver robust data integration solutions. Key Responsibilities: Analyze, design, implement, and maintain large-scale, multi-terabyte data warehouse ETL applications that operate 24/7 with high performance and reliability. Develop logical and physical data models to support data warehousing and business intelligence initiatives. Lead and participate in complex ETL development projects using Ab Initio, ensuring quality and efficiency. Translate business requirements into system and data flows, mappings, and transformation logic. Create detailed design documentation, including high-level (HLD) and low-level design (LLD) specifications. Conduct design reviews, capture feedback, and facilitate additional sessions as required. Develop, test, and deploy ETL workflows using Ab Initio components such as Rollup, Scan, Join, Partition, Gather, Merge, Interleave, Lookup, etc. Perform SQL database programming and optimize SQL queries for performance. Develop and maintain UNIX shell scripts to automate ETL workflows and system processes. Collaborate with Release Management, Configuration Management, Quality Assurance, Architecture, Database Support, and other development teams. Ensure adherence to source control standards using EME or similar tools. Provide ongoing support and maintenance of ETL processes and troubleshoot issues as needed. Required Skills & Qualifications: Hands-on development experience with Ab Initio components (Rollup, Scan, Join, Partition by key, Round Robin, Gather, Merge, Interleave, Lookup, etc.) 
Strong background in designing and delivering complex, large-volume data warehouse applications Experience with source-code control tools such as EME Proficient in SQL database programming, including query optimization and performance tuning Good working knowledge of UNIX scripting and Oracle SQL/PL-SQL Strong technical expertise in preparing detailed design documents (HLD, LLD) and unit testing Ability to understand and communicate effectively with both business and technical stakeholders Strong problem-solving skills and attention to detail Ability to work independently as well as part of a team
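Ab Initio's Partition by Key and Rollup components, listed in the requirements above, correspond to a partition-then-aggregate pattern. A loose Python analogy (field names invented for illustration, not Ab Initio APIs):

```python
from collections import defaultdict

def partition_by_key(records, key):
    """Mimic 'Partition by Key': fan records out so that all records
    sharing a key value land in the same flow."""
    flows = defaultdict(list)
    for rec in records:
        flows[rec[key]].append(rec)
    return flows

def rollup(flows, value_field):
    """Mimic 'Rollup': emit one aggregate record per key."""
    return {k: sum(r[value_field] for r in recs) for k, recs in flows.items()}

orders = [
    {"region": "N", "amount": 10},
    {"region": "S", "amount": 20},
    {"region": "N", "amount": 5},
]
totals = rollup(partition_by_key(orders, "region"), "amount")
```

Partitioning by the rollup key first is what lets the real tool run each aggregate on a separate flow in parallel without cross-flow coordination.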

Posted 2 months ago

Apply

4.0 - 6.0 years

7 - 8 Lacs

Gurugram

Work from Office

Job Title: Ab Initio Developer Location: Gurugram Experience: 4-5 years Employment Type: Full Time Job Summary: We are seeking an experienced Ab Initio Developer to design, develop, and maintain high-volume, enterprise-grade ETL solutions for our data warehouse environment. The ideal candidate will have strong technical expertise in Ab Initio components, SQL, UNIX scripting, and the ability to work collaboratively with both business and technical teams to deliver robust data integration solutions. Key Responsibilities: Analyze, design, implement, and maintain large-scale, multi-terabyte data warehouse ETL applications that operate 24/7 with high performance and reliability. Develop logical and physical data models to support data warehousing and business intelligence initiatives. Lead and participate in complex ETL development projects using Ab Initio, ensuring quality and efficiency. Translate business requirements into system and data flows, mappings, and transformation logic. Create detailed design documentation, including high-level (HLD) and low-level design (LLD) specifications. Conduct design reviews, capture feedback, and facilitate additional sessions as required. Develop, test, and deploy ETL workflows using Ab Initio components such as Rollup, Scan, Join, Partition, Gather, Merge, Interleave, Lookup, etc. Perform SQL database programming and optimize SQL queries for performance. Develop and maintain UNIX shell scripts to automate ETL workflows and system processes. Collaborate with Release Management, Configuration Management, Quality Assurance, Architecture, Database Support, and other development teams. Ensure adherence to source control standards using EME or similar tools. Provide ongoing support and maintenance of ETL processes and troubleshoot issues as needed. Skills and Qualifications 4-5 years of experience in Ab Initio development. Ab Initio: Proficient in using Ab Initio tools, such as GDE and Enterprise Metadata Environment (EME). 
ETL Concepts: Understanding of ETL processes, data transformations, and data warehousing. SQL: Knowledge of SQL for data retrieval and manipulation. Unix/Linux Shell Scripting: Familiarity with Unix/Linux shell scripting for automation and scripting tasks. Problem-Solving: Ability to identify and solve technical issues related to Ab Initio application

Posted 2 months ago

Apply

3.0 - 6.0 years

14 - 18 Lacs

Bengaluru

Work from Office

Understand and integrate new technologies such as Software-Defined Networking and Network Functions Virtualization into cloud platforms, with skills in the areas of virtual switches, network overlay technologies, physical NIC drivers, selective leveraging of offloads via software interfaces, and Linux-based driver ecosystem development and integration. Work with kernel networking subsystems such as iptables and sockets. Core network troubleshooting skills are required: respond with urgency to incidents, perform root cause analysis, look out for patterns, and build a knowledge base. You will be developing solutions for SDN infrastructure on the Z server platform. You will be expected to work with global teams and independently own your areas of responsibility. You are expected to participate in scrums, sprint planning, and retrospectives, and to be an active member of the team, providing feedback to improve as needed. You will be expected to work collaboratively with the team, learn new technologies, and apply the skills learned.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
Programming languages: GoLang; REST API with backend application development.
Advanced Networking Knowledge: Proficiency in Software-Defined Networking (SDN), network design, protocols, and infrastructure, including LAN/WAN and cloud networking. Familiarity with IPv4/IPv6, QoS, MPLS, and BGP.
Strategic Infrastructure Planning: Ability to design scalable and resilient network architectures, plan for capacity, and ensure disaster recovery and business continuity.
Security Expertise: Basic knowledge of cybersecurity principles to protect network integrity and data privacy.
Operating Systems: Understanding of how different operating systems affect network design.
Problem-Solving: Strong analytical and problem-solving skills to troubleshoot complex network issues in production and non-production environments.
Critical Thinking: Ability to think strategically and anticipate future network needs.
Communication: Effective communication skills to collaborate with teams and explain technical concepts to all stakeholders.
Strong knowledge of containers (e.g., Docker/Podman) and related technologies such as container registries, Dockerfiles, creating container images, and deploying containers.
Strong working knowledge of container orchestration (Kubernetes).
Strong knowledge of Linux basics, including packaging, package managers, working with system services, distro-specific development, building source code into distro-specific packages, and package installation.
Knowledge of working with Linux distros such as Ubuntu.
Good knowledge of virtualization (hypervisors, virtual machines, bare metal, partitions, etc.) and infrastructure/system management.

Preferred technical and professional experience:
At least 12+ years of working experience with Linux distributions (Ubuntu/RHEL) in a production environment.
General understanding of private/public/hybrid cloud concepts.
General understanding of hardware servers and server components.

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
