112 Partitioning Jobs - Page 2

JobPe aggregates listings for easy access, but applications are submitted directly on each employer's job portal.

6.0 - 11.0 years

5 - 9 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: SAP HANA DB Administration
Good-to-have skills: SAP Basis Administration
Minimum 5 year(s) of experience is required.

Summary: As an Application/Cloud Support professional with 5-6 years of experience in SAP HANA DB Administration, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve providing primary support for SAP HANA DB Administration and secondary support for SAP BASIS Administration, ensuring the smooth functioning of the applications.

Roles & Responsibilities:
- Provide third-level support for SAP HANA DB Administration and Sybase Database to customers.
- Ensure the smooth functioning of the applications by monitoring and resolving issues.
- Collaborate with cross-functional teams to identify and resolve issues related to application performance.
- Design, build, and configure applications to meet business process and application requirements.
- Must be willing to work 24x7 rotational shifts; this is a shift-based role.
Professional & Technical Skills:
- Must-Have Skills: Experience in SAP HANA DB Administration.
- Installing, configuring and maintaining the HANA database.
- HANA DB certification preferred.
- Troubleshooting related to upgrades, HA/DR, and backup/restore.
- Multi-node / multi-tenant administration.
- Collecting runtime dumps and kernel profiler traces.
- Performance parameter tuning; security compliance and hardening for HANA DB.
- User management/administration of databases on cloud platforms such as AWS and Azure.
- Good-to-Have Skills: Table partitioning, MaxDB, Sybase DBA.
- Experience in designing, building, and configuring applications to meet business process and application requirements.
- Experience in monitoring and resolving issues related to application performance.
- Experience in collaborating with cross-functional teams to identify and resolve such issues.

Additional Information:
- The candidate should have a minimum of 6+ years of experience in SAP HANA DB Administration.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful solutions.
- This position is based at our Bengaluru office.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Pune

Work from Office

5+ years of overall experience, with a minimum of 2 years in Regulatory Reporting and exposure to Axiom. Implementation experience, preferably in the Banking domain. Ability to transform business requirements into technical needs and implementation procedures. Ability to work independently. Good communication skills. Experience in software design and development, working primarily in the Data Warehousing and Regulatory Reporting area of the Banking and Finance domain.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Good experience writing simple and complex SQL queries using all types of joins, analytical functions, the WITH clause, subqueries, and set operators. Well versed in Oracle SQL*Loader, partitioning, and performance tuning. Good hands-on experience with Oracle procedural language (PL/SQL): procedures, functions, packages, views, temporary tables, collections, etc. Knowledge of UNIX, shell scripting, and Autosys jobs is preferred.

Preferred technical and professional experience: Experience with the Axiom Controller View Regulatory Reporting tool (version 9x/10x) and with Axiom implementations for different regulations.
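As a rough illustration of the SQL constructs this role names (the WITH clause and analytical functions with PARTITION BY), here is a sketch using Python's built-in sqlite3 rather than Oracle; the table and data are invented for the example:

```python
import sqlite3

# Illustrative only: the posting concerns Oracle SQL, but the same analytical
# constructs (WITH clause, window functions) can be shown with sqlite3.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE trades (account TEXT, amount REAL);
INSERT INTO trades VALUES ('A', 100), ('A', 300), ('B', 50), ('B', 250);
""")

# WITH clause + RANK() OVER (PARTITION BY ...): largest trade per account.
rows = conn.execute("""
WITH ranked AS (
  SELECT account, amount,
         RANK() OVER (PARTITION BY account ORDER BY amount DESC) AS rnk
  FROM trades
)
SELECT account, amount FROM ranked WHERE rnk = 1 ORDER BY account
""").fetchall()
print(rows)  # [('A', 300.0), ('B', 250.0)]
```

The same query pattern carries over to Oracle; only the client library and some dialect details differ.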

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

The ideal candidate should possess strong knowledge of system functionality, architecture, partitioning, design, testing, and integration. Additionally, they should have expertise in hardware development processes, interfaces, EMC, and environmental specifications. The candidate must have experience with automotive communication and diagnostic protocols such as CAN, LIN, and UDS, and should be proficient with communication and diagnostic tools such as CANoe, CANalyzer, ODX Studio, Indigo, etc. Educational qualifications include a Bachelor of Engineering in Mechanical with 8 to 10 years of work experience in the relevant field. Moreover, the candidate should be familiar with software architecture (AUTOSAR), development processes (SPICE), and tools such as DOORS, ClearCase, MATLAB, etc. As part of Tata Motors' Leadership Competencies, the candidate should excel in Developing Self and Others, Leading Change, Driving Execution, Leading by Example, Motivating Self and Others, and Customer Centricity. Overall, the selected candidate will be responsible for leveraging their expertise to contribute to the success of the organization and meet the needs and expectations of stakeholders.

Posted 2 weeks ago

Apply

6.0 - 12.0 years

0 Lacs

Kochi, Kerala

On-site

You will be responsible for designing, implementing, optimizing, and maintaining PostgreSQL database systems. Working closely with developers and data teams, you will ensure high performance, scalability, and data integrity. Your key responsibilities will include developing complex SQL queries, stored procedures, and functions, optimizing query performance and database indexing, managing backups, replication, and security, monitoring and tuning database performance, as well as supporting schema design and data migrations. The ideal candidate should have 6 to 12 years of experience with strong hands-on experience in PostgreSQL. Proficiency in SQL and PL/pgSQL scripting is required. Experience in performance tuning, query optimization, and indexing is essential. Familiarity with logical replication, partitioning, and extensions is a plus. Exposure to tools like pgAdmin, psql, or PgBouncer will be beneficial for this role.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Engineer specializing in Treasure Data Technology, you will play a crucial role in designing, implementing, and managing data pipelines using our customer data platform, Treasure Data. Your main responsibilities will include developing and maintaining scalable data pipelines, ensuring data quality and integrity, optimizing system performance, and collaborating with various stakeholders to deliver solutions that meet business needs. You will be expected to design, develop, and maintain data pipelines to facilitate efficient data flow and processing. This will involve implementing ETL processes to integrate data from different sources and ensuring the accuracy, completeness, and reliability of the data through robust validation and cleansing processes. Monitoring data quality metrics and addressing any discrepancies promptly will be essential in your role. To enhance system performance and reduce latency, you will need to optimize data storage and retrieval processes by implementing techniques such as indexing and partitioning. Collaboration with data scientists, analysts, and other team members is crucial to understanding data requirements and delivering effective solutions. Effective communication of technical concepts to non-technical team members is also a key aspect of your role. Ensuring data security measures are in place to protect sensitive information and comply with relevant regulations and standards will be part of your responsibilities. Regular audits and assessments to identify and mitigate potential security risks are vital to maintaining data security. In terms of essential skills, you should have a strong understanding of enterprise data warehousing principles, experience with Treasure Data and Dig Dag files, proficiency in Presto SQL and optimization techniques, as well as experience with Python, relational databases, and cloud platforms like AWS. 
Desirable skills include experience with version control systems like Git and knowledge of orchestration tools such as Airflow. Qualifications for this role include a minimum of 3 years of work experience with a focus on Treasure Data and technical certifications demonstrating your commitment to continuous learning. As a successful Data Engineer, you should possess qualities such as inherent curiosity and empathy for customer success, a dedication to solving customer problems, strong collaboration skills, a passion for experimentation and learning, excellent problem-solving abilities, attention to detail, and effective communication skills.
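The validation and cleansing responsibilities this posting describes can be sketched in plain Python; no Treasure Data SDK is assumed, and the field names are illustrative:

```python
# Minimal sketch of an ETL validation/cleansing step: drop records that fail
# validation and normalise the surviving ones (field names are invented).
def clean_records(records):
    """Drop records missing required fields; normalise email case."""
    required = {"user_id", "email"}
    cleaned = []
    for rec in records:
        if not required.issubset(rec) or not rec["email"]:
            continue  # failed validation; a real pipeline might quarantine these
        rec = dict(rec, email=rec["email"].strip().lower())
        cleaned.append(rec)
    return cleaned

raw = [
    {"user_id": 1, "email": " Alice@Example.COM "},
    {"user_id": 2, "email": ""},          # invalid: empty email
    {"email": "no-id@example.com"},       # invalid: missing user_id
]
print(clean_records(raw))  # [{'user_id': 1, 'email': 'alice@example.com'}]
```

In a production pipeline the same logic would typically run inside a transformation job and emit data-quality metrics for the dropped records.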

Posted 3 weeks ago

Apply

13.0 - 23.0 years

0 Lacs

Karnataka

On-site

As a skilled Database Administrator with 2-3 years of hands-on experience, you will be responsible for managing and optimizing relational database systems, focusing on MariaDB and ClickHouse. Your primary tasks will include ensuring high availability, performance, and security of mission-critical databases, as well as supporting automation, deployment, and troubleshooting efforts. Your responsibilities will include installing, configuring, and maintaining ClickHouse in both development and production environments. You will also support and manage at least one other RDBMS such as PostgreSQL, MySQL, or SQL Server. Designing and implementing High Availability (HA) and Disaster Recovery (DR) solutions for ClickHouse, optimizing query performance, automating database administration tasks, setting up monitoring, and implementing backup and recovery processes will be crucial aspects of your role. Collaborating with engineering teams to build scalable data solutions, troubleshooting performance issues, documenting system configurations, and ensuring compliance with security and data governance policies are also part of your responsibilities. Additionally, you will participate in disaster recovery testing and planning. To excel in this role, you should have 1-3 years of experience managing ClickHouse and other RDBMS, proficiency in SQL scripting and performance tuning, experience with replication, sharding, or clustering, familiarity with monitoring tools, proficiency in scripting for automation, exposure to CI/CD pipelines and DevOps tools, an understanding of database security, networking, and compliance standards, and strong analytical thinking, communication, and problem-solving skills. If you are a proactive and experienced professional looking to contribute your expertise to a dynamic team, we encourage you to apply for this Database Administrator position.

Posted 3 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Chandigarh

On-site

As a Data Architect with over 6 years of experience, you will be responsible for designing and implementing modern data lakehouse architectures on cloud platforms such as AWS, Azure, or GCP. Your primary focus will be on defining data modeling, schema evolution, partitioning, and governance strategies to ensure high-performance and secure data access. In this role, you will own the technical roadmap for scalable data platform solutions, ensuring they are aligned with enterprise needs and future growth. You will also provide architectural guidance and conduct code/design reviews across data engineering teams to maintain high standards of quality. Your responsibilities will include building and maintaining reliable, high-throughput data pipelines for the ingestion, transformation, and integration of structured, semi-structured, and unstructured data. You should have a solid understanding of data warehousing concepts, ETL/ELT pipelines, and data modeling. Experience with tools like Apache Spark (PySpark/Scala), Hive, DBT, and SQL for large-scale data transformation is essential for this role. You will be required to design ETL/ELT workflows using orchestration tools such as Apache Airflow, Temporal, or Apache NiFi. In addition, you will lead and mentor a team of data engineers, providing guidance on code quality, design principles, and best practices. As a subject matter expert in data architecture, you will collaborate with DevOps, Data Scientists, Product Owners, and Business Analysts to understand data requirements and deliver solutions that meet their needs.

Posted 3 weeks ago

Apply

8.0 - 10.0 years

0 Lacs

Chandigarh, India

On-site

JOB DESCRIPTION

Job Summary
If you are looking for an opportunity in Technology Solutions and Development, Emerson has this exciting role for you! The Senior ETL Developer - Oracle will be part of a team of individuals responsible for developing ETL programs and improving the performance of poorly written or poorly performing application code in the Oracle Data Integrator (ODI) tool. This includes existing code as well as new code that has not yet been promoted to production. The team delivers technology solutions for strategic business needs, drives adoption of these services and support processes, and boosts value by enhancing our customers' experience. You will work alongside a hardworking and dedicated team of self-motivated professionals who share a collective passion for progress and excellence.

In this Role, Your Responsibilities Will Be:
- Design end-to-end solutions to cater to business needs for data in the Oracle BI tool stack, especially ODI and SQL.
- Design custom data warehouse solutions specific to business needs.
- Apply expertise in dimensional modelling (using star/snowflake schemas) and draft high-level and low-level DWH designs.
- Prepare data lineage sheets.
- Apply data warehousing concepts such as SCDs, dimensional modelling, archive strategy, aggregation, and hierarchies, and database concepts such as partitioning and materialized views.
- ODI development (in a BIAPPS environment) and performance tuning of SQL programs.
- Automate various ETL jobs, failure notifications, etc.
- Review and suggest DWH design optimization solutions.
- Performance tuning of mappings in ODI and SQL query tuning.
- Production support of daily ETL loads: monitoring, troubleshooting failures, and bug fixing across environments.
- Work with multiple databases (Oracle, SQL Server, etc.), including complex SQL, query debugging, and optimization.
- Apply a strong understanding of business analytics and data warehouse analysis, design, development, and testing.

Who You Are: You show a tremendous amount of initiative in tough situations and are exceptional at spotting and seizing opportunities. You observe situational and group dynamics and select the best-fit approach. You make implementation plans that allocate resources precisely. You pursue everything with energy, drive, and the need to finish.

For This Role, You Will Need:
- 8+ years of relevant experience working in OBIA with ODI as the ETL tool in a BIAPPS environment.
- Exposure to other ETL tools.
- Ability to work on highly complex problems and provide feasible solutions in time.
- Ability to review and suggest improvements to an existing DWH solution.
- Ability to work in a demanding user environment.
- Ability to provide training on ETL, DWH, and dimensional modelling.
- Ability to guide and help team members with technical issues.
- Strong written and oral communication skills.
- Coordination among various teams for day-to-day activities.

Preferred Qualifications that Set You Apart:
- Bachelor's degree or equivalent in Science with a technical background (MIS, Computer Science, Engineering, or any related field).
- Good interpersonal skills in English, both spoken and written, as you will be working with overseas teams.

Our Culture & Commitment to You
At Emerson, we prioritize a workplace where every employee is valued, respected, and empowered to grow. We foster an environment that encourages innovation, collaboration, and diverse perspectives, because we know that great ideas come from great teams. Our commitment to ongoing career development and growing an inclusive culture ensures you have the support to thrive. Whether through mentorship, training, or leadership opportunities, we invest in your success so you can make a lasting impact. We believe diverse teams working together are key to driving growth and delivering business results. We recognize the importance of employee wellbeing. We prioritize providing competitive benefits plans, a variety of medical insurance plans, an Employee Assistance Program, employee resource groups, recognition, and much more. Our culture offers flexible time-off plans, including paid parental leave (maternal and paternal), vacation, and holiday leave.
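The SCD concept this posting names (Slowly Changing Dimensions) can be sketched as a Type 2 merge in plain Python: when a tracked attribute changes, the current row is expired and a new versioned row is opened. Keys, columns, and dates below are illustrative assumptions, not from the posting:

```python
from datetime import date

# Hedged sketch of an SCD Type 2 merge; in ODI this would be a knowledge-module
# driven mapping, here it is plain Python over dicts for illustration.
def scd2_merge(dimension, incoming, today):
    """Expire the current row when an attribute changes and open a new version."""
    current = {r["key"]: r for r in dimension if r["end_date"] is None}
    for row in incoming:
        cur = current.get(row["key"])
        if cur is None or cur["city"] != row["city"]:
            if cur is not None:
                cur["end_date"] = today  # close the old version
            dimension.append({"key": row["key"], "city": row["city"],
                              "start_date": today, "end_date": None})
    return dimension

dim = [{"key": "C1", "city": "Pune", "start_date": date(2020, 1, 1), "end_date": None}]
dim = scd2_merge(dim, [{"key": "C1", "city": "Mumbai"}], date(2024, 6, 1))
# The Pune version is now closed; the Mumbai version is the open current row.
print([(r["city"], r["end_date"]) for r in dim])
```

Type 2 is only one SCD strategy; Type 1 (overwrite) and Type 3 (previous-value column) trade history retention for simplicity.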

Posted 3 weeks ago

Apply

2.0 - 3.0 years

3 - 5 Lacs

Bengaluru

Work from Office

Reporting to the Senior Site Engineer, the Junior Site Engineer will support site activities by supervising daily execution, coordinating with subcontractors, monitoring material quality and quantities, and assisting with site documentation and reporting.

Key Responsibilities:
- Monitor and supervise on-site activities such as carpentry, flooring, partitions, ceilings, painting, and other finishing works.
- Verify receipt of materials on site, check for damages or shortages, and report discrepancies immediately.
- Assist in preparing daily site progress reports with photographic evidence for internal records and client updates.
- Support quantity measurement and verification to aid in billing and vendor payments.
- Coordinate with vendors, subcontractors, and labor teams to ensure work proceeds as per schedule and quality expectations.
- Ensure all site activities comply with drawings, specifications, and safety standards.
- Report any delays, quality issues, or site challenges to the Senior Site Engineer or Project Manager promptly.

Candidate Profile:
- Diploma or Bachelor's degree in Civil Engineering, Interior Design, or a related field.
- 2-3 years of experience working on commercial interior fit-outs or similar construction projects.
- Basic ability to read and interpret drawings and technical documents.
- Strong organizational and communication skills to coordinate effectively on site.
- Eagerness to learn and grow within a site management role.

Posted 3 weeks ago

Apply

4.0 - 6.0 years

5 - 8 Lacs

Hyderabad

Hybrid

Must Have:
- Expert in MSSQL database (SQL queries, tables, indexes, stored procedures, partitioning, replication, failover, etc.), Unix shell scripting, and Windows Server.

Good to Have:
- Expert in alerting and monitoring tools such as ELK, AppDynamics, SSM, etc.
- Hands-on automation using PowerShell scripts, UNIX scripts, etc.
- Good understanding of network concepts such as load balancers, VIPs, pool members, etc.
- Good understanding of certificates and how they should be used to protect customer data and meet the bank's regulatory requirements.
- Quick self-learner; flexible and adaptable to learning new technologies.
- An engineering mindset and a good understanding of the retail loan business that runs on the Windows/Java/MSSQL platform.

Posted 3 weeks ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Gurugram

Work from Office

Qualification: B.Tech
Timings: 9 am to 6 pm; Mon & Fri (WFH), Tue/Wed/Thu (WFO)

Job Overview: We are seeking an experienced Java Lead with over 7 years of hands-on experience in Java development, who will take ownership of designing and building scalable logging solutions. The ideal candidate should possess strong knowledge of partitioning, data sharding, and database management (both SQL and NoSQL) and should be well versed in AWS cloud services. This is a critical role where you will lead a team to build reliable and efficient systems while ensuring high performance and scalability.

Key Responsibilities:
- Lead Java Development: Architect, design, and implement backend services using Java, ensuring high performance, scalability, and reliability.
- Logging Solutions: Build and maintain robust logging solutions that can handle large-scale data while ensuring efficient retrieval and storage.
- Database Expertise: Implement partitioning and data sharding techniques, and optimize the use of SQL (MySQL, PostgreSQL) and NoSQL databases (MongoDB, DynamoDB). Ensure database performance tuning, query optimization, and data integrity.
- Cloud Deployment: Utilize AWS cloud services such as EC2, RDS, S3, Lambda, and CloudWatch to design scalable, secure, and highly available solutions. Manage cloud-based infrastructure and deployments to ensure seamless operations.
- Collaboration & Leadership: Lead and mentor a team of engineers, providing technical guidance and enforcing best practices in coding, performance optimization, and design. Collaborate with cross-functional teams including product management, DevOps, and QA to ensure seamless integration and deployment of features.
- Performance Monitoring: Implement solutions for monitoring and ensuring the health of the system in production environments.
- Innovation & Optimization: Continuously improve system architecture to enhance performance, scalability, and reliability.

Required Skills & Qualifications:
- Education: Bachelor's or Master's degree in Computer Science, Information Technology, or related fields.
- Experience: 7+ years of hands-on experience in Java (J2EE/Spring/Hibernate) development.
- Database Skills: Strong experience with both SQL (MySQL, PostgreSQL) and NoSQL databases (MongoDB, Cassandra, DynamoDB). Proficiency in partitioning and data sharding.
- AWS Expertise: Deep understanding of AWS cloud services including EC2, S3, RDS, CloudWatch, and Lambda. Hands-on experience in deploying and managing applications on AWS.
- Logging and Monitoring: Experience in building and managing large-scale logging solutions (e.g., the ELK stack, CloudWatch Logs).
- Leadership: Proven track record of leading teams, mentoring junior engineers, and handling large-scale, complex projects.
- Problem-Solving: Strong analytical and problem-solving skills, with the ability to debug and troubleshoot in large, complex systems.
- Soft Skills: Excellent communication, leadership, and teamwork skills. Ability to work in a fast-paced, dynamic environment.

Preferred Qualifications:
- Experience with containerization technologies like Docker and orchestration tools like Kubernetes.
- Familiarity with microservices architecture and event-driven systems.
- Knowledge of CI/CD pipelines and DevOps practices.
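The data sharding this role calls for can be illustrated with a minimal hash-based shard router; the shard count and key format are assumptions, and a Java implementation would follow the same idea:

```python
import hashlib

# Illustrative sketch of hash-based shard routing: a stable hash of the key,
# taken modulo the shard count, picks the shard. NUM_SHARDS is an assumption.
NUM_SHARDS = 4

def shard_for(key: str) -> int:
    """Map a key to a shard deterministically via a stable hash."""
    digest = hashlib.sha256(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % NUM_SHARDS

# The same key always routes to the same shard, and keys spread roughly evenly.
print(shard_for("user-42") == shard_for("user-42"))  # True
```

A caveat worth noting: plain modulo routing reshuffles most keys when the shard count changes, which is why production systems often layer consistent hashing or a lookup table on top.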

Posted 3 weeks ago

Apply

3.0 - 8.0 years

15 - 25 Lacs

Gurugram

Work from Office

Understands the process flow and its impact on the project module outcome. Works on coding assignments for specific technologies based on the project requirements and available documentation. Debugs basic software components and identifies code defects. Focuses on building depth in project-specific technologies. Expected to develop domain knowledge along with technical skills. Communicates effectively with team members, project managers, and clients, as required. A proven high performer and team player, with the ability to take the lead on projects.

Responsibilities:
- Design and create S3 buckets and folder structures (raw, cleansed_data, output, script, temp-dir, spark-ui).
- Develop AWS Lambda functions (Python/Boto3) to download Bhav Copy via REST API and ingest it into S3.
- Author and maintain AWS Glue Spark jobs to partition data by scrip, year, and month, and to convert CSV to Parquet with Snappy compression.
- Configure and run AWS Glue Crawlers to populate the Glue Data Catalog.
- Write and optimize AWS Athena SQL queries to generate business-ready datasets.
- Monitor, troubleshoot, and tune data workflows for cost and performance.
- Document architecture, code, and operational runbooks.
- Collaborate with analytics and downstream teams to understand requirements and deliver SLAs.

Technical Skills:
- 3+ years of hands-on experience with AWS data services (S3, Lambda, Glue, Athena).
- PostgreSQL basics.
- Proficient in SQL and data partitioning strategies.
- Experience with the Parquet file format and compression techniques (Snappy).
- Ability to configure Glue Crawlers and manage the AWS Glue Data Catalog.
- Understanding of serverless architecture and best practices in security, encryption, and cost control.
- Good documentation, communication, and problem-solving skills.

Qualifications:
- 3-5 years of work experience in a relevant field.
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred.
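The partition-by-scrip/year/month layout this posting describes can be sketched as a Hive-style S3 key builder, which is the layout Glue Crawlers and Athena discover as partitions. The prefix, folder names, and file name below are illustrative assumptions, not taken from the posting:

```python
from datetime import date

# Minimal sketch of Hive-style partition keys (col=value path segments), the
# convention Glue Crawlers use to infer partition columns for Athena queries.
def partitioned_key(scrip: str, trade_date: date, filename: str) -> str:
    """Build an object key partitioned by scrip, year, and month."""
    return (f"cleansed_data/scrip={scrip}/"
            f"year={trade_date.year}/month={trade_date.month:02d}/{filename}")

key = partitioned_key("RELIANCE", date(2024, 7, 15), "bhavcopy.parquet")
print(key)  # cleansed_data/scrip=RELIANCE/year=2024/month=07/bhavcopy.parquet
```

Queries that filter on `scrip`, `year`, or `month` can then prune whole prefixes instead of scanning every object, which is where most of the Athena cost savings come from.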

Posted 4 weeks ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Chennai

Work from Office

GCP - Google BigQuery: Strong experience in data engineering or analytics with strong SQL expertise. Hands-on experience with Google BigQuery in production environments. Strong understanding of BigQuery architecture, partitioning, clustering, and performance tuning. Experience with GCP data services such as Cloud Storage, Dataflow, Composer (Airflow), and Pub/Sub. Proficiency in data modeling techniques (star/snowflake schema, denormalization, etc.). Familiarity with scripting languages such as Python or Java for orchestration and transformation. Experience with CI/CD tools and version control (e.g., Git, Cloud Build). Solid understanding of data security and access control within GCP. Design, develop, and maintain scalable data pipelines using BigQuery and GCP-native tools. Optimize complex SQL queries and BigQuery jobs for performance and cost efficiency. Collaborate with business analysts, data scientists, and engineers to deliver actionable insights from large datasets. Build and manage data warehouses and data marts using BigQuery. Integrate BigQuery with other GCP services such as Cloud Storage, Dataflow, Pub/Sub, and Cloud Functions. Implement best practices for data modeling, data governance, and security within BigQuery. Monitor and troubleshoot data workflows and optimize storage/query performance. Participate in architecture discussions and contribute to the overall data platform strategy.

Posted 4 weeks ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Ahmedabad, Gujarat, India

On-site

About the Role: Lead the execution of assignments, coordinate with team members, and ensure quality deliverables within the stipulated timelines. Conduct physical site visits to gather primary information, monitor project progress, and summarize findings and observations. Oversee civil engineering aspects of fit-out projects, including structural modifications, flooring, partitioning, and ceiling installations. Review on-site work and verify if construction is executed according to approved designs, specifications, and standards. Evaluate the quality of craftsmanship in the construction process and ensure that construction standards are upheld. Review invoices and bills from contractors and suppliers for materials, labor, and services, along with supporting documentation. Ensure compliance with quality standards and identify any issues related to materials or workmanship. Verify that materials, products, and construction practices comply with IGBC certification requirements. About You: Strong analytical skills with an eye for detail. Ability to manage multiple tasks and meet deadlines in a fast-paced environment. In-depth knowledge of construction contracts, billing practices, and IGBC standards. Strong understanding of civil engineering principles, building codes, and construction practices. Proficiency in reading and interpreting architectural and engineering drawings. Flexibility to travel extensively between multiple project locations and work on-site.

Posted 4 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Maharashtra

On-site

You are an experienced and highly skilled Senior AWS Data Engineer with over 8 years of experience, ready to join our dynamic team. Your deep understanding of data engineering principles, extensive experience with AWS services, and proven track record of designing and implementing scalable data solutions make you the ideal candidate for this role. Your key responsibilities will include designing and implementing robust, scalable, and efficient data pipelines and architectures on AWS. You will develop data models and schemas to support business intelligence and analytics requirements, utilizing AWS services such as S3, Redshift, EMR, Glue, Lambda, and Kinesis to build and optimize data solutions. It will be your responsibility to implement data security and compliance measures using AWS IAM, KMS, and other security services, as well as design and develop ETL processes to ingest, transform, and load data from various sources into data warehouses and lakes. Ensuring data quality and integrity through validation, cleansing, and transformation processes, optimizing data storage and retrieval performance through indexing, partitioning, and other techniques, and monitoring and troubleshooting data pipelines for high availability and reliability will also be part of your role. Collaboration with cross-functional teams, providing technical leadership and mentorship to junior data engineers, identifying opportunities to automate and streamline data processes, and participating in on-call rotations for critical systems and services are also expected from you. Your required qualifications, capabilities, and skills include experience in software development and data engineering, with hands-on experience in Python and PySpark.
You should have proven experience with cloud platforms such as AWS, Azure, or Google Cloud; a good understanding of data modeling, data architecture, ETL processes, and data warehousing concepts; and experience with cloud-native ETL platforms like Snowflake and/or Databricks. Proven experience with big data technologies and services like AWS EMR, Redshift, Lambda, and S3; efficient cloud DevOps practices and CI/CD tools like Jenkins/GitLab for data engineering platforms; good knowledge of SQL and NoSQL databases including performance tuning and optimization; and experience with declarative infrastructure provisioning tools like Terraform, Ansible, or CloudFormation will be valuable assets. Strong analytical skills to troubleshoot issues and optimize data processes, along with the ability to work both independently and collaboratively, are also necessary for this role. Preferred qualifications, capabilities, and skills that would be beneficial for this role include knowledge of the machine learning model lifecycle, language models, and cloud-native MLOps pipelines and frameworks, as well as familiarity with data visualization tools and data integration patterns.

Posted 4 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a GCP Data Engineer-Technical Lead at Birlasoft Office in Bengaluru, India, you will be responsible for designing, building, and maintaining scalable data pipelines and platforms on Google Cloud Platform (GCP) to support business intelligence, analytics, and machine learning initiatives. With a primary focus on Python and GCP technologies such as BigQuery, Dataproc, and Dataflow, you will develop ETL and ELT pipelines while ensuring optimal data manipulation and performance tuning. Your role will involve leveraging data manipulation libraries like Pandas, NumPy, and PySpark, along with SQL expertise for efficient data processing in BigQuery. Additionally, your experience with tools such as Dataflow, Cloud Run, GKE, and Cloud Functions will be crucial in this position. A strong foundation in data modeling, schema design, data governance, and containerization (Docker) for data workloads will further enhance your contributions to our data team. With 5-8 years of experience in Data Engineering and Software Development, including a minimum of 3-4 years working directly with Google Cloud Platform, you will play a key role in driving our data initiatives forward.

Posted 1 month ago

Apply

1.0 - 10.0 years

0 Lacs

navi mumbai, maharashtra

On-site

Simple Logic IT Pvt Ltd is looking for a Database Developer Lead at Seawood, Navi Mumbai; you will be required to work from the office. A minimum of 10 years of experience is essential, with expertise in team handling and the insurance domain. Your primary responsibilities will include managing a team of more than 10 resources, working with large datasets, and writing efficient SQL queries for data manipulation, transformation, and reporting. You will also be expected to participate in requirement gathering, impact analysis, and technical design discussions with stakeholders. Additionally, optimizing queries and tuning performance using tools like Explain Plan, SQL Trace, TKPROF, and AWR reports is crucial. A solid understanding of RDBMS concepts, indexing, partitioning, and data modeling is necessary, and familiarity with the insurance life cycle and knowledge of MIS requirements is beneficial. Secondary skills required for this role include proficiency in PL/SQL and strong documentation skills. As a Database Developer Lead, you will need at least 1 year of experience in the life insurance domain. Your responsibilities will involve working closely with business users to gather requirements, hands-on development in PL/SQL, and documenting technical designs, data flows, and process logic for long-term support and maintenance. Collaboration with DBAs, front-end developers, testers, and business analysts to deliver robust and scalable database solutions is essential. The minimum educational qualification is a Bachelor's degree from an IT background. The ideal candidate for this position is a seasoned PL/SQL consultant with expertise in the life insurance domain and Oracle technologies. If you meet these requirements, please drop your resume at tejashree.mane@simplelogic.in.
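The posting names Oracle tooling (Explain Plan, SQL Trace, TKPROF, AWR) for query tuning. The underlying workflow — inspect the plan, add an index, confirm the plan changes from a full scan to an index lookup — can be illustrated with SQLite as a stand-in for Oracle; the table and index names here are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE policies (policy_no INTEGER, holder TEXT, premium REAL)")
conn.executemany(
    "INSERT INTO policies VALUES (?, ?, ?)",
    [(i, f"holder-{i}", 100.0 + i) for i in range(1000)],
)

def plan(sql):
    """Return the query plan as one string (detail column of EXPLAIN QUERY PLAN)."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT premium FROM policies WHERE policy_no = 42"

before = plan(query)  # typically reports a full scan of the table
conn.execute("CREATE INDEX idx_policy_no ON policies(policy_no)")
after = plan(query)   # typically reports a search using idx_policy_no

print(before)
print(after)
```

Oracle's `EXPLAIN PLAN FOR ... / SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY)` serves the same purpose; the principle — verify the optimizer's access path before and after a change — is identical.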

Posted 1 month ago

Apply

2.0 - 8.0 years

0 Lacs

thiruvananthapuram, kerala

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. We're counting on your unique voice and perspective to help EY become even better. Join us and build an exceptional experience for yourself, and a better working world for all. We are looking for senior-level candidates with good working experience in data warehousing and data integration using the ETL tool Oracle Data Integrator (ODI), Oracle SQL, and PL/SQL. As a problem-solver with a keen ability to diagnose a client's unique needs, you should be able to see the gap between where clients currently are and where they need to be. Your responsibilities include having 3-8 years of ETL lead/developer experience and a minimum of 2-3 years of experience in Oracle Data Integrator (ODI). You should have experience in developing ETL processes, data modeling, ETL design, setting up topology, building objects, monitoring the operator, packaging components, database operations, error handling, automation, performance tuning, SQL/PLSQL development, data migration, SQL tuning, and optimization, among others. The ideal candidate should possess a BTech/MTech/MCA/MBA degree and have expertise in ODI tools, Oracle PL/SQL, data quality, reconciliation frameworks, integrating ODI with multiple sources/targets, and interacting with customers to understand business requirements. You must have strong communication skills, be able to work with minimal guidance in a time-critical environment, and have experience in the financial services industry. EY offers support, coaching, and feedback, along with opportunities for skill development and career progression. You will have the freedom and flexibility to handle your role in a way that suits you, and you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange.
Join EY and be part of a team dedicated to building a better working world, creating long-term value for clients, people, and society, and building trust in the capital markets. Our diverse teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. At EY, we ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

chandigarh

On-site

If you are seeking an opportunity in Technology Solutions and Development, Emerson has an exciting role for you! As a Senior ETL Developer - Oracle, you will be part of a team responsible for developing ETL programs and enhancing the performance of application code in the Oracle Data Integrator tool. Your duties will involve working on existing code and developing new code to meet production standards. This role contributes to delivering technology solutions for strategic business needs, promoting service adoption, supporting processes, and enhancing customer experience. You will collaborate with a diligent team of professionals who share a passion for progress and excellence. Your responsibilities in this role include designing end-to-end solutions to meet business data requirements in the Oracle BI tool stack, particularly ODI and SQL. You will specialize in developing custom data warehouse solutions tailored to business needs and possess expertise in dimensional modelling, preparing data lineage sheets, data warehousing concepts, and database concepts. Proficiency in ODI development, performance tuning, automating ETL jobs, and optimizing DWH design are key aspects of this role. Additionally, you will provide production support for ETL loads, monitor performance, troubleshoot failures, and fix bugs across environments. To excel in this role, you must demonstrate a high level of initiative, problem-solving skills, and resource allocation abilities. Strong communication skills, the ability to work on complex problems and provide timely solutions, and a knack for guiding and assisting team members in technical matters are essential. A background in science with technical expertise, interpersonal skills, and the ability to work in a demanding environment are preferred qualifications. At Emerson, we cultivate a workplace where every employee is respected, valued, and empowered to grow.
We foster innovation, collaboration, and diverse perspectives to drive growth and achieve business results. We are committed to your ongoing career development and well-being, offering competitive benefits, medical insurance plans, employee resource groups, flexible time off, and support for your success. Join our team and make a lasting impact in a culture that prioritizes inclusivity and growth.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

You will be responsible for developing/enhancing objects in Oracle Data Integrator (ODI), RPD, and OACS Reports & Analyses. Your role will involve providing end-to-end solution architecture for integrating Oracle ERP and SCM Cloud applications using ODI and OACS. You will be tasked with ODI ELT Data Load Scheduling, Monitoring, and Troubleshooting, as well as applying data fixes. Furthermore, you will need to coordinate with the onshore team by setting day-to-day execution targets. Your expertise will be crucial in integration architecture design, solutions selection, implementation, and continuous technical improvement. Collaboration with a cross-functional team to resolve issues and guide team members to successful delivery will also be part of your responsibilities. To qualify for this role, you must have at least 5 years of demonstrable hands-on development experience using standard SDLC and/or Agile methodologies in an ODI/OACS/PLSQL developer role. Additionally, you should have 5+ years of experience in extracting data from Oracle ERP/SCM. Proficiency in developing ETL processes, such as ETL control tables, error logging, auditing, data quality, etc., and implementing reusability, parameterization, workflow design, is essential. Expertise in the Oracle ODI 12c toolset, Oracle PL/SQL, RPD, OACS, BICC, BI Publisher, as well as knowledge of data modeling and ETL design, are required. You should be able to integrate ODI with multiple sources and targets and have experience in error recycling/management using ODI. Strong knowledge of database objects development (SQL/PLSQL) and ELT/ETL concepts, design, and coding is expected. Moreover, expert knowledge of OBIEE/OAC RPD design and BI analytics reports design is necessary. Familiarity with BI Apps will be an advantage. 
Experience in creating PL/SQL packages, procedures, functions, triggers, views, materialized views, and exception handling for retrieving, manipulating, checking, and migrating complex datasets in Oracle is crucial. Your role will also involve devising partitioning and indexing strategies for optimal performance, leading support & development projects, and possessing good verbal and written communication skills in English. Strong interpersonal, analytical, and problem-solving abilities are essential. Any certifications in ODI or OACS will be a plus. Good knowledge of Oracle database and development experience in database applications, along with traits such as creativity, personal drive, influencing and negotiating skills, and problem-solving capabilities, are desired attributes for this role.
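Devising a partitioning strategy, as mentioned above, amounts to choosing a key that routes each row to a manageable slice of the data, so lookups and maintenance touch one partition rather than the whole table. A toy Python sketch of hash-style partition routing follows; the key name and partition count are hypothetical, and in real Oracle the equivalent is declared with `PARTITION BY HASH` in the table's DDL rather than computed in application code.

```python
# Toy illustration of hash partitioning: rows are routed to a fixed number
# of partitions by a deterministic function of the partition key.
NUM_PARTITIONS = 4

def partition_for(customer_id: int) -> int:
    """Route a row to a partition by its key (simple modulo stands in for
    the database's internal hash function)."""
    return customer_id % NUM_PARTITIONS

# Distribute some hypothetical customer ids across partitions.
partitions = {p: [] for p in range(NUM_PARTITIONS)}
for customer_id in range(10):
    partitions[partition_for(customer_id)].append(customer_id)

# A key-based lookup only needs to touch one partition.
target = partition_for(7)
print(partitions[target])  # [3, 7]
```

The same reasoning drives index choice: an index (or partition key) is only useful if the predicates the workload actually issues can be answered by touching a small, predictable subset of the data.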

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

pune, maharashtra

On-site

As a highly experienced Linux/Exadata professional with over 10 years of hands-on experience in Linux/Solaris server administration, you have a strong background in Unix environments, particularly Solaris and Linux. You are adept at providing 24*7 shift support at client locations and possess expertise in installation, configuration, upgrades, patching, administration, and storage implementation. Your ability to articulate standard methodologies during implementation is commendable. Your skill set includes experience in Oracle VM Manager, LDOM/OVM administration, Oracle engineered systems like Exadata, Exalogic, and PCA, and Oracle storage. You have a proven track record in team management, handling high-demanding customers, performance analysis, troubleshooting, system design, and architecture. In addition to your extensive experience, you have a working knowledge of OS clustering, partitioning, virtualization, and storage administration, with a focus on integration with operating systems. Your expertise in Unix, particularly Solaris and Linux, is well-demonstrated. You have worked on OVM migration projects and have experience with engineered systems like Exadata, Exalogic, and PCA, including patching activities. As an individual contributor at career level IC3, you are capable of managing your work independently, displaying ambition, and working towards agreed targets and goals with a creative approach. Your strong interpersonal skills enable you to contribute effectively to team efforts and achieve related results as needed. You stay updated with the latest technical knowledge by attending educational workshops and reviewing relevant publications. Your familiarity with Exadata X10M, ZFS Storage, and Private Cloud Appliance, and your ability to work in a fast-paced, 24*7 support environment make you a valuable asset to any organization seeking a skilled Linux/Exadata professional.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

As a Solaris Administrator, you will be responsible for the installation, implementation, customization, operation, recovery, and performance tuning of Solaris Operating Systems. Your role will involve installing and maintaining all Solaris server hardware and software systems, administering server performance and utilization, and ensuring availability. Additionally, you will be required to prepare program-level and user-level documentation as needed. Your key responsibilities will include supporting infrastructure implementations, deployments, and technologies related to dynamic infrastructure platforms. You will participate in current state system analysis, requirements gathering, and documentation. Moreover, you will contribute to the creation of technical design/implementation documentation and assist in requirements understanding and issue resolution. Furthermore, you will be involved in tasks such as maintaining and installing Oracle ZFS Storage, troubleshooting and maintaining Solaris Operating Systems (8, 9, 10, and 11), patching Solaris systems with Sun cluster and VCS, configuring APACHE web server on Solaris and Linux, creating and extending Volume Groups and file systems, resolving sudo issues, working with VERITAS volume manager and cluster, managing users and groups in NIS and LDAP servers, and installing, upgrading, and patching Solaris servers. You will also handle Solaris server decommissioning, VERITAS cluster monitoring, starting and stopping cluster services, moving resource groups across nodes, increasing file systems in cluster file systems, synchronizing cluster resources, and creating and deleting new cluster service groups and resources. Your expertise should include Solaris server performance monitoring, kernel tuning, and troubleshooting. 
Additionally, you should have experience working with ticketing tools like Remedy and ManageNow, knowledge of OS clustering, partitioning, virtualization, and storage administration, integration with operating systems, and the ability to troubleshoot capacity and availability issues. You will collaborate with project teams to prepare components for production, provide support for ongoing platform infrastructure availability, and work on prioritized features for ongoing sprints. In this role, you will be accountable for completing the work you lead and deliver quality work to the team. The position falls under the IT Support category with a salary as per market standards. The industry focus is on IT Services & Consulting within the functional area of IT & Information Security. This is a full-time contractual employment opportunity.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

navi mumbai, maharashtra

On-site

The job involves coordinating with all departments of the client to understand their requirements and functional specifications. You must have a strong knowledge of TSYS PRIME, SQL, and Oracle PL/SQL languages, as well as familiarity with APIs. Your responsibilities will include participating in various phases of the SDLC such as design, coding, code reviews, testing, and project documentation, while working closely with co-developers and other related departments. Desired Skills and Qualifications: - Strong knowledge of TSYS PRIME, Oracle PL/SQL language, and APIs - Good exposure to Oracle advanced database concepts like Performance Tuning, Indexing, Partitioning, and Data Modeling - Responsible for database-side development, implementation, and support - Experience in solving daily service requests, incidents, and change requests - Proficient in code review, team management, effort estimation, and resource planning This is a full-time position with a day shift schedule that requires proficiency in English. The work location is in person.

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

noida, uttar pradesh

On-site

Cadence is a pivotal leader in electronic design, leveraging over 30 years of computational software expertise. Our Intelligent System Design approach enables us to provide software, hardware, and IP solutions that bring design concepts to life. Our clientele comprises the most innovative companies globally, developing cutting-edge electronic products for diverse market applications such as consumer electronics, hyperscale computing, 5G communications, automotive, aerospace, industrial, and health sectors. At Cadence, you will have the opportunity to work with the latest technology in a stimulating environment that fosters creativity, innovation, and meaningful contributions. Our employee-centric policies prioritize the well-being of our staff, career growth, continuous learning opportunities, and recognizing achievements tailored to individual needs. The "One Cadence One Team" culture encourages collaboration across teams to ensure customer success. We offer various avenues for learning and development based on your interests and requirements, alongside a diverse team of dedicated professionals committed to exceeding expectations daily. We are currently seeking a Database Engineer with a minimum of 8 years of experience to join our team in Noida. The ideal candidate should possess expertise in both SQL and NoSQL databases, particularly PostgreSQL and Elasticsearch. A solid understanding of database architecture, performance optimization, and data modeling is essential. Proficiency in graph databases like JanusGraph and in-memory databases is advantageous. Strong skills in C++ and design patterns are required, with additional experience in Java and JS being desirable. Key Responsibilities: - Hands-on experience in PostgreSQL, including query tuning, indexing, partitioning, and replication. - Proficiency in Elasticsearch, covering query DSL, indexing, and cluster management. 
- Expertise in SQL and NoSQL databases, with the ability to determine the appropriate database type for specific requirements. - Proven experience in database performance tuning, scaling, and troubleshooting. - Familiarity with Object-Relational Mapping (ORM) is a plus. If you are a proactive Database Engineer looking to contribute your skills to a dynamic team and work on challenging projects at the forefront of electronic design, we encourage you to apply and be part of our innovative journey at Cadence.
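For the Elasticsearch query DSL this posting asks about, a common pattern is a bool query combining a scored full-text match with non-scoring term filters, plus from/size pagination. A minimal sketch follows; the index fields (`title`, `status`) are hypothetical, and the query is built as a plain dict with no client library, since the DSL itself is just JSON.

```python
import json

def build_search(term: str, status: str, page: int, size: int = 10) -> dict:
    """Assemble an Elasticsearch-style bool query: a scored full-text match
    plus an exact-term filter, with from/size pagination."""
    return {
        "from": page * size,
        "size": size,
        "query": {
            "bool": {
                "must": [{"match": {"title": term}}],      # scored, analyzed text
                "filter": [{"term": {"status": status}}],  # exact, unscored, cacheable
            }
        },
    }

query = build_search("database engineer", "active", page=2)
print(json.dumps(query, indent=2))
```

Putting exact-match conditions in `filter` rather than `must` is the usual tuning advice: filter clauses skip relevance scoring and can be cached across requests.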

Posted 1 month ago

Apply

3.0 - 7.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Job Overview : We are seeking an exceptional Physical Verification Engineer to take a key role in our semiconductor design team. As a Block/Full-chip/Partition Physical Verification Engineer, you will be responsible for the development and implementation of cutting-edge physical verification methodologies and flows for complex ASIC designs. You will collaborate closely with cross-functional teams to ensure the successful delivery of high-quality designs. Responsibilities : Drive physical verification (DRC, Antenna, LVS, ERC) at cutting-edge FinFET technology nodes for various foundries. Physical verification of complex SoCs/cores/blocks: DRC, LVS, ERC, ESD, DFM, tape-out. Work hands-on to solve critical design and execution issues related to physical verification and sign-off. Own physical verification and sign-off flows, methodologies, and execution for SoCs/cores. Good hands-on experience with Calibre, Virtuoso, etc. Requirements : Bachelor's or Master's degree in Electrical Engineering or Electronics & Communications. Proficiency in industry-standard EDA tools from Cadence, Synopsys, and Mentor Graphics. Strong scripting skills using TCL, Python, or Perl for design automation and tool customization. Expertise in physical verification of block/partition/full-chip-level DRC. Experience and understanding of all phases of the IC design process from RTL to GDSII. LVS, ERC, and DFM tape-out process on cutting-edge nodes; preferably worked on 3nm/5nm/7nm/12nm/14nm/16nm nodes at the major foundries. Experience in debugging LVS issues at chip level/block level with complex analog-mixed-signal IPs. Experience with design using low-power implementation (level-shifters, isolation cells, power domains/islands, substrate isolation, etc.). Experience in physical verification of I/O ring, corner cells, seal ring, RDL routing, bumps, and other full-chip components. Good understanding of CMOS/FinFET process and circuit design, base-layer-related DRCs, ERC rules, latch-up, etc.
Experience with ERC rules and ESD rules is an added advantage. Outstanding communication and interpersonal skills, with the ability to collaborate effectively in a team environment. Proven ability to guide and mentor junior engineers, fostering their professional growth and development. Preferred qualifications: Experience with advanced process nodes (3nm, 5nm, 7nm, 10nm), including knowledge of FinFET technology. Proven track record with multiple successful final production tape-outs. Proven ability to independently deliver results, work hands-on, and guide/help peers to deliver their tasks. Able to work under limited supervision and take complete accountability. Excellent written and verbal communication skills. Knowledge of handling various custom IPs such as PLL, divider, SerDes, ADC, DAC, GPIO, and HSIO for PD integration and physical verification challenges.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
