
452 SQL Scripting Jobs - Page 6

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

12.0 - 16.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Reporting and Analytics Lead at HSBC, you will play a crucial role in managing a cross-functional team, strategic external suppliers, and business stakeholders. You will set clear development priorities, performance expectations, and accountability measures for supplier teams. Proactively managing risks, issues, and changes in scope will be essential to ensure alignment with business objectives. Your responsibilities will include reporting regularly on project status, updating Jira and Confluence, preparing project plans, providing mentoring and guidance to team members, and fostering a culture of collaboration and accountability focused on delivery excellence.

You will oversee the ingestion and transformation of data from automated feeds and manual sources, implement robust data validation processes, and lead the transformation of legacy data into scalable, modular platforms. Driving automation, reducing manual interventions, and defining enterprise data standards will be key aspects of your role. Additionally, you will design fault-tolerant ETL/ELT pipelines, ensure data integrity across all stages of analysis, and mitigate risks associated with decision-support systems through validation and testing.

In this role, you will act as a strategic partner in gathering and refining business requirements, conducting impact assessments, and translating business needs into clear documentation. You will build internal capability around data standardization, automation best practices, and documentation, ensuring that solutions meet both functional and non-functional business requirements. Engaging with business leaders and technical teams, you will facilitate decision-making and alignment, and lead workshops, presentations, and status meetings with diverse audiences.
To be successful in this role, you should possess a Master's degree in Business, Computer Science, Engineering, or a related field, along with 12+ years of experience in project management, enterprise data infrastructure, or engineering roles. A strong background in business analytics, data standards, and governance frameworks, together with familiarity with data pipeline tooling, automation practices, and version control, is required. Hands-on experience with data transformation tools, basic knowledge of Python and SQL scripting, and relevant certifications such as PMP, PRINCE2, Agile/Scrum, or CBAP are preferred. A background in Financial Services, Banking, or an Enterprise IT environment is advantageous, as is deep expertise in SQL Server, the GCP platform, and large-scale ETL/ELT architecture.

Your combination of technical skills, analytical acumen, collaborative abilities, and leadership mindset will enable you to contribute effectively to operational excellence and informed decision-making within the organization. Join HSBC and make a real impression by leveraging your expertise in reporting and analytics to drive impactful outcomes.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

You will be responsible for developing web/Windows applications using .NET programming components, including customizing current applications to align with changing business requirements. Your role will involve completing the analysis, design, coding, and implementation phases of the software development life cycle within the scheduled time frame. You will also need to interact with superiors and business users to capture requirements effectively.

To excel in this role, you should have experience in ASP.NET, C#, VB.NET/JavaScript, jQuery, VBScript, HTML, XML, Ajax, and ADO.NET/SQL Server. Familiarity with Angular is preferable, and knowledge of MVC, WCF, WPF, design patterns, LINQ, and Lambda expressions is desirable. Experience in SQL scripting, stored procedures, functions, source and configuration management using Git/SVN, API integrations, and cloud development will be advantageous.

The position is based at Technopark Phase-1, Trivandrum, Kerala, and requires a minimum of 3 to 4 years of relevant experience. This is a permanent role with a work-from-office mode. If you are ready to take the next step in your career, please send your CV and details to career.mpt@muthoot.com. Join us and grow professionally!

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You will be responsible for developing, deploying, monitoring, and maintaining ETL jobs as well as all data engineering and pipeline activities. Your role will involve a good understanding of DB activities and providing support in DB solutions, and you must possess proven expertise in SQL queries.

Your key responsibilities will include designing and constructing various enterprise procedure constructs using any ETL tool, preferably PentahoDI. You will be expected to provide accurate work estimates, manage efforts across multiple lines of work, design and develop exception-handling and data cleansing/standardization procedures, gather requirements from various stakeholders related to ETL automation, and design and create data extraction, transformation, and load functions. Moreover, you will be involved in data modeling of complex, large data sets, conducting tests, validating data flows, preparing ETL processes according to business requirements, and incorporating all business requirements into design specifications.

As for qualifications and experience, you should hold a B.E./B.Tech/MCA degree with at least 10 years of experience in designing and developing large-scale enterprise ETL solutions. Prior experience in any ETL tool, primarily PentahoDI, and a good understanding of databases along with expertise in writing SQL queries are essential.

In terms of skills and knowledge, you should have experience in full-lifecycle software development and production support for DWH systems, and in data analysis, modeling, and design specific to a DWH/BI environment. Exposure to developing ETL packages and jobs using Spoon, scheduling Pentaho ETL jobs in crontab, and familiarity with Hadoop, Hive, Pig, SQL scripting, data loading tools like Flume and Sqoop, workflow schedulers like Oozie, and migrating existing dataflows onto Big Data platforms are required. Experience in any open-source BI tools and databases will be considered advantageous.
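The data cleansing/standardization and exception-handling procedures described above can be sketched as plain SQL. This is a generic illustration (the table and column names are hypothetical, and sqlite3 stands in for the target database; a PentahoDI job would express the same steps as transformation nodes):

```python
import sqlite3

# Generic ETL cleansing sketch: staging -> reject/target (names are hypothetical).
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.executescript("""
CREATE TABLE stg_customer (id INTEGER, country_code TEXT, email TEXT);
CREATE TABLE rej_customer (id INTEGER, country_code TEXT, email TEXT, reason TEXT);
CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, country_code TEXT, email TEXT);

INSERT INTO stg_customer VALUES (1, ' in ', 'a@x.com'), (2, 'US', NULL), (3, 'de', 'b@y.com');

-- Standardize: trim whitespace and upper-case country codes.
UPDATE stg_customer SET country_code = UPPER(TRIM(country_code));

-- Exception handling: divert rows that fail validation to a reject table.
INSERT INTO rej_customer
SELECT id, country_code, email, 'missing email' FROM stg_customer WHERE email IS NULL;

-- Load only validated rows into the target dimension.
INSERT INTO dim_customer
SELECT id, country_code, email FROM stg_customer WHERE email IS NOT NULL;
""")
clean = cur.execute("SELECT id, country_code FROM dim_customer ORDER BY id").fetchall()
rejected = cur.execute("SELECT id, reason FROM rej_customer").fetchall()
print(clean)     # [(1, 'IN'), (3, 'DE')]
print(rejected)  # [(2, 'missing email')]
```

The reject table preserves failed rows with a reason, so exceptions can be reported and reprocessed instead of silently dropped.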
Joining us will provide you with impactful work where you will play a pivotal role in safeguarding Tanla's assets, data, and reputation in the industry. You will have tremendous growth opportunities as part of a rapidly growing company in the telecom and CPaaS space, with chances for professional development. Moreover, you will have the opportunity to work in an innovative environment alongside a world-class team, where innovation is celebrated. Tanla is an equal opportunity employer that champions diversity and is committed to creating an inclusive environment for all employees.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

The job involves developing web/Windows applications using .NET programming components and customizing current applications to meet changing business requirements. As a candidate, you will be responsible for completing the analysis, design, coding, and implementation phases of the software development life cycle within the scheduled time frame. Interaction with superiors and business users for requirement gathering is a key aspect of this role.

You should have experience in ASP.NET, C#, VB.NET, JavaScript, jQuery, VBScript, HTML, XML, and Ajax. Proficiency in ADO.NET and SQL Server is required. Knowledge of Angular is preferable, and familiarity with MVC, WCF, WPF, design patterns, LINQ, Lambda expressions, SQL scripting, stored procedures, and functions is desirable. Experience in source and configuration management using Git/SVN, API integrations, and cloud development will be advantageous.

This position is based in Technopark Phase-1, Trivandrum, Kerala, and requires 3 to 4 years of experience. It is a permanent role with a work-from-office mode. If you are ready to take the next step in your career, please send your CV and details to career.mpt@muthoot.com and join us in shaping the future of software development.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

A career in our Advisory Acceleration Centre is the natural extension of PwC's leading-class global delivery capabilities. We provide premium, cost-effective, high-quality services that support process,

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

We are seeking a skilled and experienced SAP HANA Developer to join an innovative team. Your role will involve designing, developing, and optimizing data models and applications on the SAP HANA platform. Your expertise in HANA architecture, SQL scripting, and performance tuning will be crucial, and a detail-oriented approach and solid understanding of data integration are essential for success in this position. You should have a minimum of 5 years of experience in SAP HANA development. Candidates with a bachelor's/master's degree and additional certifications in Digital Analytics, Computer Science, or related areas will be preferred.

Your responsibilities will include writing and optimizing SQL scripts, stored procedures, and functions in SAP HANA to manage complex business logic and data transformations, and designing and developing data models in SAP HANA using tools like Calculation Views, Attribute Views, and Analytic Views to enhance performance and support. You will be required to analyze and optimize the performance of SAP HANA queries, models, and applications to ensure efficient data retrieval and processing. Integrating SAP HANA with other systems (such as SAP ERP and SAP BW) using SLT, SDI, or other data provisioning techniques will be part of your tasks, as will providing ongoing support for SAP HANA applications by troubleshooting issues, applying patches or updates, and ensuring system stability.

Your skill set should include expert knowledge of EDW implementations, with a particular focus on data modeling, ETL, and SAP HANA, along with experience across the lifecycle of Information Management projects, including requirement analysis, design, development, and testing. Proficiency in PL/SQL and SAP HANA is required, and exceptional communication and collaboration skills are expected. Demonstrated experience working in an Agile environment and a proven ability to adapt to changing priorities and collaborate effectively with cross-functional teams are essential.
The work location for this position is the BSH Whitefield Office in Bangalore.
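The kind of business-logic SQL this role centres on, such as an aggregating reporting view over transactional data, can be illustrated generically. This sketch uses hypothetical table names and sqlite3 as a stand-in; HANA-specific SQLScript syntax and calculation-view types are not reproduced here, but the aggregation node of a calculation view plays a loosely analogous role:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE sales (region TEXT, product TEXT, amount REAL);
INSERT INTO sales VALUES
  ('EMEA', 'A', 100), ('EMEA', 'B', 50), ('APAC', 'A', 75);

-- A reporting view that pre-aggregates by region, loosely analogous to
-- the aggregation node of a HANA calculation view.
CREATE VIEW v_sales_by_region AS
  SELECT region, SUM(amount) AS total_amount, COUNT(*) AS line_count
  FROM sales GROUP BY region;
""")
result = con.execute("SELECT * FROM v_sales_by_region ORDER BY region").fetchall()
print(result)  # [('APAC', 75.0, 1), ('EMEA', 150.0, 2)]
```

Pushing the aggregation into a view keeps the business logic in the database layer, which is the pattern that HANA's modeled views optimize for.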

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

As a Data Testing Engineer in Fractal's ReBoot returnship program, you will be responsible for supporting the deployment and monitoring of quality standards. You will collaborate with engineers across teams to develop a Quality Assurance framework for the organization, with a focus on the Data Management & Platforms team. Your role will involve driving quality throughout the data product lifecycle by developing test cases and automation scripts and analyzing the system to ensure the consistent delivery of high-quality products.

Your responsibilities will include participating in QA teams by managing daily activities, providing training and mentoring, and working with stakeholders to enhance quality standards and testing procedures. You will develop test plans, scenarios, and cases for various testing activities, define and implement standard QA processes, and collaborate with product and development teams to ensure the testing roadmap is aligned with the product goals.

To succeed in this role, you should possess 2 to 6 years of software testing experience, with at least 2 years as a QA Engineer or in a similar capacity. You must have expertise in ETL testing, data pipelines, Python, SQL scripting, automation testing, and QA methodologies. Experience with DevOps and Agile environments and the communication skills to engage with stakeholders effectively are essential.

Preferred additional skills for this role include QA management experience, familiarity with professional software engineering practices, improving product test coverage, and Selenium automation testing using Python and Pytest. A bachelor's degree in Information Systems, Computer Science, Engineering, or a related field is required for this position. If you are passionate about ensuring the quality of data products, collaborating with diverse teams, and contributing to a dynamic work environment, then this opportunity in Fractal's ReBoot program may be the perfect fit for you.
Join us to embark on a rewarding journey of professional growth and innovation!
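A typical ETL reconciliation test of the kind this role calls for compares source and target after a load: row counts match and key aggregates match. The sketch below is illustrative (the table names are hypothetical and sqlite3 stands in for the warehouse); the functions follow the pytest naming convention mentioned in the posting but run here as plain Python:

```python
import sqlite3

# Illustrative ETL test fixture: a source table and a simulated load into a target.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE src_orders (order_id INTEGER, amount REAL);
INSERT INTO src_orders VALUES (1, 10.0), (2, 20.5), (3, 5.5);
-- Simulated ETL load into the target table.
CREATE TABLE tgt_orders AS SELECT order_id, amount FROM src_orders;
""")

def test_row_counts_match():
    # Completeness check: no rows lost or duplicated by the load.
    src = con.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
    tgt = con.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
    assert src == tgt, f"row count mismatch: {src} != {tgt}"

def test_amount_totals_match():
    # Accuracy check: a key measure reconciles between source and target.
    src = con.execute("SELECT SUM(amount) FROM src_orders").fetchone()[0]
    tgt = con.execute("SELECT SUM(amount) FROM tgt_orders").fetchone()[0]
    assert src == tgt

test_row_counts_match()
test_amount_totals_match()
print("all ETL checks passed")
```

In a real pipeline the two connections would point at the actual source system and warehouse, and pytest would discover and run the `test_*` functions automatically.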

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

You will be joining as a Data Engineer in Hyderabad (work from office) with expertise in data engineering, ETL, and Snowflake development. Your primary responsibilities will include SQL scripting, performance tuning, Matillion ETL, and working with cloud platforms such as AWS, Azure, or GCP. Strong proficiency in Python or other scripting languages, API integrations, and knowledge of data governance are essential for this role. Snowflake certifications (SnowPro Core/Advanced) are preferred.

As a Data Engineer, you should have a minimum of 5+ years of experience in data engineering, ETL, and Snowflake development. Your expertise should encompass SQL scripting, performance tuning, and a solid understanding of data warehousing concepts. Hands-on experience with Matillion ETL for creating and managing ETL jobs is a key requirement. Additionally, you should demonstrate a strong understanding of cloud platforms (AWS, Azure, or GCP) and cloud-based data architectures. Proficiency in SQL and in Python or other scripting languages for automation and data transformation is crucial. Experience with API integrations and data ingestion frameworks will be advantageous, and knowledge of data governance, security policies, and access control within Snowflake environments is also expected.

Excellent communication skills are essential, as you will be required to engage with both business and technical stakeholders. Being a self-motivated professional capable of working independently and delivering projects on time is highly valued. In short, the ideal candidate combines data engineering, ETL, and Snowflake development expertise with hands-on Matillion, cloud platform, and API integration experience.
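The Python-for-data-transformation side of this role often amounts to reshaping API responses into rows a warehouse can load. A minimal sketch, with an entirely hypothetical payload standing in for a real HTTP response:

```python
import json

# Hypothetical API payload; in practice this would come from an HTTP call
# to the source system before staging the rows for a warehouse COPY/INSERT.
payload = json.loads("""
{"results": [
  {"id": 7, "metrics": {"clicks": 12, "cost": 3.4}},
  {"id": 9, "metrics": {"clicks": 0,  "cost": 0.0}}
]}""")

# Flatten the nested JSON into tabular rows ready for loading.
rows = [(r["id"], r["metrics"]["clicks"], r["metrics"]["cost"])
        for r in payload["results"]]
print(rows)  # [(7, 12, 3.4), (9, 0, 0.0)]
```

Snowflake can also ingest the raw JSON into a VARIANT column and flatten it in SQL; flattening in Python, as here, is the pattern used when the transformation runs in an orchestration or Matillion Python step.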

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

A career in our Advisory Acceleration Centre is the natural extension of PwC's leading-class global delivery capabilities. We provide premium, cost-effective, high-quality services that support process,

Posted 1 month ago

Apply

4.0 - 10.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

This is a full-time on-site role for a SQL Developer located in Noida. As a SQL Developer, you will be responsible for database development, ETL (Extract, Transform, Load), database design, analytical work, and data modeling on a day-to-day basis. You should possess a Bachelor's degree or equivalent in Computer Science or a related field, along with 4-10 years of industry experience. Experience working with SQL relational database management systems is essential, and SQL scripting knowledge would be a plus.

Your responsibilities will include creating interfaces for upstream/downstream applications and designing, building, testing, deploying, and scheduling the integration process involving third-party systems. In this role, you will design and develop integrations using the Boomi AtomSphere integration platform, Workforce Integration Manager, or similar integration tools. Knowledge of REST APIs, the SOAP framework, XML, and web service design would be beneficial. Strong oral and written communication skills, as well as good customer-interfacing skills, are required for this position.

Other responsibilities include implementing software in various environments using Professional Services concepts, following the SDLC process to provide solutions for interfaces, understanding client requirements, preparing design documents, coding, testing, and deploying interfaces, providing User Acceptance Testing support, deploying and releasing to the production environment, and handing off to global support.
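Interface work of the kind described above frequently means mapping an upstream XML payload into flat records for a downstream SQL load. A hedged sketch (the schema, element names, and values are entirely hypothetical; a Boomi process would do this mapping with its own shape components):

```python
import xml.etree.ElementTree as ET

# Hypothetical upstream payload for an HR interface (schema is illustrative).
doc = """<employees>
  <employee id="101"><name>Asha</name><dept>Payroll</dept></employee>
  <employee id="102"><name>Ravi</name><dept>Timekeeping</dept></employee>
</employees>"""

root = ET.fromstring(doc)
# Map XML elements to the flat records a downstream SQL insert expects.
records = [(int(e.get("id")), e.findtext("name"), e.findtext("dept"))
           for e in root.findall("employee")]
print(records)  # [(101, 'Asha', 'Payroll'), (102, 'Ravi', 'Timekeeping')]
```

Each tuple can then be bound to a parameterized INSERT, which keeps the mapping logic separate from the SQL and avoids string-building injection risks.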

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

We are looking for candidates to fill two key roles in Teamcenter Migration: Migration IC and Migration Testing. The ideal candidate should have a mix of technical and non-technical skills as detailed below.

For the position of Teamcenter Migration IC:

Technical Skills:
- Proficiency in Teamcenter implementation
- Basic knowledge of ITK customization, RAC customization, and Teamcenter configuration
- Intermediate-level skills as a Teamcenter user
- Basic understanding of data migration, including ETL toolsets and Teamcenter export/import
- Intermediate expertise in C/C++ development, Java, and SQL scripting

Non-Technical Skills:
- Advanced capabilities in working in teams
- Basic skills in mentoring and module lead roles
- Advanced proficiency in English

For the role of Teamcenter Migration Testing:

Technical Skills:
- Minimum 3-4 years of hands-on experience in functional, regression, and system testing for Teamcenter PLM projects
- Experience in data testing and usability tests based on migration data loads
- Good understanding of Teamcenter functionalities and usage
- At least 2 years of hands-on experience in Active Workspace testing
- Ability to analyze customer requirements, translate business use cases into test cases, and document test results
- Familiarity with defect lifecycle management, defect severity, and priority
- Knowledge of the data migration process
- Support for key users during UAT and SRT
- Excellent communication skills and interaction with Test Managers
- Experience with an ALM test management tool is a plus
- Knowledge of the Agile framework

If you meet the qualifications and are interested in these roles, please apply at: [Apply Here](https://mechispike.com/quickapplication/)

To explore more opportunities in PLM with MechiSpike Solutions, visit: [Openings with MechiSpike Solutions](https://mechispike.com/openings/)

Learn more about MechiSpike, our values, and work culture at: [About Mechispike](https://mechispike.com/about-us/)

MechiSpike is a PLM/CAx services company with a strong team delivering projects in Teamcenter, Enovia, and Windchill. We offer learning opportunities, earning potential, and career growth paths to all employees, ensuring zero attrition. Find out about employee benefits and work culture here: [Employee benefits & work culture](https://mechispike.com/careers/)

For MechiSpike reviews, visit: [Mechispike Reviews](https://maps.app.goo.gl/UFzmomThT7pSVZY5A)

We handle projects across the PLM spectrum, including staffing, development, migration, upgrade, support, and implementation, and are proud partners with global companies in India, Germany, and the USA.

Posted 1 month ago

Apply

7.0 - 12.0 years

0 Lacs

Karnataka

On-site

As an experienced SAP HANA Cloud Developer with 7-12 years of expertise in SAP Analytics, you will be responsible for designing, developing, and optimizing complex data models in SAP HANA Cloud. Your role will also involve developing and maintaining SAC reports and dashboards, and you will play a key role in implementing best practices for new SAC dashboard implementations at the architectural and conceptual levels. Working end to end across HANA Cloud modeling and SAC development, you will collaborate closely with the business team to gather requirements for new HANA and SAC developments.

In this position, you will engage with cross-functional teams, including data engineers, developers, and business users, to ensure seamless project execution. Your responsibilities will include documenting data model design, metadata, and data lineage. You should possess a strong understanding of calculation views and the various transformations used in SAP HANA, such as the star join, aggregation, and rank nodes. Preference will be given to SAP-certified candidates who have expertise in performance tuning and optimization in HANA Cloud.

Hands-on experience with SAP Business Application Studio is mandatory for this role. You should demonstrate strong analytical and SQL scripting skills, along with proficiency in security concepts like roles and analytical privileges in HANA Cloud. Expertise in creating and managing SAC BI analytical models and stories is essential. Additionally, you should understand the different connection types supported in SAC and be able to create stories on SAC live connections. Familiarity with the various tables of the S/4 FICO module and with modeling in SAP Datasphere is preferred. Knowledge of SAP generative AI topics like the vector engine and knowledge graph will be beneficial, and experience with CDS view development is considered an added advantage.

Excellent communication skills are necessary to effectively collaborate with stakeholders and convey complex technical information.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Haryana

On-site

As a professional Database Administrator (DBA), your main responsibility will be to ensure the database operates smoothly and efficiently 24/7. Your objective is to facilitate a seamless flow of information within the organization, focusing on both the backend data structure and frontend accessibility for end users.

You will be tasked with various responsibilities, including database management; installation and configuration of big data technologies; monitoring and optimization of database performance; backup and recovery strategies; implementing security measures; collaborating with other IT professionals; capacity planning; maintaining documentation; automation of tasks; staying updated on emerging technologies; providing education and training; working closely with data architects; problem resolution; and ensuring compliance with relevant data protection regulations and industry standards.

To be successful in this role, you should have at least 5 years of proven working experience as a Database Administrator, a Bachelor's degree (preferably in Computer Science or a similar field), certifications in Oracle or SQL, fluency in English (written and spoken), hands-on experience with SQL/Oracle database standards and end-user applications, excellent knowledge of data backup, recovery, security, integrity, and SQL scripting, familiarity with programming-language APIs, problem-solving skills, and the ability to think algorithmically. Additionally, previous experience with DBA case tools (frontend/backend) and third-party tools, preferably experience with AWS migrations/implementations, and a willingness to collaborate with diverse team members are essential. You should also be comfortable working with controlled goods and technologies subject to regulations like ITAR or EAR.

MKS is an equal opportunity employer and is committed to providing reasonable accommodations to qualified individuals with disabilities. If you require a reasonable accommodation during the application or interview process due to a disability, please contact us at accommodationsatMKS@mksinst.com. When applying for a specific job, remember to include the requisition number, title, and location of the role to facilitate the recruitment process.
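The backup-and-verify discipline at the heart of the DBA responsibilities above can be sketched in miniature with SQLite's online backup API; this is an illustration only (production DBAs would use the engine's native tooling such as RMAN or SQL Server backup jobs), and the table and values are hypothetical:

```python
import sqlite3

# Minimal backup/restore sketch using SQLite's online backup API.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE accounts (id INTEGER, balance REAL)")
src.execute("INSERT INTO accounts VALUES (1, 100.0), (2, 250.0)")
src.commit()

backup = sqlite3.connect(":memory:")
src.backup(backup)  # online copy of every page in the source database

# Recovery verification: confirm the copy is consistent with the source.
restored = backup.execute("SELECT COUNT(*), SUM(balance) FROM accounts").fetchone()
print(restored)  # (2, 350.0)
```

The verification step matters as much as the backup itself: an untested backup is not a recovery strategy.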

Posted 1 month ago

Apply

3.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an experienced SAP BW on HANA Consultant, you will be responsible for supporting the design, development, and implementation of SAP Business Warehouse solutions powered by HANA. Your deep expertise in SAP BW, strong HANA modeling skills, and experience in integrating data from various sources will be key to driving business insights.

Your key responsibilities will include designing, developing, and maintaining data models in SAP BW on HANA, such as InfoObjects, ADSOs, CompositeProviders, and Open ODS Views. You will create and manage data extraction, transformation, and loading processes using DataSources and Data Transfer Processes. Developing HANA views for performance-optimized reporting and optimizing existing data models for better performance on HANA will also be part of your role. Collaborating with business stakeholders to understand reporting requirements and translating them into technical solutions will be essential. You will also build and maintain BEx queries, support integration with SAP Analytics tools, and participate in system upgrades, migrations to BW/4HANA, and performance tuning activities. Documenting technical designs, data flows, and processes will also be a key part of your responsibilities.

To be successful in this role, you should have 3-8 years of experience in SAP BW on HANA, strong hands-on experience with HANA modeling, and experience in BW/4HANA. Knowledge of SQL scripting, SAP Datasphere, data warehousing concepts, ABAP routines, and SLT, BODS, or SDA/SDI for data integration is required. Familiarity with SAP BW/4HANA and migration projects is a plus. Good communication skills and the ability to work with cross-functional teams are essential. Preferred qualifications include SAP BW or BW/4HANA certification and familiarity with S/4HANA embedded analytics.

If you possess the required skills and experience, we invite you to join our team and contribute to the success of our SAP Business Warehouse solutions powered by HANA.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Software Engineer specializing in Data Encryption, you will utilize your strong hands-on experience with data encryption and decryption using the Protegrity tool. Your expertise in this area will be crucial in providing production support and ensuring the security of sensitive data. Additionally, your proficiency in SQL scripting will enable you to handle and manage critical tasks effectively, even under challenging circumstances. Your role will also involve troubleshooting issues and implementing error-handling strategies to maintain the integrity of the encryption process.

Being a quick learner and a good team player is essential in this position, as you will be expected to adapt to new technologies and collaborate with colleagues effectively. Moreover, flexibility to work in shifts will be beneficial in ensuring continuous support and maintenance of data encryption processes. If you are seeking a dynamic role where your skills in data encryption, production support, and SQL scripting are valued, this opportunity could be the perfect fit for you.
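To illustrate the general idea of data protection the role describes, here is a generic, deterministic tokenization sketch. This is emphatically not the Protegrity API (which is proprietary) and not reversible encryption; it shows only the pattern of replacing a sensitive value with a stable token so that joins and lookups still work. The key and field value are hypothetical:

```python
import hashlib
import hmac

# Generic tokenization sketch (NOT the Protegrity API): deterministic
# HMAC-based pseudonymization of a sensitive field.
SECRET = b"demo-key-rotate-in-production"  # hypothetical key, never hard-coded in practice

def tokenize(value: str) -> str:
    """Replace a sensitive value with a stable, non-reversible token."""
    return hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()[:16]

tok = tokenize("4111-1111-1111-1111")
assert tok == tokenize("4111-1111-1111-1111")  # deterministic: joins still work
assert tok != "4111-1111-1111-1111"            # original value never stored
print(len(tok))  # 16
```

Real tokenization products add key management, format preservation, and detokenization services on top of this basic determinism property.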

Posted 1 month ago

Apply

7.0 - 8.0 years

20 - 27 Lacs

Chennai

Work from Office

Critical Skills to Possess:
- 6-8 years of hands-on experience with Microsoft SQL Server in high-volume production environments
- Strong proficiency in RMAN, ASM, and PL/SQL scripting
- Deep experience in SQL Server internals, SSMS, T-SQL, backup/restore strategies, and maintenance plans
- Experience with monitoring tools like OEM, SQL Sentry, SolarWinds DPA, or Nagios/Zabbix
- Comfortable working in Linux/Unix as well as Windows Server environments

Preferred Qualifications:
- Microsoft Certified: Azure Database Administrator Associate
- Familiarity with cloud database platforms (e.g., Azure SQL Database)
- Experience with DevOps, CI/CD pipelines, and Infrastructure-as-Code (IaC) for DB deployment
- Proven ability to work collaboratively across globally distributed teams and support 24x7 production environments
- BS degree in Computer Science or Engineering, or equivalent experience

Posted 1 month ago

Apply

4.0 - 9.0 years

12 - 17 Lacs

Hyderabad

Work from Office

Job Description Some careers shine brighter than others, If youre looking for a career that will help you stand out, join HSBC and fulfil your potential Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further, HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions, We are currently seeking an experienced professional to join our team in the role of Senior Consultant Specialist, In this role, you will: Cassandra Database Administration: Design, configure, and maintain Cassandra clusters deployed in HSBC, Install, configure, and manage multi-node Cassandra clusters both on-premise and cloud, both public and internal Perform database backups, recovery, and restoration activities, Manage database schemas, tables, indexes, and user permissions, Diagnose and troubleshoot production incidents and resolve performance bottlenecks, including slow queries and resource utilization issues, Optimize database configurations, query performance, and data modeling for improved efficiency and scalability, Execute repairs in case of data inconsistency and perform JVM tuning and garbage collection optimization, Estimate storage requirements and collaborate with infrastructure teams to ensure adequate resources are provisioned, Regularly review and update security configurations to address emerging threats and vulnerabilities, Set up database monitoring tools to proactively identify performance issues, errors, and anomalies, Respond to database alerts and take appropriate actions to resolve issues, PostgreSQL Database Administration: Collaborate with teams to integrate PostgreSQL databases into the infrastructure and 
ensure their optimal performance. Perform PostgreSQL database backups, recovery, and restoration activities. Optimize PostgreSQL configurations and query performance for transactional workloads. Design and implement PostgreSQL data models and SQL scripts for transactional workloads. Manage PostgreSQL databases in GCP environments (preferred). Diagnose and troubleshoot PostgreSQL production incidents and resolve performance bottlenecks.

Requirements: To be successful in this role, you should meet the following requirements:

Cassandra:
- 5+ years of experience in Cassandra database administration
- Expertise in installing, configuring, and monitoring Cassandra clusters
- Experience in creating column families, bootstrapping, decommissioning, removing, replacing, and repairing nodes
- Proficiency in creating keyspaces, tables, and secondary indexes in Cassandra
- Strong experience in performance tuning Apache Cassandra clusters to optimize writes and reads
- Hands-on experience with Cassandra data modeling and CQL scripting
- Proven ability to evaluate database performance, isolate bottlenecks, and implement optimizations in transactional settings
- Experience in managing and monitoring 24x7 production database environments

PostgreSQL:
- 5+ years of experience in PostgreSQL database administration
- Expertise in installing, configuring, and monitoring PostgreSQL databases
- Proficiency in PostgreSQL data modeling and SQL scripting
- Strong experience in performance tuning PostgreSQL databases to optimize writes and reads
- Hands-on experience with PostgreSQL in GCP environments (preferred)
- Proven ability to evaluate database performance, isolate bottlenecks, and implement optimizations in transactional settings

Preferred Skills:
- Experience with DataStax Enterprise (DSE)
- Familiarity with cloud environments like GCP
- Strong troubleshooting and analytical skills

You'll achieve more when you join HSBC. hsbc /careers

HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by HSBC Software Development India
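The Cassandra requirements above (keyspaces, tables, secondary indexes, CQL scripting) can be illustrated with a minimal CQL sketch; the keyspace, table, and column names are invented for illustration:

```sql
-- Hypothetical keyspace with NetworkTopologyStrategy replication
CREATE KEYSPACE IF NOT EXISTS orders_ks
  WITH replication = {'class': 'NetworkTopologyStrategy', 'dc1': 3};

-- Partition key chosen to spread writes; clustering column orders reads
CREATE TABLE IF NOT EXISTS orders_ks.orders_by_customer (
  customer_id uuid,
  order_ts    timestamp,
  order_id    uuid,
  total       decimal,
  PRIMARY KEY ((customer_id), order_ts)
) WITH CLUSTERING ORDER BY (order_ts DESC);

-- Secondary index: useful for occasional lookups, but apply with care
CREATE INDEX IF NOT EXISTS ON orders_ks.orders_by_customer (order_id);
```

The partition/clustering key choice shown here is the core of the data-modeling work the posting describes: queries should be served by a single partition wherever possible.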

Posted 1 month ago

Apply

3.0 - 7.0 years

8 - 13 Lacs

Hyderabad

Work from Office

Job Description: Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Consultant Specialist. In this role, you will:

- Design and develop scalable data pipelines: architect and implement end-to-end data workflows using Apache Airflow for orchestration, integrating multiple data sources and sinks across cloud and on-prem environments
- BigQuery data modeling and optimization: build and optimize data models in Google BigQuery for performance and cost-efficiency, including partitioning, clustering, and materialized views to support analytics and reporting use cases
- ETL/ELT development and maintenance: design robust ETL/ELT pipelines to extract, transform, and load structured and semi-structured data, ensuring data quality, reliability, and availability
- Cloud-native engineering on GCP: leverage GCP services like Cloud Storage, Pub/Sub, Dataflow, and Cloud Functions to build resilient, event-driven data workflows
- CI/CD and automation: implement CI/CD for data pipelines using tools like Cloud Composer (managed Airflow), Git, and Terraform, ensuring automated deployment and versioning of workflows
- Data governance and security: ensure proper data classification, access control, and audit logging within GCP, adhering to data governance and compliance standards
- Monitoring and troubleshooting: build proactive monitoring for pipeline health and data quality using tools such as Cloud Monitoring (formerly Stackdriver) and custom Airflow alerting mechanisms
- Collaboration and stakeholder engagement: work closely with data analysts, data scientists, and business teams to understand requirements and deliver high-quality, timely data products

Requirements: To be successful in this role, you should meet the following requirements:

- Overall 5+ years of experience
- 2+ years of hands-on working experience with GCP BigQuery (mandatory)
- 2+ years of hands-on working experience with Apache Airflow (mandatory)
- 2+ years of hands-on working experience with Python (mandatory)
- 2+ years of hands-on working experience with Linux/Unix (mandatory)
- 2+ years of hands-on working experience with PL/SQL scripting (mandatory)
- 2+ years of hands-on working experience with ETL tools (mandatory): DataStage / Informatica / Prophecy
- GCP Associate Cloud Engineer (ACE) certification is an added advantage
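The BigQuery modeling work described above (partitioning and clustering for performance and cost-efficiency) can be sketched as DDL; the dataset, table, and column names below are invented for illustration:

```sql
-- Hypothetical event fact table: date partitioning enables partition
-- pruning, and clustering co-locates rows commonly filtered together,
-- so queries scan (and bill for) less data.
CREATE TABLE IF NOT EXISTS analytics.events (
  event_ts    TIMESTAMP,
  customer_id STRING,
  event_type  STRING,
  payload     JSON
)
PARTITION BY DATE(event_ts)
CLUSTER BY customer_id, event_type
OPTIONS (partition_expiration_days = 400);
```

A query filtering on `DATE(event_ts)` and `customer_id` would then touch only the matching partitions and clustered blocks rather than the full table.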

Posted 1 month ago

Apply

10.0 - 15.0 years

12 - 20 Lacs

Pune

Hybrid

Database Developer
Company: Kiya.ai
Work Location: Pune
Work Mode: Hybrid

JD:
- Strong knowledge of and hands-on development experience in Oracle PL/SQL
- Strong knowledge of and hands-on development experience with SQL analytic functions
- Experience developing complex, numerically intense business logic
- Good knowledge of and experience in database performance tuning
- Fluency in UNIX scripting

Good-to-have:
- Knowledge of/experience in any of Python, Hadoop/Hive/Impala, horizontally scalable databases, columnar databases
- Oracle certifications
- Any of DevOps tools/techniques: CI/CD, Jenkins/GitLab, source control/git, deployment automation such as Liquibase
- Experience with production issues/deployments

**Interested candidates drop your resume to saarumathi.r@kiya.ai**
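The SQL analytic (window) function skills called for above can be sketched in a small runnable example. SQLite stands in for Oracle here, and the table and column names are made up; the same `SUM() OVER` and `RANK() OVER` patterns apply in Oracle SQL:

```python
import sqlite3

# Minimal sketch of analytic-function work: a per-account running total
# and a per-account rank, computed in one pass with window functions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE trades (account TEXT, trade_dt TEXT, amount REAL);
    INSERT INTO trades VALUES
      ('A', '2024-01-01', 100.0),
      ('A', '2024-01-02', 250.0),
      ('B', '2024-01-01',  75.0),
      ('B', '2024-01-03', 125.0);
""")

rows = conn.execute("""
    SELECT account, trade_dt, amount,
           SUM(amount) OVER (PARTITION BY account ORDER BY trade_dt)   AS running_total,
           RANK()      OVER (PARTITION BY account ORDER BY amount DESC) AS amt_rank
    FROM trades
    ORDER BY account, trade_dt
""").fetchall()

for r in rows:
    print(r)  # e.g. ('A', '2024-01-02', 250.0, 350.0, 1)
```

Computing both aggregates in a single windowed query avoids the self-joins that correlated subquery approaches would need on large tables.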

Posted 1 month ago

Apply

3.0 - 4.0 years

3 - 6 Lacs

Bengaluru

Work from Office

Experience: 3+ years in SQL development
Availability: Weekdays during daytime hours
Work Mode: Remote (work from home)
Skills: Complex query writing, performance tuning, large dataset handling, business logic understanding.
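The performance-tuning skill listed above typically means checking how the engine executes a query and adding indexes where it falls back to full scans. A minimal runnable sketch, using SQLite and an invented schema:

```python
import sqlite3

# Compare the query plan for the same lookup before and after indexing.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(1000)])

query = "SELECT COUNT(*) FROM orders WHERE customer_id = ?"

plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

print("before:", plan_before[-1][-1])  # a full table scan
print("after: ", plan_after[-1][-1])   # a search using the new index
```

The plan's last column is a human-readable detail string; the shift from a scan to an index search is what tuning on large datasets is usually chasing.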

Posted 1 month ago

Apply

10.0 - 17.0 years

20 - 30 Lacs

Pune

Hybrid

We have an opening for a "Principal IT Engineer Applications - Data Engineer" role with one of the top US product-based MNCs, Pune.
Total experience: 9-12 years; Notice period: up to 60 days
Shift timing: 2 PM to 11 PM
Location: Pune, Hinjewadi (Phase 2)

Must-have skills:
- Minimum of 8 years of hands-on experience in software or data engineering roles
- Deep understanding of data modeling and data architecture design; must be a well-rounded technical expert
- Strong expertise in SQL scripting and performance tuning within large-scale data environments
- Proven experience working with high-volume data systems
- Demonstrated ability to solve complex problems using Informatica
- Excellent communication skills with the ability to clearly articulate and explain technical scenarios
- Experience in building data products or solutions from the ground up
- Proficient in Python, shell scripting, and PL/SQL, with a strong focus on automation
- Must have strong hands-on experience; not seeking candidates focused on team management or leadership roles

Good-to-have skills:
- Experience supporting Medicare, Medicaid, or other regulatory healthcare data platforms
- Certifications in Informatica Cloud, Oracle, Databricks, and cloud platforms (e.g., Azure, AWS)

Kindly mail your CVs to silpa.pa@peoplefy.com

Posted 1 month ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Chennai

Work from Office

Job Summary: This position holder is responsible for leading the development, implementation, and maintenance of the organization's BI reporting. This includes designing and building BI dashboards, reports, and visualizations to provide actionable insights to business stakeholders. The BI Lead will work closely with cross-functional teams to gather requirements, understand business needs, and ensure data accuracy and consistency across the organization.

Job Location: Chennai
Qualification: Bachelor's degree in Computer Science, Information Systems, or Statistics.
Experience Required: Proven experience as a Power BI expert. Minimum 2-4 years of relevant work experience in Power BI, DAX, and SQL. Strong stakeholder management and understanding of BI needs.
Skills Required: Good problem-solving and presentation skills. Effective communication skills.

Roles & Responsibilities:
- Translate business needs into technical specifications
- Design, build, and deploy Power BI solutions
- Maintain and support data analytics platforms
- Develop and execute database queries
- Working on finance projects will be an added advantage
- Lead collaboration with the business to understand, define, and develop BI and data solutions
- Develop reports using Power BI, Power Pivot, and SSRS
- Implement drill-through in Power BI reports
- Add data feeds and schedule jobs
- Build reporting dashboards in SSRS
- Configure scheduled automatic refresh in the Power BI service
- Write calculated columns and measure queries in Power BI Desktop to support data analysis
- Create stored procedures and SQL queries to pull data into the Power Pivot model
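The DAX measure-writing work mentioned above can be sketched briefly; the table and column names (FactSales, DimDate) are invented for illustration:

```dax
-- Hypothetical measures over an invented star schema
Total Sales = SUM ( FactSales[Amount] )

Sales YTD =
CALCULATE ( [Total Sales], DATESYTD ( DimDate[Date] ) )

Sales vs Prior Year % =
VAR Prior = CALCULATE ( [Total Sales], SAMEPERIODLASTYEAR ( DimDate[Date] ) )
RETURN DIVIDE ( [Total Sales] - Prior, Prior )
```

Measures like these, rather than calculated columns, are generally preferred for aggregations because they evaluate in the filter context of each visual.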

Posted 1 month ago

Apply

1.0 - 4.0 years

1 - 5 Lacs

Bengaluru

Work from Office

Role Overview:
- Interact with Skyhigh Security Cloud customers via phone/email and provide the highest level of urgency to resolve customer issues in a timely manner.
- Provide remote support on our product and resolve product-related issues during the POC and post-deployment phases, through research and troubleshooting.
- Debug system-level problems in a multi-vendor, multi-protocol network environment with high-level technical expertise on complex issues.
- Work closely with Support Escalation engineers to resolve critical issues.
- Evaluate the scope for timely escalation and ensure that critical problems are addressed as per priority.
- Document all technical issues, analysis, and communication with the customer, and ensure that the documentation is crystal clear with action items and CTAs.

To fly high in this role, you have:
- A bachelor's degree in computer science or information technology with 1-4 years of technical support experience in a large enterprise organization
- An excellent understanding of technical support processes and customer management
- Superior multi-tasking skills with a strong ability to work well under pressure
- A solid grasp of TCP/IP, HTTPS, SSO/SAML, and SaaS concepts
- Excellent in-depth knowledge of networking and security concepts
- A strong demonstrated detail-orientation toward quality, results, and goal achievement
- An excellent understanding of the OSI model and the TCP/IP protocol suite (IP, ICMP, TCP, UDP, SNMP, FTP, TFTP, SMTP)
- An excellent understanding of application layer protocols (HTTP/HTTPS, SSL), PKI, network security (firewalls/proxies), and SIEMs
- Demonstrated experience in packet capture/analysis with tcpdump and Wireshark
- Demonstrated experience in reading and analyzing log files

It would be great if you also have:
- Industry-relevant certifications such as AWS, Azure, GCP (Google Cloud Platform), CISSP, Cisco, CEH (Certified Ethical Hacker)
- Knowledge of Java and SQL; scripting language (e.g., Perl, Python) experience
- An understanding of cloud platforms like Azure, the O365 suite, AWS (Amazon Web Services), Salesforce
- A good understanding of Linux/Unix

Company Benefits and Perks: We believe that the best solutions are developed by teams who embrace each other's unique experiences, skills, and abilities. We work hard to create a dynamic workforce where we encourage everyone to bring their authentic selves to work every day. We offer a variety of social programs, flexible work hours and family-friendly benefits to all of our employees: Retirement Plans; Medical, Dental and Vision Coverage; Paid Time Off; Paid Parental Leave; Support for Community Involvement

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

Join us as a Senior ETL Test Engineer at Barclays, responsible for supporting the successful delivery of location strategy projects to plan, budget, agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence. You will harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences.

To be successful as a Senior ETL Test Engineer, you should have expertise in an ETL tool, e.g. Informatica, and experience with the following:
- Develop and execute ETL test cases, test scripts, and data validation scenarios
- Validate data extraction, transformation, and loading, along with data completeness
- Develop and implement ETL test automation scripts using Python, SQL scripting, QuerySurge, Unix, and shell scripts
- Automate comparison, schema validation, and regression testing
- Integrate test automation with the CI/CD pipeline (Jenkins, GitLab, DevOps)
- Optimize and maintain the automated test suite for scalability and performance
- Understand requirements and user stories, and relate them to the design document
- Work closely with business analysts and the Dev team to define the test scope
- Maintain the test plan, test data, and automation in version control
- Document best practices, lessons learned, and continuous improvement strategies
- Identify and log defects via JIRA and defect management
- Work with business analysts and developers to troubleshoot data issues and pipeline failures
- Provide detailed reports on test execution, coverage, and defect analysis
- Understand agile development/test methodology and practice it in day-to-day work
- Unearth gaps between business requirements and user stories
- Ensure the ETL process adheres to data privacy and compliance requirements
- Validate data masking, encryption, and access control
- Perform audit and data reconciliation testing to track data modification
Some other highly valued skills may include earlier experience in coding with an engineering background, a detailed understanding of cloud technology, viz. AWS and Confluent Kafka, and hands-on experience in BDD/TDD. You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based out of Pune.

Purpose of the role: To design, develop, and execute testing strategies to validate functionality, performance, and user experience, while collaborating with cross-functional teams to identify and resolve defects, and continuously improve testing processes and methodologies, to ensure software quality and reliability.

Accountabilities:
- Development and implementation of comprehensive test plans and strategies to validate software functionality and ensure compliance with established quality standards
- Creation and execution of automated test scripts, leveraging testing frameworks and tools to facilitate early detection of defects and quality issues
- Collaboration with cross-functional teams to analyze requirements, participate in design discussions, and contribute to the development of acceptance criteria, ensuring a thorough understanding of the software being tested
- Root cause analysis for identified defects, working closely with developers to provide detailed information and support defect resolution
- Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing
- Staying informed of industry technology trends and innovations, and actively contributing to the organization's technology communities to foster a culture of technical excellence and growth

Analyst Expectations: To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement.
Requires in-depth technical knowledge and experience in their assigned area of expertise, and a thorough understanding of the underlying principles and concepts within that area. They lead and supervise a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviors to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviors are: L - Listen and be authentic, E - Energize and inspire, A - Align across the enterprise, D - Develop others. For an individual contributor, they develop technical expertise in the work area, acting as an advisor where appropriate, and will have an impact on the work of related teams within the area. Partner with other functions and business areas. Take responsibility for the end results of a team's operational processing and activities. Escalate breaches of policies/procedures appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision-making within own area of expertise. Take ownership of managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulations and codes of conduct. Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organization's products, services, and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organization's sub-function. Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. Guide and persuade team members and communicate complex/sensitive information. Act as the contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organization. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship (our moral compass, helping us do what we believe is right), and to demonstrate the Barclays Mindset to Empower, Challenge, and Drive (the operating manual for how we behave).
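The data-completeness validation described in this posting can be sketched in a short runnable example: compare row counts and a content fingerprint between a source and a target table. SQLite and the trivial copy step stand in for a real ETL pipeline, and all names are invented:

```python
import sqlite3
import hashlib

# Sketch of source-to-target ETL validation: row counts plus a
# deterministic content checksum over both tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INT, name TEXT);
    INSERT INTO src VALUES (1, 'alice'), (2, 'bob'), (3, 'carol');
    CREATE TABLE tgt AS SELECT * FROM src;   -- the 'load' under test
""")

def table_fingerprint(table):
    # Hash rows fetched in a deterministic order so equal content
    # always yields an equal digest.
    rows = conn.execute(f"SELECT * FROM {table} ORDER BY id").fetchall()
    return len(rows), hashlib.sha256(repr(rows).encode()).hexdigest()

src_count, src_hash = table_fingerprint("src")
tgt_count, tgt_hash = table_fingerprint("tgt")

assert src_count == tgt_count, "row-count mismatch"
assert src_hash == tgt_hash, "content mismatch"
print("completeness check passed:", src_count, "rows")
```

In a real suite the two connections would point at different systems, and mismatches would be logged as defects rather than raised as assertions.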

Posted 1 month ago

Apply

0.0 - 3.0 years

0 Lacs

kochi, kerala

On-site

You will be responsible for the functions, activities, and skills required for the analysis, design, coding, integration, testing, and maintenance of Intelligent Document Processing modules and systems. Your role will involve building NLP-based solutions for query and document analysis, processing, information extraction, and document classification, as well as context-based information retrieval. Additionally, you will conduct research to advance state-of-the-art deep learning and NLP technologies. As an AI Intern at Xerox Holdings Corporation, you will work in Kochi, India, in a hybrid work mode with timings from 10 AM to 7 PM (IST). The ideal candidate for this role should have 6 months to 1 year of experience and hold a qualification of B.Tech/MCA/BCA. In this role, you will need proficiency and experience in the technical area of Intelligent Document Processing, including digitization, OCR/ICR/OMR, LLMs, classification methodologies, data extraction methodologies, ML, AI, NLP, and more. Experience in designing and developing highly scalable templates and training documents on IDP for efficient data extraction from semi-structured or unstructured PDFs or images is essential. Additionally, experience with Docker, Flask APIs, Redis, and Celery is required. You will work closely with Solution Architects/Team Leads, prepare technical design documents, and implement automated deployment. Understanding and practicing agile methodologies, working as part of the Software Development Lifecycle (SDLC) using code management and release tools, and proficiency in working with relational databases and SQL scripting are key aspects of this role. A clear understanding of architecture and infrastructure requirements and setup is necessary. Experience or proficiency in .NET (C#, VB, C++, Java) or Python development languages would be useful.
This position offers an opportunity to contribute to the advancement of digital transformation, augmented reality, robotic process automation, additive manufacturing, Industrial Internet of Things, and cleantech at Xerox Holdings Corporation. If you are passionate about innovation and have the required technical skills and experience, we encourage you to apply for this exciting role. Learn more about Xerox at www.xerox.com and explore our commitment to diversity and inclusion.,

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies