Bilvantis

Bilvantis is a fintech company specializing in customizable digital payment solutions and financial technology services for various industries.

4 Job openings at Bilvantis
Data Engineering (Snowflake, DBT & ADF) Hyderabad 7 - 12 years INR 18.0 - 27.5 Lacs P.A. Work from Office Full Time

Snowflake Data Engineering (Snowflake, DBT & ADF) Lead Programmer Analyst (Experience: 7 to 12 years). We are looking for a highly self-motivated individual for the Snowflake Data Engineering (Snowflake, DBT & ADF) Lead Programmer Analyst role.
- At least 5 years of experience designing and developing data pipelines and assets.
- At least 5 years of experience with a columnar MPP cloud data warehouse (Snowflake, Azure Synapse, or Redshift).
- 4 years of experience with ETL tools such as Azure Data Factory, Fivetran, or DBT.
- Experience with Git and Azure DevOps.
- Experience with Agile, Jira, and Confluence.
- Solid understanding of programming SQL objects (procedures, triggers, views, functions) in SQL Server; experience optimizing SQL queries is a plus.
- Working knowledge of Azure architecture and Data Lake.
- Willingness to contribute to documentation (e.g., mapping, defect logs), and to generate functional specs for code migration or ask the right questions to do so.
- Hands-on programmer with a thorough understanding of performance tuning techniques.
- Experience handling large data volume transformations (on the order of 100 GB monthly).
- Able to create solutions and data flows to suit requirements, and to produce timely documentation (e.g., mapping, UTR, defect/KEDB logs).
- Self-starter and learner, able to understand and probe for requirements.
- Technology experience: primary - Snowflake and DBT (development and testing); secondary - Python and ETL or any other data processing tool (see the sketch after this posting).
- Nice to have: domain experience in Healthcare.
- Good oral and written communication; a good team player; proactive and adaptive.
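As a rough, hedged illustration of the secondary Python plus Snowflake skill set listed above, the sketch below runs a small transformation through the snowflake-connector-python package. The account, credentials, warehouse, and table names are placeholders invented for the example, not details taken from the posting.

    import os
    import snowflake.connector

    # Connection details are placeholders; real values would come from a secrets store.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORM_WH",
        database="ANALYTICS",
        schema="STAGING",
    )

    try:
        cur = conn.cursor()
        # A simple incremental-style load: copy yesterday's orders into a reporting table.
        cur.execute(
            """
            INSERT INTO REPORTING.DAILY_ORDERS (ORDER_ID, ORDER_DATE, AMOUNT)
            SELECT ORDER_ID, ORDER_DATE, AMOUNT
            FROM RAW.ORDERS
            WHERE ORDER_DATE = DATEADD(day, -1, CURRENT_DATE())
            """
        )
        print(f"Rows inserted: {cur.rowcount}")
    finally:
        conn.close()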

Lead Programmer Analyst/Technical Lead Hyderabad, Telangana 5 - 9 years INR Not disclosed On-site Full Time

We are looking for a highly self-motivated individual with VisionPlus (V+) FAS experience as a Technical Lead/Lead Programmer Analyst. You should have a minimum of 5 years of hands-on experience in V+ FAS, along with strong Mainframe development skills in COBOL, JCL, VSAM, and CICS. Your responsibilities will include leading and guiding a team of 4 to 5 members, so team-leading capabilities are essential. Knowledge of REXX, Easytrieve, DB2, V+ ITS, and Authorization & Interchange functionalities, as well as an understanding of the SDLC and Agile methodologies, is necessary. Communication skills are crucial, as you will be interacting with customers and providing daily status reports, so you should have good oral and written communication abilities. Strong analytical and problem-solving skills are a must for this role. As a valuable team member, you should be proactive, adaptive, and willing to take on leadership roles when necessary. Flexibility to work in shifts is also required.

GCP Data Engineering Hyderabad, Pune 5 - 10 years INR 18.0 - 27.5 Lacs P.A. Work from Office Full Time

Job Description: GCP Data Engineer. We are looking for a highly skilled and experienced GCP Data Engineer as a Lead Programmer Analyst.
- 4 to 7 years of data engineering experience.
- Strong development skills in GCP services such as BigQuery, DataProc, Dataflow, and Dataform.
- Good knowledge of SQL, Python, and PySpark.
- Good experience with Airflow, Git, and CI/CD pipelines (see the sketch after this posting).
- Good to have: knowledge of Cloud SQL and Dataplex.
- Understanding of the SDLC and Agile methodologies.
- Communication with the customer and production of the daily status report.
- Good oral and written communication; strong communication, analytical, and problem-solving skills; a good team player; proactive and adaptive.
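As a hedged sketch of the Airflow plus BigQuery experience described above, the snippet below wires a single BigQuery query into a daily Airflow DAG using the google-cloud-bigquery client. The project, dataset, and table names are made up for illustration and are not taken from the posting.

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from google.cloud import bigquery

    def load_daily_events():
        # Placeholder project/dataset names; credentials come from the runtime environment.
        client = bigquery.Client(project="example-analytics-project")
        query = """
            SELECT event_date, COUNT(*) AS event_count
            FROM `example-analytics-project.raw.events`
            WHERE event_date = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
            GROUP BY event_date
        """
        for row in client.query(query).result():
            print(f"{row.event_date}: {row.event_count} events")

    with DAG(
        dag_id="daily_events_summary",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(task_id="load_daily_events", python_callable=load_daily_events)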

Data Engineering (Snowflake, DBT & ADF) Hyderabad 5 - 10 years INR 18.0 - 27.5 Lacs P.A. Work from Office Full Time

Snowflake Data Engineering (Snowflake, DBT) (Experience: 4 to 10 years). We are looking for a highly self-motivated individual for the Snowflake Data Engineering (Snowflake, DBT) role.
- At least 5 years of experience designing and developing data pipelines and assets.
- At least 5 years of experience with a columnar MPP cloud data warehouse (Snowflake, Azure Synapse, or Redshift).
- 3 years of experience with ETL tools such as Azure Data Factory, Fivetran, or DBT.
- Experience with Git and Azure DevOps.
- Experience with Agile, Jira, and Confluence.
- Solid understanding of programming SQL objects (procedures, triggers, views, functions) in SQL Server; experience optimizing SQL queries is a plus.
- Working knowledge of Azure architecture and Data Lake.
- Willingness to contribute to documentation (e.g., mapping, defect logs), and to generate functional specs for code migration or ask the right questions to do so.
- Hands-on programmer with a thorough understanding of performance tuning techniques.
- Experience handling large data volume transformations (on the order of 100 GB monthly).
- Able to create solutions and data flows to suit requirements, and to produce timely documentation (e.g., mapping, UTR, defect/KEDB logs).
- Self-starter and learner, able to understand and probe for requirements.
- Technology experience: primary - Snowflake and DBT (development and testing; see the sketch after this posting); secondary - Python and ETL or any other data processing tool.
- Nice to have: domain experience in Healthcare.
- Good oral and written communication; a good team player; proactive and adaptive.
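As a hedged illustration of the primary DBT development-and-testing skill above, the sketch below invokes dbt programmatically from Python. It assumes dbt-core 1.5 or later (which exposes dbtRunner) and a hypothetical project containing a staging model named stg_orders.

    # Assumes dbt-core >= 1.5, which exposes a programmatic runner.
    from dbt.cli.main import dbtRunner, dbtRunnerResult

    dbt = dbtRunner()

    # Build and then test a single (hypothetical) staging model.
    for args in (["run", "--select", "stg_orders"], ["test", "--select", "stg_orders"]):
        result: dbtRunnerResult = dbt.invoke(args)
        if not result.success:
            raise RuntimeError(f"dbt {' '.join(args)} failed")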
