Jobs
Interviews

10 DLT Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

As a Data Engineer, you will lead the design, development, and implementation of scalable data pipelines and ELT processes using tools such as Databricks, DLT, dbt, and Airflow. Your role will involve collaborating with stakeholders to understand data requirements, optimizing existing data pipelines, and ensuring data quality, reliability, and performance. You will also develop and enforce data engineering best practices, mentor junior data engineers, and stay current with industry trends to recommend improvements.

To qualify, you should have a Bachelor's degree in Computer Science, Information Technology, Management Information Systems (MIS), Data Science, or a related field, along with at least 7 years of experience in data engineering and/or architecture with a focus on big data technologies. You must have extensive production experience with Databricks, Apache Spark, and related technologies, as well as familiarity with orchestration and ELT tools such as Airflow and dbt. Strong SQL knowledge, proficiency in programming languages such as Python, Scala, or Java, and a solid understanding of data warehousing concepts are essential. Experience with cloud platforms such as Azure, AWS, or Google Cloud, excellent problem-solving skills, and the ability to work in a fast-paced, collaborative environment are required, along with strong communication and leadership skills to mentor and guide team members.

Preferred qualifications include experience with machine learning and data science workflows, knowledge of data governance and security best practices, and certifications in Databricks, Azure, Google Cloud, or related technologies. If you meet the experience requirements, have good communication skills, and have experience in PySpark, we encourage you to apply.
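To give a flavor of the pipeline work this role describes: Databricks DLT lets you declare tables with data-quality expectations that drop or quarantine bad rows. Since real DLT code (`@dlt.table`, `@dlt.expect_or_drop`) runs only inside a Databricks pipeline, here is a hedged plain-Python sketch of the same expect-or-drop idea; all names and data are illustrative.

```python
# Minimal sketch of a DLT-style "expect or drop" data-quality gate.
# Illustrative only: real Databricks DLT applies expectations to Spark
# DataFrames via decorators; here plain dicts stand in for rows.

def expect_or_drop(rows, predicate):
    """Keep rows passing the expectation; count the rest, like DLT's metrics."""
    kept = [r for r in rows if predicate(r)]
    dropped = len(rows) - len(kept)
    return kept, dropped

raw_orders = [
    {"order_id": 1, "amount": 120.0},
    {"order_id": 2, "amount": -5.0},   # violates the expectation, gets dropped
    {"order_id": 3, "amount": 40.0},
]

# Expectation: order amounts must be non-negative.
clean_orders, dropped = expect_or_drop(raw_orders, lambda r: r["amount"] >= 0)
```

In DLT proper, the dropped-row count would surface in the pipeline's quality metrics rather than a return value.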

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

The ideal candidate should have a minimum of 3 years of experience in Data Engineering, with proven hands-on experience in ETL pipelines and end-to-end ownership of them. Deep expertise in AWS resources such as EC2, Athena, Lambda, and Step Functions is critical to the role, and proficiency in MySQL is non-negotiable. Experience with Docker, including setup, deployment, and troubleshooting, is also required. Experience with Airflow or any modern orchestration tool, PySpark, the Python ecosystem, SQLAlchemy, DuckDB, PyArrow, Pandas, NumPy, and DLT (Data Load Tool) would be considered advantageous.

The successful candidate should be a proactive builder, capable of working independently while maintaining effective communication. Thriving in fast-paced startup environments, you should prioritize ownership and impact over just writing code. Please include the code word "Red Panda" in your application to indicate that you have carefully reviewed this section.

In this role, you will architect, build, and optimize robust data pipelines and workflows, and take ownership of configuring, optimizing, and troubleshooting AWS resources. Collaboration with product and engineering teams will be crucial to delivering quick business impact. The emphasis will be on automating and scaling data processes to eliminate manual work, laying the foundation for informed business decisions. Only serious and relevant applicants will be considered for this position.
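The end-to-end ETL ownership this listing asks for boils down to extract, transform, and load stages you fully control. Below is a hedged sketch of that pattern; the stdlib sqlite3 module stands in for the MySQL/Athena targets named in the ad, and the table, columns, and data are all illustrative.

```python
# Hedged sketch of an end-to-end ETL step (extract -> transform -> load).
# sqlite3 (stdlib) stands in for a production database; schema is illustrative.
import sqlite3

def run_etl(source_rows):
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, revenue REAL)")
    # Transform: normalize region names and drop malformed rows.
    cleaned = [
        (r["region"].strip().lower(), float(r["revenue"]))
        for r in source_rows
        if r.get("region") and r.get("revenue") is not None
    ]
    # Load, then query the result: total revenue per region.
    conn.executemany("INSERT INTO sales VALUES (?, ?)", cleaned)
    return dict(conn.execute(
        "SELECT region, SUM(revenue) FROM sales GROUP BY region"))

totals = run_etl([
    {"region": " East ", "revenue": 100},
    {"region": "east", "revenue": 50},
    {"region": None, "revenue": 10},   # malformed, dropped in transform
])
```

In an AWS deployment of the kind the ad describes, the same stages would typically be split across Lambda functions and sequenced with Step Functions or Airflow.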

Posted 3 weeks ago

Apply

7.0 - 12.0 years

6 - 16 Lacs

Pune, Bengaluru

Work from Office

Job Description: Python, SQL, ADF, Databricks (PySpark, DLT, Unity Catalog, performance tuning, cost optimization), along with leadership skills. Databricks Associate or Professional certification.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 10 Lacs

Bengaluru, Karnataka, India

On-site

SW integration for Advanced Driver Assistance Systems, including BSW libraries, application software, and the AUTOSAR software stack. Build environment setup and maintenance. CI/CD pipeline operational maintenance. Debugging integration issues, performing worst-case analysis and performance evaluation. Executing the standard tests provided (networking, security, diagnostics, ...), analyzing the results, and addressing bugs with internal and external development partners. Implementing and executing test cases, analyzing the test results, and entering corresponding bugs into the bug tracking system. Coordinating with internal and external development teams and actively driving bug resolution.

Must-have experience: Deep expertise in CI topics, e.g. CI jobs, debugging, and code dump analysis. Expert hands-on experience with Bazel and at least one CI framework. Ansible, Python scripting, and test automation. Experience with automotive Diagnostic Log and Trace (DLT) and automotive diagnosis. Experience in deploying automotive software on ECUs via industry-standard tooling. Experience with the Adaptive/Classic AUTOSAR framework and IPC technologies. Experience with POSIX OS as well as Linux/QNX platforms. Expert knowledge of the AAS standard and overall vehicle EE system integration.

Posted 3 weeks ago

Apply

6.0 - 8.0 years

6 - 12 Lacs

Gurgaon, Haryana, India

On-site

About us - Coders Brain is a global leader in services, digital, and business solutions, partnering with clients to simplify, strengthen, and transform their businesses. We ensure the highest levels of certainty and satisfaction through a deep-set commitment to our clients, comprehensive industry expertise, and a global network of innovation and delivery centers. We achieved our success because of how successfully we integrate with our clients.

Quick Implementation - We offer quick implementation for newly onboarding clients.
Experienced Team - We've built an elite and diverse team that brings its unique blend of talent, expertise, and experience to make you more successful, ensuring our services are uniquely customized to your specific needs.
One Stop Solution - Coders Brain provides end-to-end solutions for businesses at an affordable price with uninterrupted and effortless services.
Ease of Use - All of our products are user-friendly and scalable across multiple platforms. Our dedicated team at Coders Brain implements with the interests of enterprises and users in mind.
Secure - We understand and treat your security with the utmost importance, blending security and scalability in our implementation with the long-term impact on business benefit in mind.

Exp: 6+ Yrs
Role: Senior Data Engineer
Location: Gurgaon
Permanent: Codersbrain Technology Pvt Ltd
Client: EMB Global

Job Description - Key Skills & Expertise:
- Cloud Platforms: Azure / AWS with strong experience in Databricks.
- Programming & Big Data: Proficiency in Python, Scala, Spark, PySpark.
- Databases & Querying: Advanced expertise in SQL along with exposure to Hive, HBase, Impala, Parquet.
- Workflow Orchestration: Hands-on experience with Apache Airflow.
- Streaming & Messaging Systems: Exposure to Kafka for real-time data processing.
- CI/CD & DevOps: Practical experience in Jenkins / Bamboo, version control (GitHub / Bitbucket), and artifact management (Nexus).
- Delta Live Tables (DLT): Knowledge or experience preferred.

If you're interested, please share the details below:
- Current CTC:
- Expected CTC:
- Current Company:
- Notice Period:
- Current Location:
- Preferred Location:
- Total experience:
- Relevant experience:
- Highest qualification:
- DOJ (if offer in hand from another company):

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

You have 5+ years of experience in designing and building data pipelines using Apache Spark, Databricks, or equivalent big data frameworks. You possess hands-on expertise with streaming and messaging systems like Apache Kafka, Confluent Cloud, RabbitMQ, or Azure Event Hub. Your experience includes creating producers, consumers, and topics, and integrating them into downstream processing.

You have a deep understanding of relational databases and Change Data Capture (CDC). You are proficient in SQL Server, Oracle, or other RDBMSs, and have experience capturing change events using tools like Debezium or native CDC tools and transforming them for downstream consumption. Your proficiency extends to programming languages such as Python, Scala, or Java, along with solid knowledge of SQL for data manipulation and transformation. You also have cloud platform expertise, including experience with Azure or AWS services for data storage, compute, and orchestration (e.g., ADLS, S3, Azure Data Factory, AWS Glue, Airflow, Databricks, DLT).

Furthermore, you have knowledge of data modeling and warehousing, including familiarity with data lakehouse architectures, Delta Lake, partitioning strategies, and performance optimization. You are also well versed in version control and DevOps practices, with experience in Git, CI/CD pipelines, and the ability to automate deployment and manage infrastructure as code.
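The CDC work described above, capturing change events and transforming them for downstream consumption, ultimately means folding a stream of create/update/delete events into target state. Here is a hedged sketch of that step; the event shape loosely follows Debezium's op codes ("c" create, "u" update, "d" delete), but the field names are illustrative, not a real Debezium payload.

```python
# Hedged sketch of applying CDC (change data capture) events downstream.
# "op" codes mimic Debezium's convention; payload shape is simplified.

def apply_cdc(state, events):
    """Fold a stream of change events into a key -> row dict."""
    for ev in events:
        key = ev["key"]
        if ev["op"] in ("c", "u"):       # create/update: upsert the after-image
            state[key] = ev["after"]
        elif ev["op"] == "d":            # delete: remove the row if present
            state.pop(key, None)
    return state

events = [
    {"op": "c", "key": 1, "after": {"id": 1, "status": "new"}},
    {"op": "u", "key": 1, "after": {"id": 1, "status": "shipped"}},
    {"op": "c", "key": 2, "after": {"id": 2, "status": "new"}},
    {"op": "d", "key": 2, "after": None},
]

final = apply_cdc({}, events)
```

In a Spark/Delta Lake pipeline, the same upsert-or-delete logic is usually expressed as a `MERGE INTO` against the target table rather than an in-memory fold.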

Posted 1 month ago

Apply

2.0 - 3.0 years

0 Lacs

Karnataka

On-site

About SimplyFI Softech: SimplyFI Softech is a product-based tech company that specializes in building smart and secure platforms using Blockchain and AI technologies. The company primarily operates in the BFSI (Banking and Financial Services) sector, focusing on simplifying processes such as trade finance and loan origination.

Role Overview: We are seeking a skilled Blockchain Developer with expertise in Hyperledger Fabric. As a Blockchain Developer at SimplyFI, you will build and manage blockchain-based applications tailored for real-world enterprise use.

Responsibilities:
- Building and deploying blockchain networks using Hyperledger Fabric
- Writing smart contracts (chaincode) in Go or Node.js
- Setting up and managing components such as peers, orderers, channels, and Fabric CA
- Developing APIs using the Hyperledger SDK for seamless app integration
- Collaborating with fellow developers, testers, and product teams
- Assisting in troubleshooting, testing, and enhancing system performance
- Documenting work and providing support during deployment

Skills Required:
- Practical experience with Hyperledger Fabric
- Strong understanding of blockchain fundamentals and Distributed Ledger Technology (DLT)
- Proficiency in writing chaincode in Go or Node.js
- Familiarity with Docker, CouchDB, Git, and basic DevOps principles
- Ability to connect backend systems to blockchain through APIs
- Basic knowledge of PKI, MSP, and blockchain identities

Nice to Have:
- Experience with Hyperledger Besu, Indy, or Aries
- Involvement in projects related to finance, supply chain, or trade
- Awareness of CI/CD pipelines and monitoring tools

Qualifications:
- Bachelor's degree in Computer Science, IT, or a related field
- A blockchain-related certification would be advantageous

Experience: 2-3 Years
Location: Bangalore
Company: SimplyFI Softech Pvt. Ltd.
Type: Full-Time
Notice Period: Immediate or up to 15 days

Posted 1 month ago

Apply

12.0 - 14.0 years

12 - 14 Lacs

Hyderabad, Bengaluru

Hybrid

Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 10+ years of overall experience and 8+ years of relevant experience in Databricks, DLT, PySpark, and data modelling concepts, including dimensional data modelling (star schema, snowflake schema). Proficiency in programming languages such as Python, PySpark, Scala, and SQL. Strong understanding of distributed computing principles and experience with big data technologies such as Apache Spark. Experience with cloud platforms such as AWS, Azure, or GCP and their associated data services. Proven track record of delivering scalable and reliable data solutions in a fast-paced environment. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills, with the ability to work effectively in cross-functional teams. Good to have: experience with containerization technologies such as Docker and Kubernetes, and knowledge of DevOps practices for automated deployment and monitoring of data pipelines.
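The dimensional modelling asked for above centers on the star schema: a fact table of measurements joined to dimension tables of descriptive attributes. A hedged sketch of the typical rollup query follows, using the stdlib sqlite3 module with an illustrative two-table schema; in the Databricks context the same SQL would run against Delta tables.

```python
# Hedged sketch of a star-schema query: one fact table joined to a dimension.
# Schema and data are illustrative, using stdlib sqlite3 as the engine.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (product_id INTEGER, qty INTEGER);
    INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
    INSERT INTO fact_sales VALUES (1, 3), (1, 2), (2, 7);
""")

# Typical star-schema rollup: aggregate the fact, sliced by a dimension attribute.
by_category = dict(conn.execute("""
    SELECT d.category, SUM(f.qty)
    FROM fact_sales f
    JOIN dim_product d ON d.product_id = f.product_id
    GROUP BY d.category
"""))
```

A snowflake schema differs only in that the dimensions themselves are further normalized into sub-dimension tables, adding joins to the same query shape.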

Posted 3 months ago

Apply

7.0 - 12.0 years

6 - 9 Lacs

Noida, New Delhi, Gurugram

Work from Office

Role: Company Secretary
Department: CS & Legal
Qualification: LLB & CS
Experience: 8-10 Years
Work Location: Gurugram (On Site)
Preferred candidate: Equity-listed companies/NBFC

Role & responsibilities:
- Ensure compliance with the provisions of company law and the rules made thereunder; thorough compliance with Secretarial Standards for Board/General Meetings.
- Thorough knowledge and experience of Listing and other SEBI-related regulations such as NCS Regulations, PIT, SAST, and ESOP.
- Issuance and listing of Non-Convertible Securities, including Non-Convertible Debentures, Foreign Exchange Bonds, etc.
- Involvement in fundraising from banks/financial institutions and interaction with the finance department for legal documentation.
- Well versed with the NSDL DLT platform and coordination with depositories for updating documents.
- Liaising with MCA authorities, group companies, promoters, statutory and secretarial auditors, and law firms.
- Advising on good governance practices and compliance with Corporate Governance norms.
- Conceptualization, drafting, and finalization of the Annual Report of the Company.
- Advising the Company and subsidiaries on secretarial matters; liaising with subsidiaries for compiling data.
- Responsible for drafting, review, and vetting of all legal agreements of the organization related to vendors, such as agreements, NDAs, client agreements, lease deeds, etc.
- Overseeing routine Registrar and Transfer activities such as transfer, transmission, and issuing duplicate share certificates, and handling problematic cases related to investor grievances.
- Assisting in drafting legal contracts and commercial agreements and ensuring that contracts follow legal, regulatory, RBI, and organizational policies.
This list should not be regarded as exhaustive, and the position holder will be expected to deliver other duties that are relevant and appropriate to this scope.

Preferred candidate profile:
- Well versed in handling MS Excel, MS Office, the MCA Portal, and stock exchanges.
- Excellent interpersonal, relationship-building, and problem-solving skills; a team player and self-motivated person.
- Strong analytical ability, with liaisoning ability with government, ministerial, and legal authorities.
- Excellent communication and writing skills.
- 8-10 years of corporate experience.
Interested candidates can apply on the same or share their updated CV at Pooja.jain@satincreditcare.com

Posted 3 months ago

Apply

0.0 - 3.0 years

3 - 6 Lacs

Mumbai

Work from Office

Listing support for NCDs and issuance; ISIN creation support; DLT support; ongoing SEBI LODR compliance support for listed NCDs; filing of PAS-6.

Posted Date not available

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies