3 Airbyte Jobs

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

Experience: 5-10 years
Salary: 17-30 Lacs
Location: Hyderabad
Work mode: Remote
Source: Naukri

At Mitratech, we are a team of technocrats focused on building world-class products that simplify operations in the Legal, Risk, Compliance, and HR functions of Fortune 100 companies. We are a close-knit, globally dispersed team that thrives in an ecosystem supporting individual excellence, and we take pride in a diverse and inclusive work culture centered on great people practices, learning opportunities, and having fun! Our culture blends entrepreneurial spirit with enterprise investment, enabling us to move at a rapid pace with some of the most complex, leading-edge technologies available. Given our continued growth, we always have room for more intellect, energy, and enthusiasm - join our global team and see why it's so special to be a part of Mitratech!

Job Description

We are seeking a highly motivated and skilled Analytics Engineer to join our dynamic data team. The ideal candidate has a strong background in data engineering and analytics, with hands-on experience in modern analytics tools such as Airbyte, Fivetran, dbt, Snowflake, and Airflow. This role is pivotal in transforming raw data into valuable insights, ensuring data integrity, and optimizing our data infrastructure to support the organization's data platform.

Essential Duties & Responsibilities

- Data Integration and ETL Processes: Design, implement, and manage ETL pipelines using tools like Airbyte and Fivetran to ensure efficient and accurate data flow from various sources into our Snowflake data warehouse. Maintain and optimize existing data integration workflows to improve performance and scalability.
- Data Modeling and Transformation: Develop and maintain data models using dbt / dbt Cloud to transform raw data into structured, high-quality datasets that meet business requirements. Ensure data consistency and integrity across datasets and implement data quality checks (a sketch of such a check follows this listing).
- Data Warehousing: Manage and optimize our Redshift / Snowflake data warehouses, ensuring they meet performance, storage, and security requirements. Implement best practices for data warehouse management, including partitioning, clustering, and indexing.
- Collaboration and Communication: Work closely with data analysts, data scientists, and business stakeholders to understand data requirements and deliver solutions that meet their needs. Communicate complex technical concepts to non-technical stakeholders clearly and concisely.
- Continuous Improvement: Stay current with the latest developments in data engineering and analytics tools, and evaluate their potential to enhance our data infrastructure. Identify and implement opportunities for process improvement, automation, and optimization within the data pipeline.

Requirements & Skills

- Education and Experience: Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field. 3-5 years of experience in data engineering or analytics engineering roles. Experience with AWS and DevOps is a plus.
- Technical Skills: Proficiency with modern ETL tools such as Airbyte and Fivetran. Must have experience with dbt for data modeling and transformation. Extensive experience with Snowflake or similar cloud data warehouses. Solid understanding of SQL and experience writing complex queries for data extraction and manipulation. Familiarity with Python or other programming languages used for data engineering tasks.
- Analytical Skills: Strong problem-solving skills and the ability to troubleshoot data-related issues. Ability to understand business requirements and translate them into technical specifications.
- Soft Skills: Excellent communication and collaboration skills. Strong organizational skills and the ability to manage multiple projects simultaneously. Detail-oriented, with a focus on data quality and accuracy.

We are an equal-opportunity employer that values diversity at all levels. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, national origin, age, sexual orientation, gender identity, disability, or veteran status.
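As a rough illustration of the data quality checks this role describes, here is a minimal Python sketch of a not-null check run against Snowflake with the snowflake-connector-python package. The warehouse, database, schema, table, and column names are hypothetical; in a dbt project this would more typically be expressed as a not_null test in a schema.yml file.

    # Minimal data quality check against Snowflake (illustrative sketch).
    # Connection parameters and object names below are hypothetical.
    import os
    import snowflake.connector

    def check_no_null_keys(table: str, key_column: str) -> bool:
        """Fail the check if any rows have a NULL key column."""
        conn = snowflake.connector.connect(
            account=os.environ["SNOWFLAKE_ACCOUNT"],
            user=os.environ["SNOWFLAKE_USER"],
            password=os.environ["SNOWFLAKE_PASSWORD"],
            warehouse="ANALYTICS_WH",   # hypothetical warehouse
            database="ANALYTICS",       # hypothetical database
            schema="MARTS",             # hypothetical schema
        )
        try:
            cur = conn.cursor()
            cur.execute(f"SELECT COUNT(*) FROM {table} WHERE {key_column} IS NULL")
            null_count = cur.fetchone()[0]
            return null_count == 0
        finally:
            conn.close()

    if __name__ == "__main__":
        ok = check_no_null_keys("dim_customer", "customer_id")  # hypothetical table
        print("data quality check:", "PASS" if ok else "FAIL")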

Posted 2 months ago

Experience: 3-6 years
Salary: 5-8 Lacs
Location: Gurgaon
Work mode: Work from Office
Source: Naukri

We are looking for Data Engineers who like to innovate and tackle complex problems. We recognize that strength comes from diversity and will embrace your unique skills, curiosity, drive, and passion while giving you the opportunity to grow technically and as an individual. Design is an iterative process, whether for UX, services, or infrastructure. Our goal is to modernize and improve application capabilities.

Job Responsibilities

As a Data Engineer, you will join our Data Engineering & Modernization team, transforming our global financial network and improving the data products and services we provide to our internal customers. This team leverages cutting-edge data engineering and modernization techniques to develop scalable solutions for managing data and building data products. In this role, you are expected to:

- Be involved from the inception of projects to understand requirements, then architect, develop, deploy, and maintain data solutions.
- Work in a multi-disciplinary, agile squad, partnering with program and product managers to expand the product offering based on business demands. A focus on speed to market, getting data products and services into the hands of our stakeholders, and a passion to transform the financial industry are key to success in this role.
- Maintain a positive and collaborative working relationship with teams within the NCR Atleos technology organization, as well as with the wider business.
- Apply creative and inventive problem-solving skills to reduce turnaround times; these are required, valued, and a major part of the job.

The ideal candidate would have:

- BA/BS in Computer Science or equivalent practical experience.
- Experience applying machine learning and AI techniques to modernizing data and reporting use cases.
- Overall 3+ years of experience on Data Analytics or Data Warehousing projects.
- At least 2+ years of cloud experience on AWS/Azure/GCP (Azure preferred): Microsoft Azure, ADF, Synapse.
- Programming in Python and PySpark, with experience using pandas, ML libraries, etc.
- Data streaming with Flink or Spark Structured Streaming (a sketch follows this listing).
- Orchestration and transformation frameworks like dbt, ADF, Airflow.
- Open-source data ingestion frameworks like Airbyte and Debezium.
- Experience migrating from traditional on-prem OLTP/OLAP databases to cloud-native DBaaS and/or NoSQL databases like Cassandra, Neo4j, MongoDB, etc.
- Deep expertise operating in a cloud environment and with cloud-native databases like Cosmos DB, Couchbase, etc.
- Proficiency in various data modelling techniques, such as ER, hierarchical, relational, or NoSQL modelling.
- Excellent design, development, and tuning experience with SQL (OLTP and OLAP) and NoSQL databases.
- Experience with modern database DevOps tools like Liquibase, Redgate Flyway, or DBmaestro.
- Deep understanding of data security and compliance, and the related architecture.
- Deep understanding of, and strong administrative experience with, distributed data processing frameworks such as Hadoop and Spark.
- Experience with programming languages like Python, Java, and Scala, and with machine learning libraries.
- Experience with DevOps tools like Git, Maven, Jenkins, GitHub Actions, and Azure DevOps.
- Experience with Agile development concepts and related tools.
- Ability to tune and troubleshoot performance issues across the codebase and database queries.
- Excellent problem-solving skills, with the ability to think critically and creatively to develop innovative data solutions.
- Excellent written and verbal communication skills, with the ability to effectively convey complex technical concepts to a diverse audience.
- Passion for learning and a proactive mindset, with the ability to work independently and collaboratively in a fast-paced, dynamic environment.

Additional Skills:

- Leverage machine learning and AI techniques to operationalize data pipelines and build data products.
- Provide data services using APIs.
- Containerize data products and services using Kubernetes and/or Docker.
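As a rough illustration of the Spark Structured Streaming skill listed above, here is a minimal Python sketch that reads a Kafka topic and writes the parsed records to the console. The broker address and topic name are hypothetical, and the Kafka source requires the spark-sql-kafka connector package on the classpath.

    # Minimal Spark Structured Streaming sketch (illustrative only).
    # Requires the spark-sql-kafka-0-10 connector package.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = (
        SparkSession.builder
        .appName("stream-sketch")
        .getOrCreate()
    )

    # Read a stream of raw events from Kafka.
    events = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
        .option("subscribe", "transactions")               # hypothetical topic
        .load()
    )

    # Kafka values arrive as bytes; cast to string for downstream parsing.
    parsed = events.select(col("value").cast("string").alias("payload"))

    # Write to the console for demonstration; a real pipeline would target
    # a sink such as Delta, a warehouse, or another topic.
    query = (
        parsed.writeStream
        .format("console")
        .outputMode("append")
        .start()
    )
    query.awaitTermination()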

Posted 2 months ago

Experience: 6-11 years
Salary: 25-35 Lacs
Location: Mumbai Suburbs, Thane, Mumbai (All Areas)
Work mode: Work from Office
Source: Naukri

Experience: 6 to 12 years. Experience with the data build tool (dbt) is mandatory. Experience with ETL/ELT processes, data modelling, change data capture (CDC), SQL, Airbyte, Snowflake, and cloud computing is also required.
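Since this role pairs CDC with Snowflake, here is a minimal Python sketch of applying captured changes from a staging table to a target table with a Snowflake MERGE statement. All table, column, and connection names are hypothetical; in practice the staging table would be loaded by a tool such as Airbyte, and the transformation could live in a dbt incremental model instead.

    # Apply CDC changes from a staging table to a target table in Snowflake
    # via MERGE (illustrative sketch; all names below are hypothetical).
    import os
    import snowflake.connector

    MERGE_SQL = """
    MERGE INTO dim_customer AS t
    USING stg_customer_changes AS s
        ON t.customer_id = s.customer_id
    WHEN MATCHED AND s.op = 'D' THEN DELETE
    WHEN MATCHED THEN UPDATE SET t.name = s.name, t.email = s.email
    WHEN NOT MATCHED AND s.op <> 'D' THEN
        INSERT (customer_id, name, email) VALUES (s.customer_id, s.name, s.email)
    """

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",     # hypothetical warehouse
        database="ANALYTICS",    # hypothetical database
        schema="STAGING",        # hypothetical schema
    )
    try:
        conn.cursor().execute(MERGE_SQL)
    finally:
        conn.close()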

Posted 3 months ago