
96 Fivetran Jobs - Page 3

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

2.0 - 7.0 years

10 - 20 Lacs

Bengaluru

Work from Office

Job Title: Data Engineer

Dear Candidates, greetings from ExxonMobil! Please copy and paste the link below into your browser to apply for the position on the company website. Link to apply: https://jobs.exxonmobil.com/job-invite/80614/

What role you will play in our team: Design, build, and maintain data systems, architectures, and pipelines to extract insights and drive business decisions. Collaborate with stakeholders to ensure data quality, integrity, and availability.

What you will do:
- Support development and ownership of ETL pipelines within cloud data platforms
- Automate data extraction and transformation pipelines using Python, Airflow, Azure Data Factory, Qlik, and Fivetran
- Deliver task monitoring and notification for data pipeline status (see the sketch below)
- Support data cleansing, enrichment, and curation activities to enable ongoing business use cases
- Develop and deliver data pipelines through a CI/CD delivery methodology
- Develop monitoring around pipelines to ensure uptime of data flows
- Optimize and refine current queries against Snowflake
- Work with Snowflake, MSSQL, Postgres, Oracle, Azure SQL, and other relational databases, as well as cloud databases such as Azure SQL and Azure PostgreSQL
- Work with Change Data Capture (CDC) ETL software such as Qlik and Fivetran to populate Snowflake
- Identify and remediate failed and long-running queries
- Develop large aggregate queries across a multitude of schemas

About You - Skills and Qualifications:
- Experience with data processing/analytics and ETL data transformation
- Proficient in ingesting data to/from Snowflake and Azure storage accounts
- Proficiency in at least one of the following languages: Python, C#, C++, F#, Java
- Proficiency in SQL and NoSQL databases, including SQL query development and optimization
- Demonstrated experience with Snowflake, Qlik Replicate, Fivetran, and Azure Data Explorer
- Azure cloud experience (ADX, ADF, Databricks)
- Expertise with Airflow, Qlik, Fivetran, and Azure Data Factory
- Management of Snowflake through dbt scripting
- Solid understanding of data strategies, including data management, data curation, and data governance
- Ability to quickly build relationships and credibility with business customers and agile teams
- A passion for learning about and experimenting with new technologies
- Confidence in creating and delivering technical presentations and training
- Excellent organization and planning skills

Preferred Qualifications/Experience: the posting lists the same skills and qualifications as above.

Thanks & Regards, Anita
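
For illustration, a minimal sketch of the kind of Airflow pipeline with failure notification this posting describes; the DAG id, task bodies, and notification hook are hypothetical placeholders, not ExxonMobil's actual code. Note that the `schedule` argument is Airflow 2.4+ (older versions use `schedule_interval`).

```python
# Minimal Airflow sketch: extract -> transform with a failure-notification
# callback. All names (dag_id, tasks, notify_on_failure) are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def notify_on_failure(context):
    # Placeholder: push the failed task id to a chat/alerting system.
    print(f"Pipeline failed in task: {context['task_instance'].task_id}")


def extract():
    print("Pull rows from the source system (e.g., Fivetran-landed tables)")


def transform():
    print("Apply cleansing/enrichment, e.g., SQL against Snowflake")


with DAG(
    dag_id="daily_sales_elt",          # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+; older: schedule_interval
    catchup=False,
    on_failure_callback=notify_on_failure,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task
```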

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Vijayawada, Andhra Pradesh

On-site

As a Lead Data Engineer based in Vijayawada, Andhra Pradesh, you will be responsible for leveraging your extensive experience in data engineering and data architecture to design and develop end-to-end data solutions, data pipelines, and ETL processes. With a Bachelor's or Master's degree in Computer Science, Information Systems, or a related field, along with over 10 years of relevant experience, you will play a crucial role in ensuring the success of data projects.

You will demonstrate strong knowledge of data technologies such as Snowflake, Databricks, Apache Spark, Hadoop, dbt, Fivetran, and Azure Data Factory. Your expertise in Python and SQL will be essential in tackling complex data challenges, and your understanding of data governance, data quality, and data security principles will guide you in maintaining high standards of data management.

In this role, your problem-solving and analytical skills will be put to the test as you work both independently and collaboratively in an Agile environment. Your communication and leadership skills will be instrumental in managing projects and teams and engaging in pre-sales activities. You will have the opportunity to showcase your technical leadership by delivering solutions within defined timeframes and building strong client relationships.

Your experience across complete project life cycle activities, agile methodologies, and globally distributed teams will be a valuable asset, as will your proven track record of managing complex consulting projects and communicating effectively with technical and non-technical staff. If you are looking for a challenging role that combines technical expertise, leadership skills, and client engagement, this Lead Data Engineer position offers a dynamic opportunity to excel in a fast-paced, collaborative environment.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Senior Auditor, Technology at LegalZoom, you will be an impactful member of the internal audit team, assisting in achieving the department's mission and objectives. Your role will involve evaluating technology risks in a dynamic environment, assessing the design and effectiveness of internal controls over financial reporting, and ensuring compliance with operational and regulatory requirements. You will document audit procedures and results following departmental standards and execute within agreed timelines. Additionally, you will provide advisory support to stakeholders on internal control considerations, collaborate with external auditors when necessary, and focus on continuous improvement of the audit department. Your commitment to integrity and ethics, coupled with a passion for the internal audit profession and LegalZoom's mission, is essential.

Ideally, you hold a Bachelor's degree in computer science, information systems, or accounting, along with 3+ years of experience in IT internal audit and Sarbanes-Oxley compliance, particularly in the technology sector. Previous experience in a Big 4 accounting firm and internal audit at a public company would be advantageous. A professional certification such as CISA, CIA, CRISC, or CISSP is preferred. Strong communication skills, self-management abilities, and the capacity to work on multiple projects across different locations are crucial for this role. Familiarity with technologies like Oracle Cloud, AWS, Salesforce, and Azure is beneficial, along with reliable internet service for remote work.

Join LegalZoom in making a difference and contributing to the future of accessible legal advice for all. LegalZoom is committed to diversity, equality, and inclusion, offering equal employment opportunities to all employees and applicants without discrimination based on any protected characteristic.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

You will be responsible for designing, developing, and maintaining dashboards and reports using Sigma Computing. Your main focus will be on collaborating with business stakeholders to understand data requirements and deliver actionable insights. It will be crucial for you to write and optimize SQL queries that run directly on cloud data warehouses. Additionally, enabling self-service analytics for business users via Sigma's spreadsheet interface and templates will be part of your responsibilities. You will need to apply row-level security and user-level filters to ensure proper data access controls. Furthermore, you will work closely with data engineering teams to validate data accuracy and ensure model alignment. Troubleshooting performance or data issues in reports and dashboards will also be a key aspect of your role, and you will be expected to train and support users on Sigma best practices, tools, and data literacy.

To excel in this role, you should have at least 5 years of experience in Business Intelligence, Analytics, or Data Visualization roles. Hands-on experience with Sigma Computing is highly preferred. Strong SQL skills and experience working with cloud data platforms such as Snowflake, BigQuery, or Redshift are essential, as is familiarity with data modeling concepts and modern data stacks. Your ability to translate business requirements into technical solutions will be crucial, along with knowledge of data governance, security, and role-based access controls. Excellent communication and stakeholder management skills are necessary for effective collaboration. Experience with tools like Looker, Tableau, or Power BI will be beneficial for comparative insights; familiarity with dbt, Fivetran, or other ELT/ETL tools is a plus, and exposure to Agile or Scrum methodologies would also be advantageous.
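
As a generic illustration of the user-level filtering this role involves (Sigma configures row-level security in its own UI, so this is not Sigma's API), here is the kind of parameterized, viewer-scoped warehouse query such a filter compiles down to; table, column, and connection values are hypothetical:

```python
# Generic sketch of a user-level row filter of the kind BI row-level
# security enforces, run directly on the warehouse. Names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="bi_service", password="...",  # placeholders
    warehouse="BI_WH", database="ANALYTICS", schema="SALES",
)

def fetch_orders_for(region: str):
    # Bind variables (%s) keep the per-viewer filter safe from injection.
    sql = "SELECT order_id, amount FROM orders WHERE region = %s"
    with conn.cursor() as cur:
        cur.execute(sql, (region,))
        return cur.fetchall()

rows = fetch_orders_for("EMEA")  # the viewer's entitlement decides the value
```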

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

You will be responsible for designing and implementing scalable Snowflake data warehouse architectures, including schema modeling and data partitioning. You will lead or support data migration projects from on-premise or legacy cloud platforms to Snowflake, and develop ETL/ELT pipelines and data integrations using tools such as DBT, Fivetran, Informatica, and Airflow. It will be part of your role to define and implement best practices for data modeling, query optimization, and storage efficiency in Snowflake.

Collaboration with cross-functional teams, including data engineers, analysts, BI developers, and stakeholders, to align architectural solutions will be essential. Ensuring data governance, compliance, and security by implementing RBAC, masking policies, and access control within Snowflake will also be a key responsibility. Working with DevOps teams to enable CI/CD pipelines, monitoring, and infrastructure as code for Snowflake environments will be part of your duties, as will optimizing resource utilization, monitoring workloads, and managing the cost-effectiveness of the platform. Staying updated with Snowflake features, cloud vendor offerings, and best practices is crucial.

Qualifications & Skills:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- X years of experience in data engineering, data warehousing, or analytics architecture.
- 3+ years of hands-on experience in Snowflake architecture, development, and administration.
- Strong knowledge of cloud platforms (AWS, Azure, or GCP).
- Solid understanding of SQL, data modeling, and data transformation principles.
- Experience with ETL/ELT tools, orchestration frameworks, and data integration.
- Familiarity with data privacy regulations (GDPR, HIPAA, etc.) and compliance.

Preferred Qualifications:
- Snowflake certification (SnowPro Core / Advanced).
- Experience in building data lakes, data mesh architectures, or streaming data platforms.
- Familiarity with tools like Power BI, Tableau, or Looker for downstream analytics.
- Experience with Agile delivery models and CI/CD workflows.
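
As a hedged sketch of the RBAC and masking-policy work this posting names, the following issues Snowflake's documented DDL from Python; all role, policy, database, and table names are hypothetical:

```python
# Sketch of Snowflake governance controls: a read-only role (RBAC) plus a
# dynamic data masking policy. Names are placeholders; the DDL follows
# Snowflake's documented syntax.
import snowflake.connector

conn = snowflake.connector.connect(account="...", user="...", password="...")
cur = conn.cursor()

# Role-based access control: a read-only analyst role.
cur.execute("CREATE ROLE IF NOT EXISTS ANALYST_RO")
cur.execute("GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYST_RO")
cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.PUBLIC TO ROLE ANALYST_RO")

# Dynamic data masking: roles outside PII_ADMIN see emails redacted.
cur.execute("""
    CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING)
    RETURNS STRING ->
    CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val ELSE '***MASKED***' END
""")
cur.execute("""
    ALTER TABLE ANALYTICS.PUBLIC.CUSTOMERS
    MODIFY COLUMN EMAIL SET MASKING POLICY email_mask
""")
```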

Posted 2 months ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Engineer specializing in Snowflake architecture, you will be responsible for designing and implementing scalable data warehouse architectures, including schema modeling and data partitioning. Your role will involve leading or supporting data migration projects to Snowflake from on-premise or legacy cloud platforms. You will develop ETL/ELT pipelines and integrate data using various tools such as DBT, Fivetran, Informatica, and Airflow. It will be essential to define and implement best practices for data modeling, query optimization, and storage efficiency within Snowflake.

Collaboration with cross-functional teams, including data engineers, analysts, BI developers, and stakeholders, will be crucial to align architectural solutions effectively. Ensuring data governance, compliance, and security by implementing RBAC, masking policies, and access control within Snowflake will also be part of your responsibilities. Working closely with DevOps teams to enable CI/CD pipelines, monitoring, and infrastructure as code for Snowflake environments is essential. Your role will involve optimizing resource utilization, monitoring workloads, and managing the cost-effectiveness of the platform (see the cost-monitoring sketch below). Staying updated with Snowflake features, cloud vendor offerings, and best practices will be necessary to drive continuous improvement in data architecture.

Qualifications & Skills:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 5+ years of experience in data engineering, data warehousing, or analytics architecture.
- 3+ years of hands-on experience in Snowflake architecture, development, and administration.
- Strong knowledge of cloud platforms such as AWS, Azure, or GCP.
- Solid understanding of SQL, data modeling, and data transformation principles.
- Experience with ETL/ELT tools, orchestration frameworks, and data integration.
- Familiarity with data privacy regulations (GDPR, HIPAA, etc.) and compliance.

Additional Qualifications:
- Snowflake certification (SnowPro Core / Advanced).
- Experience in building data lakes, data mesh architectures, or streaming data platforms.
- Familiarity with tools like Power BI, Tableau, or Looker for downstream analytics.
- Experience with Agile delivery models and CI/CD workflows.

This role offers an exciting opportunity to work on cutting-edge data architecture projects and collaborate with diverse teams to drive impactful business outcomes.
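
For the workload and cost-monitoring duty mentioned above, a small sketch that reads Snowflake's documented ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY view to report credits burned per warehouse over the last week; connection parameters are placeholders:

```python
# Sketch: report Snowflake credits consumed per warehouse in the last
# 7 days, using the documented ACCOUNT_USAGE schema.
import snowflake.connector

conn = snowflake.connector.connect(account="...", user="...", password="...")
cur = conn.cursor()
cur.execute("""
    SELECT warehouse_name, SUM(credits_used) AS credits_7d
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name
    ORDER BY credits_7d DESC
""")
for name, credits in cur.fetchall():
    print(f"{name}: {credits} credits in the last 7 days")
```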

Posted 2 months ago

Apply

10.0 - 14.0 years

0 Lacs

Punjab

On-site

About Us

We are a global climate technologies company engineered for sustainability. We create sustainable and efficient residential, commercial, and industrial spaces through HVACR technologies. We protect temperature-sensitive goods throughout the cold chain. And we bring comfort to people globally. Best-in-class engineering, design, and manufacturing, combined with category-leading brands in compression, controls, software, and monitoring solutions, result in next-generation climate technology that is built for the needs of the world ahead. Whether you are a professional looking for a career change, an undergraduate student exploring your first opportunity, or a recent graduate with an advanced degree, we have opportunities that will allow you to innovate, be challenged, and make an impact. Join our team and start your journey today!

Job Description / Responsibilities:

We are looking for Data Engineers. Candidates must have a minimum of 10 years of experience in a Data Engineer role, including the following tools/technologies:
- Experience with relational (SQL) databases.
- Experience with data warehouses like Oracle, SQL, and Snowflake.
- Technical expertise in data modeling, data mining, and segmentation techniques.
- Experience building new and troubleshooting existing data pipelines using tools like Pentaho Data Integration (PDI), ADF (Azure Data Factory), Snowpipe, Fivetran, and DBT (a Snowpipe sketch follows this listing).
- Experience with batch and real-time data ingestion and processing frameworks.
- Experience with languages like Python, Java, etc.
- Knowledge of additional cloud-based analytics solutions, along with Kafka, Spark, and Scala, is a plus.

Responsibilities:
- Develops code and solutions that transfer/transform data across various systems.
- Maintains deep technical knowledge of various tools in the data warehouse, data hub, and analytical tools.
- Ensures data is transformed and stored in efficient methods for retrieval and use.
- Maintains data systems to ensure optimal performance.
- Develops a deep understanding of underlying business systems involved with analytical systems.
- Follows standard software development lifecycle, code control, code standards, and process standards.
- Maintains and develops technical knowledge through self-training on current toolsets and computing environments, participates in educational opportunities, maintains professional networks, and participates in professional organizations related to their tech skills.

Systems Analysis:
- Works with key stakeholders to understand business needs and capture functional and technical requirements.
- Offers ideas that simplify the design and complexity of solutions delivered.
- Effectively communicates any expectations required of stakeholders or other resources during solution delivery.
- Develops and executes test plans to ensure successful rollout of solutions, including accuracy and quality of data.

Service Management:
- Effectively communicates to leaders and stakeholders any obstacles that occur during solution delivery.
- Defines and manages promised delivery dates.
- Proactively researches, analyzes, and predicts operational issues, informing leadership where appropriate.
- Offers viable options to solve unexpected/unknown issues that occur during solution development and delivery.

Education / Job-Related Technical Skills:
- Bachelor's Degree in Computer Science/Information Technology or equivalent.
- Ability to effectively communicate with others at all levels of the Company, both verbally and in writing.
- Demonstrates a courteous, tactful, and professional approach with employees and others.
- Ability to work in a large, global corporate structure.

Our Commitment to Our People

Across the globe, we are united by a singular Purpose: Sustainability is no small ambition. That's why everything we do is geared toward a sustainable future, for our generation and all those to come. Through groundbreaking innovations, HVACR technology, and cold chain solutions, we are reducing carbon emissions and improving energy efficiency in spaces of all sizes, from residential to commercial to industrial.

Our employees are our greatest strength. We believe that our culture of passion, openness, and collaboration empowers us to work toward the same goal: to make the world a better place. We invest in the end-to-end development of our people, beginning at onboarding and through senior leadership, so they can thrive personally and professionally. Flexible and competitive benefits plans offer the right options to meet your individual/family needs. We provide employees with flexible time-off plans, including paid parental leave (maternal and paternal), vacation, and holiday leave.

Together, we have the opportunity and the power to continue to revolutionize the technology behind air conditioning, heating, and refrigeration, and cultivate a better future. Learn more about us and how you can join our team!

Our Commitment to Diversity, Equity & Inclusion

At Copeland, we believe having a diverse, equitable, and inclusive environment is critical to our success. We are committed to creating a culture where every employee feels welcomed, heard, respected, and valued for their experiences, ideas, perspectives, and expertise. Ultimately, our diverse and inclusive culture is the key to driving industry-leading innovation, better serving our customers, and making a positive impact in the communities where we live.

Equal Opportunity Employer
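
One of the ingestion tools listed above, Snowpipe, can be sketched in a few lines; the stage, pipe, and table names below are hypothetical, and the DDL follows Snowflake's documented CREATE PIPE syntax:

```python
# Sketch of a Snowpipe auto-ingest setup. Files landed in the external
# stage are copied in as they arrive (AUTO_INGEST relies on cloud-storage
# event notifications). All object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(account="...", user="...", password="...")
cur = conn.cursor()
cur.execute("""
    CREATE PIPE IF NOT EXISTS raw.events_pipe AUTO_INGEST = TRUE AS
    COPY INTO raw.events
    FROM @raw.events_stage
    FILE_FORMAT = (TYPE = 'JSON')
""")
```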

Posted 2 months ago

Apply

6.0 - 11.0 years

7 - 17 Lacs

Gurugram

Work from Office

We are heavily dependent on BigQuery/Snowflake, Airflow, Stitch/Fivetran, dbt, and Tableau/Looker for our business intelligence, and we embrace AWS with some GCP. As a Data Engineer, you will develop end-to-end ETL/ELT pipelines (see the sketch below).
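
A minimal sketch of one such ELT step run directly against BigQuery with the official google-cloud-bigquery client; the dataset and table names are hypothetical:

```python
# Sketch: one ELT transformation step executed in BigQuery itself,
# aggregating raw orders into a daily reporting table.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials
job = client.query("""
    CREATE OR REPLACE TABLE analytics.daily_orders AS
    SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
    FROM raw.orders
    GROUP BY order_date
""")
job.result()  # block until the transformation finishes
```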

Posted 2 months ago

Apply

10.0 - 20.0 years

25 - 30 Lacs

Bengaluru

Remote

Role & responsibilities
- Data Platform: Snowflake, dbt, Fivetran, Oracle OCI
- Visualization: Tableau
- Cloud & Identity: Azure, Microsoft Entra (Entra ID / Azure AD)
- Infrastructure as Code: OpenTofu (Terraform alternative), migration from Terraform
- Scripting & Monitoring: SQL, Python/Bash, monitoring tools

Posted 2 months ago

Apply

5.0 - 9.0 years

15 - 19 Lacs

Chennai

Work from Office

Senior Data Engineer - Azure

Years of Experience: 5
Job location: Chennai

Job Description: We are looking for a skilled and experienced Senior Azure Developer to join the team! As part of the team, you will be involved in the implementation of ongoing and new initiatives for our company. If you love learning, thinking strategically, innovating, and helping others, this job is for you!

Primary Skills: ADF, Databricks
Secondary Skills: DBT, Python, Databricks, Airflow, Fivetran, Glue, Snowflake

Role Description: The data engineering role requires creating and managing the technological infrastructure of a data platform; being in charge of / involved in architecting, building, and managing data flows / pipelines; and constructing data storages (NoSQL, SQL), tools to work with big data (Hadoop, Kafka), and integration tools to connect sources or other databases.

Role Responsibility:
- Translate functional specifications and change requests into technical specifications
- Translate business requirement documents, functional specifications, and technical specifications into related coding
- Develop efficient code with unit testing and code documentation
- Ensure accuracy and integrity of data and applications through analysis, coding, documenting, testing, and problem solving
- Set up the development environment and configure the development tools
- Communicate with all project stakeholders on the project status
- Manage, monitor, and ensure the security and privacy of data to satisfy business needs
- Contribute to the automation of modules, wherever required
- Be proficient in written, verbal, and presentation communication (English)
- Coordinate with the UAT team

Role Requirement:
- Proficient in basic and advanced SQL programming concepts (procedures, analytical functions, etc.)
- Good knowledge and understanding of data warehouse concepts (dimensional modeling, change data capture, slowly changing dimensions, etc.)
- Knowledgeable in Shell / PowerShell scripting
- Knowledgeable in relational databases, non-relational databases, data streams, and file stores
- Knowledgeable in performance tuning and optimization
- Experience in data profiling and data validation
- Experience in requirements gathering and documentation processes and performing unit testing
- Understanding and implementing QA and various testing processes in the project
- Knowledge of any BI tools will be an added advantage
- Sound aptitude, outstanding logical reasoning, and analytical skills
- Willingness to learn and take initiative
- Ability to adapt to a fast-paced Agile environment

Additional Requirement:
- Demonstrated expertise as a Data Engineer, specializing in Azure cloud services.
- Highly skilled in Azure Data Factory, Azure Data Lake, Azure Databricks, and Azure Synapse Analytics.
- Create and execute efficient, scalable, and dependable data pipelines utilizing Azure Data Factory.
- Utilize Azure Databricks for data transformation and processing (see the sketch after this listing).
- Effectively oversee and enhance data storage solutions, emphasizing Azure Data Lake and other Azure storage services.
- Construct and uphold workflows for data orchestration and scheduling using Azure Data Factory or equivalent tools.
- Proficient in programming languages like Python and SQL, and conversant with pertinent scripting languages.
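
For illustration, a short PySpark sketch of the Databricks transformation work described above (read landed data, cleanse it, write a curated table); the paths and table names are hypothetical:

```python
# Sketch of a Databricks transformation: read a raw Delta table, apply
# cleansing steps, write a curated table. Paths/names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("curate_orders").getOrCreate()

raw = spark.read.format("delta").load("/mnt/raw/orders")  # hypothetical mount
curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .filter(F.col("order_date").isNotNull())
)
curated.write.format("delta").mode("overwrite").saveAsTable("curated.orders")
```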

Posted 2 months ago

Apply

5.0 - 9.0 years

13 - 19 Lacs

Chennai

Work from Office

Senior Data Engineer - DBT and Snowflake

Years of Experience: 5
Job location: Chennai

Role Description: The data engineering role requires creating and managing the technological infrastructure of a data platform; being in charge of / involved in architecting, building, and managing data flows / pipelines; and constructing data storages (NoSQL, SQL), tools to work with big data (Hadoop, Kafka), and integration tools to connect sources or other databases. Should hold a minimum of 5 years of experience in DBT and Snowflake.

Role Responsibility:
- Translate functional specifications and change requests into technical specifications
- Translate business requirement documents, functional specifications, and technical specifications into related coding
- Develop efficient code with unit testing and code documentation

Role Requirement:
- Proficient in basic and advanced SQL programming concepts (procedures, analytical functions, etc.)
- Good knowledge and understanding of data warehouse concepts (dimensional modeling, change data capture, slowly changing dimensions, etc.)
- Knowledgeable in Shell / PowerShell scripting
- Knowledgeable in relational databases, non-relational databases, data streams, and file stores
- Knowledgeable in performance tuning and optimization
- Experience in data profiling and data validation
- Experience in requirements gathering and documentation processes and performing unit testing
- Understanding and implementing QA and various testing processes in the project

Additional Requirement:
- Design, develop, and maintain scalable data models and transformations using DBT in conjunction with Snowflake, ensuring the effective transformation and loading of data from diverse sources into the data warehouse or data lake (a small CLI sketch follows this listing).
- Implement and manage data models in DBT, guaranteeing accurate data transformation and alignment with business needs.
- Utilize DBT to convert raw, unstructured data into structured datasets, enabling efficient analysis and reporting.
- Write and optimize SQL queries within DBT to enhance data transformation processes and improve overall performance.
- Establish DBT best practices to improve performance, scalability, and reliability.
- Expertise in SQL and a strong understanding of data warehouse concepts and modern data architectures.
- Familiarity with cloud-based platforms (e.g., AWS, Azure, GCP).
- Migrate legacy transformation code into modular DBT data models.
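
A small illustrative wrapper around the standard dbt CLI, building one model together with its upstream dependencies and then running its tests; the model name is hypothetical:

```python
# Sketch: drive dbt from Python. `dbt run --select +model` builds the model
# and its ancestors; `dbt test --select model` runs its configured tests.
import subprocess

def dbt(*args: str) -> None:
    result = subprocess.run(["dbt", *args], capture_output=True, text=True)
    print(result.stdout)
    result.check_returncode()  # fail loudly if dbt reported errors

dbt("run", "--select", "+fct_orders")   # hypothetical model name
dbt("test", "--select", "fct_orders")
```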

Posted 2 months ago

Apply

8.0 - 12.0 years

25 - 35 Lacs

Hyderabad

Remote

Role: Fivetran Developer
Role Type: Contractual
Location: Remote (must work in the Dubai time zone)

Position Overview: We are seeking a Fivetran Data Replication Engineer responsible for configuring and managing data replication pipelines using Fivetran. The primary focus will be on replicating data from SQL Server, Oracle, Salesforce, and various APIs into Snowflake, Azure Data Lake, and Iceberg formats/catalogs. This role requires strong experience in setting up and troubleshooting Change Data Capture (CDC) jobs on source systems.

Key Responsibilities:
- Configure and manage Fivetran connectors for data replication from SQL Server, Oracle, Salesforce, and APIs.
- Set up, monitor, and troubleshoot CDC (Change Data Capture) jobs on source systems to ensure real-time or near-real-time data replication.
- Ensure accurate and reliable data delivery to Snowflake, Azure Data Lake, and Iceberg formats/catalogs.
- Work closely with source system owners to validate CDC setup and permissions.
- Monitor Fivetran pipelines for performance, reliability, and data consistency (see the sketch after this listing).
- Resolve replication issues, data sync failures, and schema changes in coordination with business and technical stakeholders.
- Document all pipeline configurations and changes.

Required Experience:
- 2+ years of hands-on experience with Fivetran, specifically for data replication use cases.
- Strong expertise in configuring CDC for SQL Server, Oracle, Salesforce, and/or APIs.
- Experience replicating data into Snowflake and Azure Data Lake; familiarity with Iceberg formats/catalogs is a plus.
- Solid understanding of data replication, sync frequency, and incremental loads.
- Basic SQL skills for data validation and troubleshooting.
- Strong communication and documentation abilities.

Preferred:
- Experience with data security and compliance during data movement.
- Familiarity with cloud data ecosystems (Azure, AWS, GCP).
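
A hedged sketch of the monitoring duty: polling a connector's status through Fivetran's v1 REST API. The connector id and credentials are placeholders, and the endpoint and field names are an assumption based on Fivetran's published v1 API, so verify them against the current documentation:

```python
# Sketch: check one Fivetran connector's health via the REST API.
# Endpoint/fields assumed from Fivetran's v1 API docs; verify before use.
import requests

API = "https://api.fivetran.com/v1"
AUTH = ("FIVETRAN_API_KEY", "FIVETRAN_API_SECRET")  # placeholders

resp = requests.get(f"{API}/connectors/my_connector_id", auth=AUTH, timeout=30)
resp.raise_for_status()
status = resp.json()["data"]["status"]

if status.get("setup_state") != "connected":
    print("Connector needs attention:", status)
```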

Posted 2 months ago

Apply

3.0 - 8.0 years

20 - 35 Lacs

Hyderabad, Pune

Work from Office

Technical Data Analyst - Snowflake, SQL, Python (Finance Data Warehouse)

Job Description: We are seeking a highly skilled Technical Data Analyst to join our team and play a key role in building a single source of truth for our high-volume, direct-to-consumer accounting and financial data warehouse. The ideal candidate will have a strong background in data analysis, SQL, and data transformation, with experience in financial data warehousing and reporting. This role will involve working closely with finance and accounting teams to gather requirements, build dashboards, and transform data to support month-end accounting, tax reporting, and financial forecasting. The financial data warehouse is currently built in Snowflake and will be migrated to Databricks. The candidate will be responsible for transitioning reporting and transformation processes to Databricks while ensuring data accuracy and consistency.

Key Responsibilities:

1. Data Analysis & Reporting:
- Build and maintain month-end accounting and tax dashboards using SQL and Snowsight in Snowflake.
- Transition reporting processes to Databricks, creating dashboards and reports to support finance and accounting teams.
- Gather requirements from finance and accounting stakeholders to design and deliver actionable insights.

2. Data Transformation & Aggregation:
- Develop and implement data transformation pipelines in Databricks to aggregate financial data and create balance-sheet look-forward views.
- Ensure data accuracy and consistency during the migration from Snowflake to Databricks (see the reconciliation sketch after this listing).
- Collaborate with the data engineering team to optimize data ingestion and transformation processes.

3. Data Integration & ERP Collaboration:
- Support the integration of financial data from the data warehouse into NetSuite ERP by ensuring data is properly transformed and validated.
- Work with cross-functional teams to ensure seamless data flow between systems.

4. Data Ingestion & Tools:
- Understand and work with Fivetran for data ingestion (no need to be an expert, but familiarity is required).
- Troubleshoot and resolve data-related issues in collaboration with the data engineering team.

Additional Qualifications:
- 3+ years of experience as a Data Analyst or in a similar role, preferably in a financial or accounting context.
- Strong proficiency in SQL and experience with Snowflake and Databricks.
- Experience building dashboards and reports for financial data (e.g., month-end close, tax reporting, balance sheets).
- Familiarity with Fivetran or similar data ingestion tools.
- Understanding of financial data concepts (e.g., general ledger, journals, balance sheets, income statements).
- Experience with data transformation and aggregation in a cloud-based environment.
- Strong communication skills to collaborate with finance and accounting teams.
- Nice to have: experience with NetSuite ERP or similar financial systems.
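
One plausible shape for the migration-validation work is comparing a table's row count between the current Snowflake warehouse and the Databricks target; connection details and the table name below are placeholders, while both client libraries (snowflake-connector-python, databricks-sql-connector) are real:

```python
# Sketch: reconcile one table's row count across Snowflake and Databricks
# during a warehouse migration. All connection values are placeholders.
import snowflake.connector
from databricks import sql as dbsql

TABLE = "finance.gl_journal"  # hypothetical table

sf = snowflake.connector.connect(account="...", user="...", password="...")
sf_count = sf.cursor().execute(f"SELECT COUNT(*) FROM {TABLE}").fetchone()[0]

with dbsql.connect(server_hostname="...", http_path="...", access_token="...") as conn:
    with conn.cursor() as cur:
        cur.execute(f"SELECT COUNT(*) FROM {TABLE}")
        db_count = cur.fetchone()[0]

assert sf_count == db_count, f"Mismatch: Snowflake={sf_count}, Databricks={db_count}"
```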

Posted 2 months ago

Apply

2.0 - 7.0 years

0 - 1 Lacs

Mumbai Suburban, Navi Mumbai, Mumbai (All Areas)

Work from Office

Job Opening: ODI Developer | Mumbai
Experience: 2-4 Years
Location: Mumbai (Full-time)
Notice Period: 0-15 Days
Budget: Will discuss over the call
Client: Will discuss over the call

Key Responsibilities:
- Design and develop medium-to-complex ODI interfaces using mappings, packages, and knowledge modules
- Work on end-to-end ETL development, from topology setup to load plans and job scheduling
- Support and troubleshoot production issues and optimize ODI job performance
- Collaborate with Oracle DB, SAP HANA, and CSV/XML sources, and manage data integration processes
- Customize Knowledge Modules (KM): IKM, LKM, CKM
- Implement SCD Type 1 and Type 2 mappings (see the sketch after this listing)
- Manage and deploy ODI Agents across environments
- Participate in Agile practices and team ceremonies

Must-Have Skills:
- 2-4 years of experience in ODI 12c and ETL development
- Strong Oracle SQL knowledge and debugging experience
- Proficiency in managing ODI mappings, packages, procedures, and schedules
- Understanding of data warehousing, performance tuning, and integration best practices
- Familiarity with scripting languages like Groovy or Python

Good-to-Have:
- Experience with Fivetran or DBT
- Knowledge of SAP HANA data extraction techniques
- Experience working in Agile teams

Why Join Us?
- Work on cutting-edge data solutions for a global client
- Collaborative team culture with opportunities for technical growth
- Fast-paced, impact-driven role within enterprise IT

Ready to join? Apply now or DM for more details at anzia.sabreen@bct-consulting.com
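
To illustrate the SCD Type 2 requirement: in ODI this logic is generated by an IKM rather than hand-written, but the underlying pattern (expire the current row version, insert the new one) looks roughly like the sketch below; the table and column names are hypothetical, and python-oracledb is Oracle's Python driver:

```python
# Sketch of the SCD Type 2 pattern an ODI IKM implements, shown as plain
# Oracle SQL for illustration only. Names are placeholders.
import oracledb

conn = oracledb.connect(user="etl", password="...", dsn="dbhost/orclpdb")
cur = conn.cursor()

# 1) Close out current versions whose tracked attributes changed.
cur.execute("""
    UPDATE dim_customer d
    SET d.end_date = SYSDATE, d.current_flag = 0
    WHERE d.current_flag = 1
      AND EXISTS (SELECT 1 FROM stg_customer s
                  WHERE s.customer_id = d.customer_id
                    AND s.segment <> d.segment)
""")

# 2) Insert new versions as the current rows.
cur.execute("""
    INSERT INTO dim_customer (customer_id, segment, start_date, end_date, current_flag)
    SELECT s.customer_id, s.segment, SYSDATE, NULL, 1
    FROM stg_customer s
    WHERE NOT EXISTS (SELECT 1 FROM dim_customer d
                      WHERE d.customer_id = s.customer_id AND d.current_flag = 1)
""")
conn.commit()
```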

Posted 2 months ago

Apply

2.0 years

10 Lacs

Chennai

Remote

Seeking a Sigma Developer to build dashboards, optimize SQL, integrate with JS frameworks, connect to cloud warehouses, ensure BI security, and support CI/CD. Must excel in Sigma, data modeling, and cross-team collaboration for data-driven insights. Required Candidate profile Bachelor’s in CS/Data field, 2+ yrs in Sigma/BI tools, SQL expert, experience with embedding, cloud warehouses (Snowflake/BigQuery), data modeling, BI security, and building responsive dashboards

Posted 2 months ago

Apply

7.0 - 10.0 years

20 - 25 Lacs

Pune

Work from Office

Role & Responsibilities
- Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, stored procedures; see the sketch after this listing) to meet complex commercial-analytics workloads.
- Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
- Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use cases.
- Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
- Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
- Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.

Skills & Qualifications

Must-Have:
- 7+ years of data-engineering / warehousing experience, including 4+ years of hands-on Snowflake design and development.
- Expert-level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimisation skills.
- Proficiency in Python (or similar) for automation, API integrations, and orchestration.
- Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
- Bachelor's in Computer Science, Engineering, or Information Systems (Master's preferred).
- Strong client-facing communication and problem-solving ability in fast-paced, agile environments.

Preferred:
- Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
- Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion/DBT/Airflow, Git.
- Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.
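
A brief sketch of the Streams + Tasks pattern named in the first bullet (a stream captures new rows; a scheduled task loads them onward); the object names are hypothetical and the DDL follows Snowflake's documented syntax:

```python
# Sketch: Snowflake Streams + Tasks for incremental ELT. Placeholder names.
import snowflake.connector

conn = snowflake.connector.connect(account="...", user="...", password="...")
cur = conn.cursor()

cur.execute("CREATE STREAM IF NOT EXISTS raw.sales_stream ON TABLE raw.sales")
cur.execute("""
    CREATE TASK IF NOT EXISTS raw.load_sales
      WAREHOUSE = ELT_WH
      SCHEDULE = '15 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('RAW.SALES_STREAM')
    AS
      INSERT INTO analytics.sales_fact
      SELECT * FROM raw.sales_stream
""")
cur.execute("ALTER TASK raw.load_sales RESUME")  # tasks start suspended
```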

Posted 2 months ago

Apply

6.0 - 11.0 years

13 - 23 Lacs

Bengaluru

Work from Office

Greetings for the Day!!!

Scouting for a BI Engineer to be associated with an IT services (SaaS) organization.

Designation: BI Architect
Location: Bangalore
Mode: Hybrid
Shift: 1 PM to 10 PM

Role & responsibilities:
- Engineer Self-Service BI Solutions: Design and implement robust and intuitive Power BI models that empower superb reporting and strategic decisions.
- Data Warehouse Development and Integration: Leverage Snowflake and DBT to design and develop data warehouse tables, ensuring they are seamlessly incorporated into the Power BI ecosystem for comprehensive reporting.
- Simplify Complexity: Turn complex data and processes into intuitive and actionable assets; design solutions and processes that enable data engineers, BI, and analysts to accelerate delivery of high-quality data assets.
- Performance Optimization: Ensure optimal performance of Power BI solutions.
- Stakeholder Collaboration: Deliver consistently on internal partner requests.
- Mentorship and Support: Mentor coworkers and empower business partners to build their own reports using our Power BI and Snowflake models.
- Continuous Improvement: Stay on the cutting edge of tools and tech to continuously enhance our Power BI/Snowflake capabilities.
- Technical Documentation: For internal development alignment and stakeholder enablement.

Additional Qualifications:
- Power BI version control using Azure DevOps or Git for scalable development.
- Analytical Thinking: Strong analytical skills to interpret data and model for actionable insights.
- SaaS Experience: Preferred experience with subscription data in a SaaS environment.
- Python and Fivetran experience preferred, to automate processes and tap into the Power BI API.

Interested candidates kindly share your updated resume to james.lobo@mappyresources.com

Posted 2 months ago

Apply

0.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - ML Engineer! In this role, we are looking for candidates who have relevant years of experience in designing and developing machine learning and deep learning systems, who have professional software development experience, and who are hands-on in running machine learning tests and experiments and implementing appropriate ML algorithms.

Responsibilities:
- Drive the vision for a modern data and analytics platform to deliver well-architected and engineered data and analytics products leveraging a cloud tech stack and third-party products
- Close the gap between ML research and production to create ground-breaking new products and features and solve problems for our customers
- Design, develop, test, and deploy data pipelines, machine learning infrastructure, and client-facing products and services
- Build and implement machine learning models and prototype solutions for proof-of-concept
- Scale existing ML models into production on a variety of cloud platforms
- Analyze and resolve architectural problems, working closely with engineering, data science, and operations teams

Qualifications we seek in you!

Minimum Qualifications / Skills:
- Relevant years of experience and a Bachelor's degree in computer science engineering or information technology, or a BSc in Computer Science, Mathematics, or a similar field; a Master's degree is a plus
- Integration: APIs, microservices, and ETL/ELT patterns
- DevOps (good to have): Ansible, Jenkins, ELK
- Containerization: Docker, Kubernetes, etc.
- Orchestration: Google Composer
- Languages and scripting: Python, Scala, Java, etc.
- Cloud services: GCP, Snowflake
- Analytics and ML tooling: SageMaker, ML Studio
- Execution paradigm: low latency / streaming, batch

Preferred Qualifications / Skills:
- Data platforms: DBT, Fivetran, and data warehouses (Teradata, Redshift, BigQuery, Snowflake, etc.)
- Visualization tools: Power BI, Tableau

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 3 months ago

Apply

0.0 years

0 Lacs

Hyderabad / Secunderabad, Telangana, Telangana, India

On-site

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - ML Engineer! In this role, we are looking for candidates who have relevant years of experience in designing and developing machine learning and deep learning systems, who have professional software development experience, and who are hands-on in running machine learning tests and experiments and implementing appropriate ML algorithms.

Responsibilities:
- Drive the vision for a modern data and analytics platform to deliver well-architected and engineered data and analytics products leveraging a cloud tech stack and third-party products
- Close the gap between ML research and production to create ground-breaking new products and features and solve problems for our customers
- Design, develop, test, and deploy data pipelines, machine learning infrastructure, and client-facing products and services
- Build and implement machine learning models and prototype solutions for proof-of-concept
- Scale existing ML models into production on a variety of cloud platforms
- Analyze and resolve architectural problems, working closely with engineering, data science, and operations teams

Qualifications we seek in you!

Minimum Qualifications / Skills:
- Relevant years of experience and a Bachelor's degree in computer science engineering or information technology, or a BSc in Computer Science, Mathematics, or a similar field; a Master's degree is a plus
- Integration: APIs, microservices, and ETL/ELT patterns
- DevOps (good to have): Ansible, Jenkins, ELK
- Containerization: Docker, Kubernetes, etc.
- Orchestration: Google Composer
- Languages and scripting: Python, Scala, Java, etc.
- Cloud services: GCP, Snowflake
- Analytics and ML tooling: SageMaker, ML Studio
- Execution paradigm: low latency / streaming, batch

Preferred Qualifications / Skills:
- Data platforms: DBT, Fivetran, and data warehouses (Teradata, Redshift, BigQuery, Snowflake, etc.)
- Visualization tools: Power BI, Tableau

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 3 months ago

Apply

5.0 - 10.0 years

17 - 30 Lacs

Hyderabad

Remote

At Mitratech, we are a team of technocrats focused on building world-class products that simplify operations in the Legal, Risk, Compliance, and HR functions of Fortune 100 companies. We are a close-knit, globally dispersed team that thrives in an ecosystem that supports individual excellence and takes pride in its diverse and inclusive work culture centered around great people practices, learning opportunities, and having fun! Our culture is the ideal blend of entrepreneurial spirit and enterprise investment, enabling the chance to move at a rapid pace with some of the most complex, leading-edge technologies available. Given our continued growth, we always have room for more intellect, energy, and enthusiasm - join our global team and see why it's so special to be a part of Mitratech!

Job Description: We are seeking a highly motivated and skilled Analytics Engineer to join our dynamic data team. The ideal candidate will possess a strong background in data engineering and analytics, with hands-on experience in modern analytics tools such as Airbyte, Fivetran, dbt, Snowflake, Airflow, etc. This role will be pivotal in transforming raw data into valuable insights, ensuring data integrity, and optimizing our data infrastructure to support the organization's data platform.

Essential Duties & Responsibilities:

Data Integration and ETL Processes:
- Design, implement, and manage ETL pipelines using tools like Airbyte and Fivetran to ensure efficient and accurate data flow from various sources into our Snowflake data warehouse.
- Maintain and optimize existing data integration workflows to improve performance and scalability.

Data Modeling and Transformation:
- Develop and maintain data models using dbt / dbt Cloud to transform raw data into structured, high-quality datasets that meet business requirements.
- Ensure data consistency and integrity across various datasets and implement data quality checks (see the sketch after this listing).

Data Warehousing:
- Manage and optimize our Redshift / Snowflake data warehouses, ensuring they meet performance, storage, and security requirements.
- Implement best practices for data warehouse management, including partitioning, clustering, and indexing.

Collaboration and Communication:
- Work closely with data analysts, data scientists, and business stakeholders to understand data requirements and deliver solutions that meet their needs.
- Communicate complex technical concepts to non-technical stakeholders in a clear and concise manner.

Continuous Improvement:
- Stay updated with the latest developments in data engineering and analytics tools, and evaluate their potential to enhance our data infrastructure.
- Identify and implement opportunities for process improvements, automation, and optimization within the data pipeline.

Requirements & Skills:

Education and Experience:
- Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field.
- 3-5 years of experience in data engineering or analytics engineering roles.
- Experience in AWS and DevOps is a plus.

Technical Skills:
- Proficiency with modern ETL tools such as Airbyte and Fivetran.
- Must have experience with dbt for data modeling and transformation.
- Extensive experience working with Snowflake or similar cloud data warehouses.
- Solid understanding of SQL and experience writing complex queries for data extraction and manipulation.
- Familiarity with Python or other programming languages used for data engineering tasks.

Analytical Skills:
- Strong problem-solving skills and the ability to troubleshoot data-related issues.
- Ability to understand business requirements and translate them into technical specifications.

Soft Skills:
- Excellent communication and collaboration skills.
- Strong organizational skills and the ability to manage multiple projects simultaneously.
- Detail-oriented with a focus on data quality and accuracy.

We are an equal-opportunity employer that values diversity at all levels. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, national origin, age, sexual orientation, gender identity, disability, or veteran status.
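
As one concrete shape for the data-quality checks mentioned above, a small assertion runner against hypothetical warehouse tables; connection values and names are placeholders:

```python
# Sketch: basic data-quality assertions (no null keys, no duplicate keys)
# run against a warehouse table. All names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(account="...", user="...", password="...")
cur = conn.cursor()

checks = {
    "null primary keys": "SELECT COUNT(*) FROM analytics.orders WHERE order_id IS NULL",
    "duplicate primary keys": """
        SELECT COUNT(*) FROM (
            SELECT order_id FROM analytics.orders
            GROUP BY order_id HAVING COUNT(*) > 1
        )
    """,
}
for name, sql in checks.items():
    bad_rows = cur.execute(sql).fetchone()[0]
    assert bad_rows == 0, f"Data quality check failed ({name}): {bad_rows} rows"
```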

Posted 3 months ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Mumbai, Hyderabad, Bengaluru

Hybrid

Your day at NTT DATA

The Software Applications Development Engineer is a seasoned subject matter expert, responsible for developing new applications and improving existing applications based on the needs of the internal organization and/or external clients.

What you'll be doing

Years of Experience: 5

Data Engineer:
- Work closely with the Lead Data Engineer to understand business requirements, and analyse and translate these requirements into technical specifications and solution design.
- Work closely with the data modeller to ensure data models support the solution design.
- Develop, test, and fix ETL code using Snowflake, Fivetran, SQL, and stored procedures.
- Analyse the data and ETL for defects and service tickets raised (for solutions in production).
- Develop documentation and artefacts to support projects.

Posted 3 months ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office

Okta is looking for a Sr. Marketing Data Operations Analyst to join the Marketing Data Operations & Technology team. Reporting into the Sr. Manager, Marketing Technology, this new role will support the management and optimisation of Okta's marketing data across our core marketing technology stack. Okta has a large marketing technology and data estate spanning an audience of millions, with inputs from a range of systems, including Okta's CRM system (Salesforce), marketing automation platform (Adobe Marketo Engage), and connected infrastructure that includes tools spanning sales outreach (Outreach), ABM (6Sense, Folloze), and data enrichment (Clay, Clearbit).

The Sr. Marketing Data Operations Analyst will contribute to a number of critical areas supporting Okta's drive towards operational excellence across its marketing technology estate. This includes driving overall database health and improving data quality, managing integrations in the data operations function, and conducting ongoing data maintenance, as well as the processes to support these efforts. The role is integral to delivering a program of technical efficiency, operational excellence, and a supporting framework of data-driven insights from within the Marketing Data Operations & Technology team. This role requires strong analytical skills, attention to detail, and the ability to collaborate with cross-functional teams. As such, the successful candidate will be able to demonstrate a data-driven marketing mindset and have demonstrable experience of working within a data operations function or role to support and drive marketing performance.

Job Duties and Responsibilities:
- Manage and drive data cleansing initiatives and enrichment processes to improve data accuracy and completeness.
- Administer and maintain key data enrichment marketing technology tools, with a focus on Clay and Clearbit, ensuring optimal configuration and utilization.
- Partner closely with key Marketing stakeholders to create and manage new use cases and workflows within our data enrichment tools - creating business requirement docs and technical architectural flows, and monitoring/measuring business impact.
- Partner closely with the greater Marketing Operations team to manage data updates, maintenance, and logic within 6sense to support effective ABM strategies.
- Identify data gaps, discrepancies, and issues and, where appropriate, own the design and implementation of processes to improve them.
- Assist with manual data load fixes across various platforms (Salesforce, Marketo, etc.), ensuring data integrity and resolving data discrepancies.
- Provide miscellaneous data operations tool support and fulfill tool provisioning requests, ensuring users have the necessary access and functionality.
- Drive and collaborate on the creation of a Marketing Data Ops Data Dictionary, ensuring data governance and clarity across tools and systems.

Skills & Experience:
- Required: 3+ years of experience working in a data operations function or role supporting go-to-market teams.
- Required: Experience working with Salesforce (preference for candidates who have worked directly with systems integrations with Salesforce; Salesforce certifications are a plus). Candidates should be comfortable with the core Salesforce object models.
- Required: Experience working with business stakeholders to understand existing workflows and business requirements and translate them into solution design and delivery.
- Required: Strong critical thinker and problem solver, with an eye for detail.
- Preferred: Knowledge of SQL (for analytics) and comfort querying data warehouses for analytical purposes (such as Snowflake).
- Preferred: Experience integrating with analytics and data orchestration platforms (Openprise, Fivetran, Tableau, Datorama, Google Data Studio/Looker Studio).
- Preferred: Exposure to a range of marketing technology applications, for example: sales outreach platforms such as Outreach / SalesLoft; ABM platforms such as 6Sense / Folloze; optimization/personalization platforms such as Intellimize / Optimizely; data enrichment tools such as Leadspace / Clay / ZoomInfo / Clearbit.

This role requires in-person onboarding and travel to our Bengaluru, IN office during the first week of employment.

Posted 3 months ago

Apply

5.0 - 10.0 years

25 - 35 Lacs

Chennai

Work from Office

Job Summary: We are seeking an experienced Manager - Data Engineer to join our dynamic team. In this role, you will be responsible for designing, implementing, and maintaining data infrastructure on Azure, with an extensive focus on Azure Databricks. You will work hand in hand with our analytics team to support data-driven decision making across different external clients in a variety of industries.

Scope of Work:
- Design, build, and maintain scalable data pipelines using Azure Data Factory (ADF), Fivetran, and other Azure services.
- Administer, monitor, and troubleshoot SQL Server databases, ensuring high performance and availability.
- Develop and optimize SQL queries and stored procedures to support data transformation and retrieval.
- Implement and maintain data storage solutions in Azure, including Azure Databricks, Azure SQL Database, Azure Blob Storage, and data lakes.
- Collaborate with business analysts, clients, and stakeholders to deliver insightful reports and dashboards using Power BI.
- Develop scripts to automate data processing tasks using languages such as Python, PowerShell, or similar.
- Ensure data security and compliance with industry standards and organizational policies.
- Stay updated with the latest technologies and trends in Azure cloud services and data engineering.
- Desired: experience in healthcare data analytics (including familiarity with healthcare data models such as encounter-based or claims-focused models), manufacturing data analytics, or utility analytics.

Ideal Candidate Profile:
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- At least 5-8 years of experience in data engineering with a strong focus on Microsoft Azure and Azure Databricks.
- Proven expertise in SQL Server database administration and development.
- Experience in building and optimizing data pipelines, architectures, and data sets on Azure.
- Experience with dbt and Fivetran.
- Familiarity with Azure AI and LLMs, including Azure OpenAI.
- Proficiency in Power BI for creating reports and dashboards.
- Strong scripting skills in Python, PowerShell, or other relevant languages.
- Familiarity with other Azure data services (e.g., Azure Synapse Analytics, Azure Blob Storage, etc.).
- Knowledge of data modeling, ETL processes, and data warehousing concepts.
- Excellent problem-solving skills and the ability to work independently or as part of a team.
- Strong communication and interpersonal skills to collaborate effectively with various teams and understand business requirements.
- Certifications in Azure Data Engineering or related fields.
- Experience with machine learning and data science projects (huge plus).
- Knowledge of additional BI tools and data integration platforms.

Posted 3 months ago

Apply

7.0 - 12.0 years

25 - 30 Lacs

Coimbatore

Remote

Role & responsibilities

SUMMARY: The Data Engineer will be responsible for ETL and documentation in building data warehouse and analytics capabilities. Additionally, they will maintain existing systems/processes and develop new features, along with reviewing, presenting, and implementing performance improvements.

Duties and Responsibilities:
- Build ETL (extract, transform, and load) jobs using Fivetran and dbt for our internal projects and for customers that use various platforms like Azure, Salesforce, and AWS technologies
- Monitor active ETL jobs in production
- Build out data lineage artifacts to ensure all current and future systems are properly documented
- Assist with the build-out of design/mapping documentation to ensure development is clear and testable for QA and UAT purposes
- Assess current and future data transformation needs to recommend, develop, and train on new data integration tool technologies
- Discover efficiencies with shared data processes and batch schedules to help ensure no redundancy and smooth operations
- Assist the Data Quality Analyst to implement checks and balances across all jobs to ensure data quality throughout the entire environment for current and future batch jobs
- Hands-on experience in developing and implementing large-scale data warehouses, Business Intelligence, and MDM solutions, including Data Lakes/Data Vaults

Required Skills:
- This job has no supervisory responsibilities
- Bachelor's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or a related field AND 6+ years of experience in business analytics, data science, software development, data modeling, or data engineering work
- 5+ years of experience with strong SQL query/development skills
- Develop ETL routines that manipulate and transfer large volumes of data and perform quality checks
- Hands-on experience with ETL tools (e.g., Informatica, Talend, dbt, Azure Data Factory)
- Experience working in the healthcare industry with PHI/PII
- Creative, lateral, and critical thinker
- Excellent communicator
- Well-developed interpersonal skills
- Good at prioritizing tasks and time management
- Ability to describe, create, and implement new solutions
- Experience with related or complementary open-source software platforms and languages (e.g., Java, Linux, Apache, Perl/Python/PHP, Chef)
- Knowledge / hands-on experience with BI tools and reporting software (e.g., Cognos, Power BI, Tableau)

Posted 3 months ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Hyderabad

Remote

Lead Data Engineer with Health Care Domain

Role & responsibilities

Position: Lead Data Engineer
Experience: 7+ Years
Location: Hyderabad | Chennai | Remote

SUMMARY: The Data Engineer will be responsible for ETL and documentation in building data warehouse and analytics capabilities. Additionally, they will maintain existing systems/processes and develop new features, along with reviewing, presenting, and implementing performance improvements.

Duties and Responsibilities:
- Build ETL (extract, transform, and load) jobs using Fivetran and dbt for our internal projects and for customers that use various platforms like Azure, Salesforce, and AWS technologies
- Monitor active ETL jobs in production
- Build out data lineage artifacts to ensure all current and future systems are properly documented
- Assist with the build-out of design/mapping documentation to ensure development is clear and testable for QA and UAT purposes
- Assess current and future data transformation needs to recommend, develop, and train on new data integration tool technologies
- Discover efficiencies with shared data processes and batch schedules to help ensure no redundancy and smooth operations
- Assist the Data Quality Analyst to implement checks and balances across all jobs to ensure data quality throughout the entire environment for current and future batch jobs
- Hands-on experience in developing and implementing large-scale data warehouses, Business Intelligence, and MDM solutions, including Data Lakes/Data Vaults

Required Skills:
- This job has no supervisory responsibilities
- Need strong experience with Snowflake and Azure Data Factory (ADF)
- Bachelor's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or a related field AND 6+ years of experience in business analytics, data science, software development, data modeling, or data engineering work
- 5+ years of experience with strong SQL query/development skills
- Develop ETL routines that manipulate and transfer large volumes of data and perform quality checks
- Hands-on experience with ETL tools (e.g., Informatica, Talend, dbt, Azure Data Factory)
- Experience working in the healthcare industry with PHI/PII
- Creative, lateral, and critical thinker
- Excellent communicator
- Well-developed interpersonal skills
- Good at prioritizing tasks and time management
- Ability to describe, create, and implement new solutions
- Experience with related or complementary open-source software platforms and languages (e.g., Java, Linux, Apache, Perl/Python/PHP, Chef)
- Knowledge / hands-on experience with BI tools and reporting software (e.g., Cognos, Power BI, Tableau)
- Big Data stack (e.g., Snowflake (Snowpark), Spark, MapReduce, Hadoop, Sqoop, Pig, HBase, Hive, Flume)

Posted 3 months ago

Apply