
1 OLAP Technologies Job

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

Experience: 5 - 10 years

Salary: ₹7 - 12 Lacs

Location: Bengaluru

Work mode: Work from Office

Source: Naukri

Your Job

The Data Engineer will be part of a global team that designs, develops, and delivers BI and Analytics solutions leveraging the latest BI and Analytics technologies for Koch Industries. Koch Industries is a privately held global organization with over 120,000 employees around the world, with subsidiaries involved in manufacturing, trading, and investments. The Koch Technology Center (KTC) is being developed in India to extend its IT operations and act as a hub for innovation in the IT function. As KTC rapidly scales up its operations in India, its employees will get opportunities to carve out a career path for themselves within the organization. This role offers the opportunity to join on the ground floor and will play a critical part in helping build out the KTC over the next several years. Working closely with global colleagues will provide significant global exposure. This role is part of the Georgia-Pacific team within the KTC.

Our Team

The Data Engineer will report to the Data Engineering & BI Lead of the KGS and will be responsible for developing and implementing a future-state data analytics platform covering both the back-end data processing and the front-end data visualization components for the Finance Data Delivery teams. This is a hands-on role building ingestion pipelines and the data warehouse.

What You Will Do

- Be part of the data team that designs and builds a BI and analytics solution
- Implement batch and near-real-time data movement design patterns and define best practices in data engineering
- Design and develop optimal cloud data solutions (lakes, warehouses, marts, analytics) by collaborating with diverse IT teams including business analysts, project managers, architects, and developers
- Work closely with a team of data engineers and data analysts to procure, blend, and analyze data for quality and distribution, ensuring key elements are harmonized and modeled for effective analytics while operating in a fluid, rapidly changing data environment
- Build data pipelines from a wide variety of sources (see the sketch after this listing)
- Demonstrate strong conceptual, analytical, and problem-solving skills and the ability to articulate ideas and technical solutions effectively to external IT partners as well as internal data team members
- Work with cross-functional, on-shore/off-shore development and QA teams and vendors in a matrixed environment for data delivery
- Backtrack and troubleshoot failures and provide fixes as needed
- Update and maintain key data cloud solution deliverables and diagrams
- Ensure conformance and compliance with Georgia-Pacific data architecture guidelines and the enterprise data strategic vision

Who You Are (Basic Qualifications)

- Bachelor's degree in computer science, engineering, or a related IT area with at least 5-8 years of experience in software development
- Primary skill set: SQL, Python, columnar databases (Snowflake)
- Secondary skill set: Docker, Kubernetes, CI/CD
- At least 5 years of hands-on experience in designing, implementing, and managing large-scale ETL solutions
- At least 3 years of hands-on experience in business intelligence, data modelling, data engineering, ETL, multi-dimensional data warehouses, and cubes, with expertise in relevant languages and frameworks such as SQL and Python
- Hands-on experience designing and fine-tuning queries in a columnar database (Redshift or Snowflake)
- Strong knowledge of data engineering, data warehousing, OLAP, and database concepts
- Understanding of common DevSecOps/DataOps and CI/CD processes, methodologies, and technologies such as GitLab and Terraform
- Ability to analyze large, complex data sets to resolve data quality issues

What Puts You Ahead

- AWS certifications such as Solutions Architect (SAA/SAP) or Data Engineer Associate (DEA)
- Hands-on experience with AWS data technologies and at least one full life cycle project building a data solution in AWS
- Exposure to visualization tools such as Tableau or Power BI
- Experience with OLAP technologies and data virtualization (using Denodo)
- Knowledge of the manufacturing and finance domains
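
For illustration only, here is a minimal sketch of the kind of batch ingestion step this role describes: loading staged files into Snowflake with Python and SQL, then upserting into a curated table. The package (snowflake-connector-python), connection parameters, stage, and table names (finance_stage, raw_invoices, curated.invoices) are hypothetical placeholders, not details from the posting.

```python
# Illustrative sketch only: all names and credentials below are hypothetical.
import os
import snowflake.connector  # assumes the snowflake-connector-python package

# Connection parameters read from the environment (placeholders).
conn = snowflake.connector.connect(
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    warehouse="ANALYTICS_WH",
    database="FINANCE",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # Bulk-load staged CSV files into a raw landing table.
    cur.execute("""
        COPY INTO raw_invoices
        FROM @finance_stage/invoices/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    # Upsert the new batch into a curated table keyed on invoice_id.
    cur.execute("""
        MERGE INTO curated.invoices AS tgt
        USING raw_invoices AS src
          ON tgt.invoice_id = src.invoice_id
        WHEN MATCHED THEN UPDATE SET
          tgt.amount = src.amount,
          tgt.updated_at = src.updated_at
        WHEN NOT MATCHED THEN INSERT (invoice_id, amount, updated_at)
          VALUES (src.invoice_id, src.amount, src.updated_at)
    """)
finally:
    conn.close()
```

In practice a step like this would be parameterized, orchestrated, and monitored as part of a larger pipeline rather than run as a standalone script.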

Posted 2 months ago

