Job Description
Your Job
As a Data Engineer, you will be part of a team that designs, develops, and delivers data pipelines and data analytics solutions for Koch Industries. Koch Industries is a privately held global organization with over 120,000 employees around the world, with subsidiaries involved in manufacturing, trading, and investments. Koch Global Solution India (KGSI) is being developed in India to extend Koch's IT operations and to act as a hub for innovation in the IT function. As KGSI rapidly scales up its operations in India, its employees will have opportunities to carve out career paths for themselves within the organization. This role offers the opportunity to join on the ground floor and will play a critical part in helping build out Koch Global Solution (KGS) over the next several years. Working closely with global colleagues will provide significant international exposure.
Our Team
The Enterprise Data and Analytics team at Georgia-Pacific is focused on creating an enterprise capability around data engineering solutions for operational and commercial data, as well as helping businesses develop, deploy, manage, and monitor data pipelines and analytics solutions for manufacturing, operations, supply chain, and other key areas.
What You Will Do
ETL Solutions: Design, implement, and manage large-scale ETL solutions using the AWS technology stack, including EventBridge, Lambda, Glue, Step Functions, Redshift, and CloudWatch.
Data Pipeline Management: Design, develop, enhance, and debug existing data pipelines to ensure seamless operations.
Data Modelling: Design and develop data models, drawing on proven data modeling experience.
Best Practices Implementation: Develop and implement best practices to ensure high data availability, computational efficiency, cost-effectiveness, and data quality within Snowflake and AWS environments.
Enhancement: Build and enhance data products, processes, functionalities, and tools to streamline all stages of data lake implementation and analytics solution development, including proofs of concept, prototypes, and production systems.
Production Support: Provide ongoing support for production data pipelines, ensuring high availability and performance.
Issue Resolution: Monitor, troubleshoot, and resolve issues within data pipelines and ETL processes promptly.
Automation: Develop and implement scripts and tools to automate routine tasks and enhance system efficiency.
Who You Are (Basic Qualifications)
Bachelor's degree in Computer Science, Engineering, or a related IT field, with 7+ years of experience in software development.
5+ years of hands-on experience designing, implementing, and managing large-scale ETL solutions using the AWS technology stack, including EventBridge, Lambda, Glue, Step Functions, Redshift, and CloudWatch.
Primary skill set: SQL, S3, AWS Glue, PySpark, Python, Lambda, columnar databases (Redshift), AWS IAM, Step Functions, Git, Terraform, CI/CD.
Good to have: Experience with the MSBI stack, including SSIS, SSAS, and SSRS.
What Will Put You Ahead
In-depth knowledge of the entire suite of services in the AWS data platform.
Strong coding experience using Python and PySpark.
Experience designing and implementing data models.
Cloud data analytics/engineering certification.
Who We Are
At Koch, employees are empowered to do what they do best to make life better. Learn how we help employees unleash their potential while creating value for themselves and the company. Additionally, everyone has individual work and personal needs. We seek to enable the best work environment that helps you and the business work together to produce superior results.