Job Description

As an advertising technology focused software engineer at Octillion Media, you will design, implement, and manage end-to-end data pipelines that make data readily accessible for analysis. You will integrate with third-party APIs to access external data, build and maintain data warehouses for reporting and analysis, and collaborate with engineering and product teams to execute data-related product initiatives. You will also evaluate existing tools and solutions for new use cases and build new ones where necessary. A willingness to take end-to-end ownership of, and accountability for, the product's success is essential in this role.
Key Responsibilities:
- Design, implement, and manage end-to-end data pipelines that make data readily accessible for analysis
- Integrate with third-party APIs to access external data
- Build and maintain data warehouses for reporting and analysis purposes
- Collaborate with engineering and product teams to execute data-related product initiatives
- Evaluate existing tools and solutions for new use cases and develop new ones as required
- Take end-to-end ownership of, and accountability for, the product's success

Qualifications Required:
- Minimum of 3 years of experience in a Data Engineering role
- Ability to write clean, structured code in SQL, bash scripts, and Python (or similar languages)
- Solid understanding of database technologies
- Experience building automated, scalable, and robust data processing systems
- Familiarity with ETL and data warehouse systems such as Athena or BigQuery
- Experience working with large-scale quantitative data using technologies like Spark
- Ability to quickly resolve performance and system incidents
- Experience with Big Data/ML and familiarity with RTB, Google IMA SDK, VAST, VPAID, and Header Bidding is a plus
- Previous experience at product companies is beneficial

If you are looking to join a dynamic team at Octillion Media and contribute to cutting-edge advertising technology solutions, we encourage you to apply. Your information will be handled confidentially in accordance with EEO guidelines.