Posted: 2 months ago | Platform: Hybrid | Full Time
Role & responsibilities

Understand the Business Problem and the Relevant Data
• Maintain an intimate understanding of company and department strategy
• Translate analysis requirements into data requirements
• Identify and understand the data sources relevant to the business problem
• Develop conceptual models that capture the relationships within the data
• Define the data-quality objectives for the solution
• Serve as a subject matter expert in data sources and reporting options

Architect Data Management Systems
• Leverage understanding of the business problem and the nature of the data to select the appropriate data management system (Big Data, OLTP, OLAP, etc.)
• Design and implement optimal data structures in the appropriate data management system (Hadoop, Teradata, SQL Server, etc.) to satisfy the data requirements
• Plan methods for archiving/deleting information

Develop, Automate, and Orchestrate an Ecosystem of ETL Processes for Varying Volumes of Data
• Identify and select the optimal method of access for each data source (real-time/streaming, delayed, static)
• Determine transformation requirements and develop processes to bring structured and unstructured data from the source to a new physical data model
• Develop processes to efficiently load the transformed data into the data management system

Prepare Data to Meet Analysis Requirements
• Work with data scientists to implement strategies for cleaning and preparing data for analysis (e.g., outliers, missing data)
• Develop and code data extracts
• Follow best practices to ensure data quality and data integrity
• Ensure the data is fit for use in data science applications

Preferred candidate profile
• 5+ years developing, delivering, and/or supporting data engineering, advanced analytics, or business intelligence solutions
• Ability to work across multiple operating systems and environments (e.g., Windows, Unix, Linux)
• Experience developing ETL/ELT processes using Apache NiFi and Snowflake
• Experience with cloud-based solutions on AWS, Azure, or GCP
• Significant experience with big data processing and/or developing applications and data sources with Spark, etc.
• Understanding of how distributed systems work
• Familiarity with software architecture (data structures, data schemas, etc.)
• Strong working knowledge of databases (Oracle, MS SQL Server, etc.), including SQL and NoSQL
• Strong mathematics background and analytical, problem-solving, and organizational skills
• Strong communication skills (written, verbal, and presentation)
• Experience working in a global, cross-functional environment
• Minimum of 2 years' experience in any of the following: at least one high-level, object-oriented language (e.g., C#, C++, Java, Python, Perl); one or more web programming languages (PHP, MySQL, Python, Perl, JavaScript, ASP, etc.); one or more data extraction tools (SSIS, Informatica, etc.)
• Software development experience
• Ability to travel as needed
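The extract/transform/load workflow described in the responsibilities above can be sketched in miniature. This is a minimal illustration only, not a description of the employer's actual pipeline: the CSV feed, column names, and SQLite target are all hypothetical stand-ins for the real sources and data management systems named in the posting (NiFi, Snowflake, Spark, etc.).

```python
import csv
import io
import sqlite3

# Hypothetical source data: a small CSV feed with one missing value.
RAW_CSV = """id,value
1,10.5
2,
3,99.0
"""

def extract(text):
    """Extract: read raw rows from the source feed."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with missing values and cast to typed columns."""
    cleaned = []
    for row in rows:
        if row["value"] == "":  # missing-data handling, per the data-prep duties
            continue
        cleaned.append((int(row["id"]), float(row["value"])))
    return cleaned

def load(rows, conn):
    """Load: write the transformed rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS readings (id INTEGER PRIMARY KEY, value REAL)"
    )
    conn.executemany("INSERT INTO readings VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
count = conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0]
print(count)  # the row with the missing value was dropped
```

In a production setting each stage would be a separately orchestrated, monitored step (the "ecosystem of ETL processes" the role calls for) rather than three in-process function calls.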
Location: Hyderabad
Compensation: 16.0 - 31.0 Lacs P.A.