Posted: 1 week ago
Job Type: Contract
Location: Remote
Experience: 7+ years

Job Description:
· Build ETL (extract, transform, load) jobs using Fivetran and dbt for our internal projects and for customers on platforms such as Azure, Salesforce, and AWS (see the illustrative dbt model sketch after the Required Skills list).
· Monitor active ETL jobs in production.
· Build out data lineage artifacts to ensure all current and future systems are properly documented.
· Assist with building out design/mapping documentation so that development is clear and testable for QA and UAT purposes.
· Assess current and future data transformation needs to recommend, develop, and provide training on new data integration tools.
· Identify efficiencies in shared data processes and batch schedules to eliminate redundancy and keep operations running smoothly.
· Assist the Data Quality Analyst in implementing checks and balances across all jobs to ensure data quality throughout the environment for current and future batch jobs (see the SQL data-quality check sketch after the Required Skills list).
· Apply hands-on experience in developing and implementing large-scale data warehouse, Business Intelligence, and MDM solutions, including Data Lakes/Data Vaults.

Required Skills:
· This job has no supervisory responsibilities.
· Strong experience with Snowflake and Azure Data Factory (ADF).
· Bachelor's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or a related field AND 6+ years' experience in business analytics, data science, software development, data modeling, or data engineering.
· 5+ years' experience with strong SQL query/development skills.
· Ability to develop ETL routines that manipulate and transfer large volumes of data and perform quality checks.
· Hands-on experience with ETL tools (e.g. Informatica, Talend, dbt, Azure Data Factory).
· Experience working in the healthcare industry with PHI/PII.
· Creative, lateral, and critical thinker.
· Excellent communicator.
· Well-developed interpersonal skills.
· Good at prioritizing tasks and time management.
· Ability to describe, create, and implement new solutions.
· Experience with related or complementary open-source platforms and languages (e.g. Java, Linux, Apache, Perl/Python/PHP, Chef).
· Knowledge of / hands-on experience with BI tools and reporting software (e.g. Cognos, Power BI, Tableau).
· Big Data stack (e.g. Snowflake (Snowpark), Spark, MapReduce, Hadoop, Sqoop, Pig, HBase, Hive, Flume).
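For illustration only, a minimal sketch of the kind of dbt model this role would build, assuming a hypothetical Salesforce account source that Fivetran has already landed in Snowflake; the source, model, and column names are assumptions, not details from this posting.

```sql
-- models/staging/stg_salesforce__accounts.sql  (hypothetical model name)
-- A light staging model over a Fivetran-loaded Salesforce table:
-- rename columns and drop soft-deleted rows before downstream marts use them.

with source as (

    -- source() points dbt at the raw schema Fivetran writes to (assumed names)
    select * from {{ source('salesforce', 'account') }}

),

renamed as (

    select
        id           as account_id,
        name         as account_name,
        industry,
        created_date
    from source
    -- _fivetran_deleted is the soft-delete flag Fivetran typically adds to synced tables
    where not coalesce(_fivetran_deleted, false)

)

select * from renamed
```

And a sketch of the kind of standalone data-quality check mentioned above, written as a dbt singular test, i.e. a SQL query that should return zero rows when the data is healthy; again, the referenced model and key column are hypothetical.

```sql
-- tests/assert_account_id_unique_and_not_null.sql  (hypothetical test name)
-- dbt treats any row returned by a singular test as a failure,
-- so this flags missing or duplicated primary keys.

select
    account_id,
    count(*) as occurrences
from {{ ref('stg_salesforce__accounts') }}
group by account_id
having account_id is null
   or count(*) > 1
```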