Posted: 3 months ago
Work from Office
Full Time
What You Will Do
- Be part of the data team to design and build a BI and analytics solution.
- Implement batch and near-real-time data movement design patterns and define best practices in data engineering.
- Design and develop optimal cloud data solutions (lakes, warehouses, marts, analytics) by collaborating with diverse IT teams, including business analysts, project managers, architects, and developers.
- Work closely with a team of data engineers and data analysts to procure, blend, and analyze data for quality and distribution, ensuring key elements are harmonized and modeled for effective analytics while operating in a fluid, rapidly changing data environment.
- Build data pipelines from a wide variety of sources.
- Demonstrate strong conceptual, analytical, and problem-solving skills, and the ability to articulate ideas and technical solutions effectively to external IT partners as well as internal data team members.
- Work with cross-functional teams, on-shore/off-shore, development/QA teams, and vendors in a matrixed environment for data delivery.
- Backtrack and troubleshoot failures and provide fixes as needed.
- Update and maintain key data cloud solution deliverables and diagrams.
- Ensure conformance and compliance using Georgia-Pacific data architecture guidelines and the enterprise data strategic vision.
- May participate in a 24x7 on-call rotation once development is complete.

Who You Are (Basic Qualifications)
- Bachelor's degree in Computer Science, Engineering, or a related IT area, with at least 5 years of experience in software development.
- Primary skill set: data engineering; Python (especially strong in object-oriented programming concepts); AWS (Glue, Lambda, EventBridge, Step Functions, and serverless architecture); columnar databases (Redshift or Snowflake); Matillion (or any ETL tool).
- Secondary skill set: working with APIs, Spark, Git/CI-CD, SQL, Step Functions.
- At least 2 years of hands-on experience in designing, implementing, and managing large-scale ETL solutions.
- At least 3 years of hands-on experience in business intelligence, data modelling, data engineering, ETL, multi-dimensional data warehouses, and cubes, with expertise in relevant languages and frameworks such as SQL and Python.
- Hands-on experience with designing and fine-tuning queries in Redshift.
- Strong knowledge of data engineering, data warehousing, OLAP, and database concepts.
- Understanding of common DevSecOps/DataOps and CI/CD processes, methodologies, and technologies such as GitLab and Terraform.
- Ability to analyze large, complex data sets to resolve data quality issues.

What Will Put You Ahead
- AWS certifications such as Solutions Architect (SAA/SAP) or Data Analytics Specialty (DAS).
- Hands-on experience with AWS data technologies and at least one full-life-cycle project experience building a data solution in AWS.
- Exposure to visualization tools such as Tableau or Power BI.
- Experience with OLAP technologies and data virtualization (using Denodo).
- Knowledge of accounting and finance.
Domnic Lewis Private Limited
Chennai
25.0 - 30.0 Lacs P.A.