Posted: 2 months ago
Work from Office
Full Time
We're on the lookout for the brightest, most committed individuals to join us on our mission. Along the journey, we'll provide you with a nurturing environment where you can be part of something truly extraordinary and make a real difference for companies and the planet.

What you'll do:
- Advanced Execution & Data Management: Oversee and manage intricate project tasks, providing insights and direction on advanced data ingestion, transformation, validation, and publishing.
- Review and analyse the data provided by the customer, along with its technical/functional intent and interdependencies.
- Engage proactively with functional teams to ensure a thorough understanding of end-to-end data flows as they relate to the technical integration.
- Build data ingress and egress pipelines, handle large volumes of data, and develop data transformation functions using tools and languages such as SSIS, Python, PySpark, and SQL.
- Integrate a variety of data sources, such as Teradata, SAP ERP, SQL Server, Oracle, Sybase, ODBC connectors, and flat files, through API or batch interfaces.
- Production Deployment and Hypercare: Assist with production deployment tasks; help triage issues, test, and identify root causes; respond to and resolve batch automation disruptions promptly to meet customer SLAs with accurate, on-time results.
- Technical Leadership & Coding Oversight: Guide and review code developed by junior consultants, ensuring alignment with best practices. Incorporate o9 ways of working and embed industry standards for smoother project execution.

What you should have:
- 3+ years of experience in data architecture, data engineering, or a related field, with a strong focus on data modelling, ETL processes, and cloud-based data platforms.
- Hands-on experience with Python, PySpark, and SQL, along with workflow management tools like Airflow and SSIS.
- Experience working with Parquet, JSON, RESTful APIs, HDFS, Delta Lake, and query frameworks like Hive and Presto.
- Advanced working SQL knowledge and experience with relational databases, including query authoring, as well as working familiarity with a variety of databases.
- Working experience with version control platforms, e.g. GitHub or Azure DevOps.
- Familiarity with Agile methodology.
- A proactive mindset and the right attitude to embrace the agility of learning.
- Excellent verbal and written communication skills.

Good to have:
- Hands-on experience with Delta Lake.
- Experience with supply chain planning applications.
- Experience with Amazon Web Services (AWS), Azure, or Google Cloud infrastructure.

What we'll do for you:
- Competitive salary, with stock options for eligible candidates.
- A flat organization with a very strong entrepreneurial culture (and no corporate politics).
- Great people and unlimited fun at work.
- The possibility to make a difference in a scale-up environment.
- The opportunity to travel onsite in specific phases, depending on project requirements.
- A support network: work with a team you can learn from every day.
- Diversity: we pride ourselves on our international working environment.
Hyderabad, Telangana, India
Salary: Not disclosed