7 - 11 years
22 - 25 Lacs
Hyderabad
Work from Office
Tecovas is the first direct-to-consumer western brand, founded with the simple goal of making the world's best western boots, apparel, and leather goods, and selling them at a fair price. We are a brand revolutionizing a category and welcoming first-time boot buyers and western enthusiasts alike. Tecovas is looking for a Senior Data Engineer to join our growing and dynamic Data Team. This position will play an integral role in democratizing data access and use across all departments at Tecovas. Reporting directly to the Director of Data, you will help build out the company's data pipelines, data warehouse, and other data products, and play a key role in ensuring Tecovas has a best-in-class data practice. This candidate is strongly encouraged to work from our HQ office in Austin, TX, with the ability to work remotely on other days.

What you'll do:
- Develop and maintain scalable and efficient ELT pipelines to gather and store data across all departments at Tecovas
- Coordinate cross-functionally to ensure all relevant data is captured for analysis and reporting
- Collaborate with Data Science, Analytics, Core Systems, and the rest of the Tech team to support advanced data projects
- Assist in maintaining and improving data transformation models in dbt
- Advance data monitoring, security, and compliance efforts
- Manage Airflow, Cloud Functions, and other cloud infrastructure to ensure cost-effective solutions with minimal downtime
- Improve internal tech documentation and business-facing documentation / data dictionary
- Develop and support Data Science and Advanced Analytics pipelines with creative and unique data engineering solutions

Experience we're looking for:
- Bachelor's degree in computer science, engineering, or a related field
- 5+ years of experience as a data engineer, data scientist, or data analyst
- Expertise with modern data engineering best practices, including CDC, observability, quality testing, and performance and cost optimization
- Strong experience with Python, SQL, and Git
- Strong experience with dbt
- Experience with Fivetran, Stitch, or other ETL/ELT tools
- Experience with BigQuery, Airflow, and other cloud-based engineering tools. We use GCP, but other relevant experience will be considered
- Excellent interpersonal and communication skills

What you bring to the table: You are highly organized and a self-starter. You feel confident working in a fast-paced environment. You are able to quickly learn new systems and implement new procedures. You can easily collaborate with cross-functional partners. You have a positive attitude and are motivated by a challenge.
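For context on the stack this listing names, here is a minimal sketch of the kind of Airflow-orchestrated ELT pipeline it describes, assuming Airflow 2.4+ and its TaskFlow API; the DAG id, task bodies, and table name are hypothetical stubs, not anything from the posting:

```python
# A minimal sketch of a daily ELT DAG, assuming Airflow 2.4+ with the
# TaskFlow API. All names here are hypothetical illustrations.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def orders_elt():
    @task
    def extract():
        # Stand-in for pulling incremental rows from a source system.
        return [{"order_id": 1, "total": 250.0}]

    @task
    def load(rows):
        # Stand-in for streaming rows into a warehouse staging table.
        print(f"loading {len(rows)} rows into staging.orders")

    load(extract())


orders_elt()
```

Setting catchup=False keeps Airflow from backfilling every missed interval on first deploy, a common cost-control default in line with the listing's emphasis on cost-effective infrastructure.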
Posted 3 months ago
7 - 12 years
18 - 30 Lacs
Pune, Delhi NCR, Bengaluru
Hybrid
Role & responsibilities:
- Design, develop, and maintain robust and scalable data pipelines using modern ETL/ELT tools and techniques.
- Implement and manage data orchestration tools such as DBT, Fivetran, Stitch, or Matillion.
- Build and optimize data models for various analytical and reporting needs.
- Ensure data quality and integrity through rigorous testing and validation.
- Monitor and troubleshoot data pipelines and infrastructure, proactively identifying and resolving issues.
- Collaborate with data scientists and analysts to understand their data requirements and provide support.
- Stay up-to-date with the latest data engineering trends and technologies.
- Contribute to the development and improvement of our data engineering best practices.
- Mentor junior data engineers and provide technical guidance.
- Participate in code reviews and contribute to a collaborative development environment.
- Document data pipelines and infrastructure for maintainability and knowledge sharing.
- Contribute to the architecture and design of our overall data platform.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 7+ years of proven experience as a Data Engineer, preferably in a fast-paced environment.
- Deep understanding of data warehousing concepts and best practices.
- Hands-on experience with at least one data orchestration tool (DBT, Fivetran, Stitch, Matillion).
- Proficiency in SQL and extensive experience with data modeling.
- Experience with cloud-based data warehousing solutions (e.g., Snowflake, BigQuery, Redshift).
- Experience with programming languages like Python or Scala is highly preferred.
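Since this role stresses ensuring data quality through rigorous testing and validation, here is a minimal sketch of the kind of post-load check such pipelines typically gate on, assuming any DB-API connection; sqlite3 stands in for the warehouse so the snippet runs locally, and the table and column names are hypothetical:

```python
# A minimal sketch of a post-load data quality gate. sqlite3 is a local
# stand-in for a real warehouse; table/column names are hypothetical.
import sqlite3


def assert_no_nulls(conn, table, column):
    """Raise if any NULLs slipped into a required column."""
    (nulls,) = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()
    if nulls:
        raise ValueError(f"{table}.{column} has {nulls} NULL value(s)")


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, total REAL)")
conn.execute("INSERT INTO orders VALUES (1, 250.0), (2, 99.5)")
assert_no_nulls(conn, "orders", "order_id")  # passes on clean data
```

A failed check raises before downstream models run, broadly the same gate that dbt schema tests provide declaratively.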
Posted 3 months ago
5 - 8 years
12 - 22 Lacs
Coimbatore
Remote
About the Role: We are looking for a skilled Freelance BI/Analytics Engineer to join our team and help implement and maintain financial reporting systems based on data from an online marketplace and ERP systems. You will work with Snowflake as the data warehouse and Looker for BI and reporting. This role will involve building and maintaining scheduled jobs, creating DataMarts, and supporting ad hoc data analysis needs across various business use cases. If you have experience with financial data analysis, data warehousing, and business intelligence tools (especially Looker), and are comfortable working remotely, we'd love to hear from you!

Key Responsibilities:
- Implement financial reports using data from an online marketplace and ERP systems in a Snowflake Data Warehouse and Looker setup.
- Build and maintain scheduled jobs for automating data processes and reporting tasks using any job scheduler (e.g., cron or Airflow).
- Create and maintain DataMarts that support financial reports and business data analysis.
- Provide ad-hoc data analysis for various business use cases, including financial, operational, and strategic insights.
- Collaborate with business and technical teams to define reporting needs and ensure the data meets business requirements.
- Manage access control mechanisms for sensitive financial and accounting data, ensuring data security and integrity.
- Troubleshoot and optimize reporting pipelines and jobs to improve performance and accuracy.

Must-Have Skills:
- Strong proficiency in SQL, with experience in a Snowflake Data Warehouse environment.
- Proven experience in data warehousing, particularly the creation of small to medium DataMarts (star/snowflake schema).
- Experience working with large datasets in a marketplace or FinTech environment.
- Expertise in developing and maintaining financial reports using BI tools (Looker preferred).
- Familiarity with financial or accounting data, including reporting and analysis of such data.
- Ability to work independently, remotely, and collaboratively in a distributed team.
- Availability to work at least 4-5 hours during US Central morning hours (8:00 PM to 1:00 AM IST) on all working days.
- Excellent communication skills to work effectively with US-based clients, including understanding requirements, providing clear updates, and addressing queries.

Nice-to-Have Skills:
- Experience with modern ETL tools like Fivetran, DBT, or similar.
- Familiarity with advanced data analytics and dashboarding capabilities in Looker or other BI platforms.
- Experience with other job schedulers (e.g., Airflow, cron) for task automation.
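As an illustration of the DataMart work this listing describes, here is a minimal sketch of materializing a tiny star schema from staged marketplace data; sqlite3 stands in for Snowflake so the example runs locally, and every table and column name is hypothetical:

```python
# A minimal sketch of building a star-schema DataMart from staged data.
# sqlite3 is a local stand-in for Snowflake; all names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stg_sales (sale_id INT, sold_on TEXT, sku TEXT, amount REAL);
INSERT INTO stg_sales VALUES
    (1, '2024-01-05', 'SKU-01', 250.0),
    (2, '2024-01-05', 'SKU-02', 49.0);

-- Dimension table: one row per distinct product.
CREATE TABLE dim_product AS SELECT DISTINCT sku FROM stg_sales;

-- Fact table: measures keyed to the dimension, ready for BI queries.
CREATE TABLE fct_sales AS
SELECT s.sale_id, s.sold_on, s.sku, s.amount
FROM stg_sales AS s JOIN dim_product AS p USING (sku);
""")
print(conn.execute("SELECT COUNT(*) FROM fct_sales").fetchone())  # (2,)
```

In practice the same CREATE TABLE ... AS SELECT pattern would run in Snowflake on a schedule (cron or Airflow, per the listing), with Looker pointed at the resulting fact and dimension tables.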
Posted 3 months ago