
2 ETL Deployment Jobs

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

Experience: 8.0 - 12.0 years

Salary: 0 Lacs

Location: Hyderabad, Telangana

Work mode: On-site

As a Teradata Analyst, you will draw on 8+ years of experience in data analysis, ETL deployment, and data warehouse support using tools such as Informatica PowerCenter, SQL, Teradata, and Unix. Your primary responsibilities include applying data analysis techniques, data modeling, and quality assurance methods to establish, modify, or maintain data structures and associated components. You will design, code, test, correct, and document complex programs or processes against agreed specifications and standards to deliver well-engineered results, and you will design and build large ETL mappings, packages, and workflows to deliver efficient, scalable data warehouse solutions in a high-data-growth environment.

Hands-on technology leadership with Teradata and Informatica is essential. You should have a strong understanding of data warehousing and ETL concepts, proficiency in SQL and PL/SQL coding and scripting, and experience in requirements analysis and architecture design. You will collaborate with onsite teams on requirements gathering, development, and testing, and your in-depth knowledge of Agile processes and principles will help you oversee team activities covering coding, unit testing, system testing, and defect resolution. Excellent communication, presentation, organizational, and time management skills are essential for this role.

In return, we offer a competitive salary and benefits package, a culture focused on talent development with quarterly promotion cycles, and company-sponsored higher education and certifications. You will work with cutting-edge technologies and take part in employee engagement initiatives such as project parties, flexible work hours, long-service awards, and annual health check-ups. Insurance coverage includes group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents.

We foster a values-driven, people-centric work environment that enables employees to accelerate professional and personal growth, make a positive impact using the latest technologies, enjoy collaborative innovation, and unlock global opportunities to work and learn with the industry's best. Join us at Persistent to unleash your full potential. Persistent is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind.
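To make the ETL deployment and quality assurance duties above concrete, here is a minimal sketch of a post-load validation step in Python, assuming the open-source teradatasql driver; the host, credentials, and table names are hypothetical placeholders, not details from the posting.

```python
# Hypothetical post-load data-quality checks for a Teradata warehouse table.
# Assumes `pip install teradatasql`; connection details and table names are placeholders.
import teradatasql

QUALITY_CHECKS = {
    "row_count": "SELECT COUNT(*) FROM sales_dw.fact_orders",
    "null_keys": "SELECT COUNT(*) FROM sales_dw.fact_orders WHERE customer_id IS NULL",
}

def run_checks(host: str, user: str, password: str) -> dict:
    """Run simple validation queries after an ETL workflow completes."""
    results = {}
    con = teradatasql.connect(host=host, user=user, password=password)
    try:
        cur = con.cursor()
        for name, sql in QUALITY_CHECKS.items():
            cur.execute(sql)
            results[name] = cur.fetchone()[0]
    finally:
        con.close()
    return results

if __name__ == "__main__":
    # A non-zero null_keys count would typically fail the deployment gate.
    print(run_checks("td-prod.example.com", "etl_user", "****"))
```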

Posted 1 week ago


Experience: 3.0 - 8.0 years

Salary: 12 - 20 Lacs

Location: Bengaluru

Work mode: Hybrid

Role: Senior Data Science Engineer

Sabre is the global leader in innovative technology that leads the travel industry. We are always looking for bright and driven people who have a penchant for technology, are hackers at heart, and want to hone their skills. If you are interested in challenging work, being part of a global team, and solving complex problems through technology, business intelligence and analytics, and Agile practices, then Sabre is right for you! It is our people who develop and deliver powerful solutions that meet the current and future needs of our airline, hotel, and travel agency customers.

Sabre is seeking a talented Senior Data Science Engineer for the SabreMosaic team. At Sabre, we're passionate about building data science and data engineering solutions that solve real problems. In this role you will plan, design, develop, and test data science and data engineering software systems or applications for software enhancements and new products based on cloud-based solutions.

Role and Responsibilities:
- Develop, code, test, and debug new, complex, data-driven software solutions or enhancements to existing products
- Design, plan, develop, and improve applications using advanced cloud-native technology
- Work on issues where analysis of situations or data requires in-depth knowledge of organizational objectives; implement strategic policies when selecting methods and techniques
- Encourage high coding standards, best practices, and high quality
- Regularly interact with subordinate supervisors, architects, product managers, and HR on matters concerning projects or team performance; requires the ability to change the thinking of, or gain acceptance from, others in sensitive situations without damage to the relationship
- Provide technical mentorship and cultural/competency-based guidance to teams; provide larger business/product context; mentor on specific tech stacks and technologies

Qualifications and Education Requirements:
- Minimum 4-6 years of related experience as a full-stack developer
- Expert in data engineering/data warehouse projects with Google Cloud-based data engineering solutions
- Experience designing and developing enterprise data solutions on the GCP cloud platform with native or third-party data technologies
- Good working experience with relational and NoSQL databases, including but not limited to Oracle, Spanner, and BigQuery
- Expert-level SQL skills for data manipulation (DML) and data validation
- Experience in design and development of data modeling, data design, data warehouses, data lakes, and analytics platforms on GCP
- Expertise in designing ETL data pipelines and data processing architectures for data warehouses
- Experience in technical design and building of both streaming and batch processing systems
- Good experience designing Star and Snowflake schemas for data warehouses and knowledge of dimensional data modeling (see the sketch after this list)
- Work with data scientists, the data team, and engineering teams to use the Google Cloud platform to analyze data and build data models on BigQuery, Bigtable, etc.
- Working experience integrating datasets from multiple data sources for data modeling for analytical and AI/ML models
- Take ownership of production deployment of code
- Understanding of and experience with Pub/Sub, Kafka, Kubernetes, GCP, AWS, Hive, and Docker
- Expertise in Java Spring Boot, Python, or other programming languages used for data engineering and integration projects
- Strong problem-solving and analytical skills
- AI/ML exposure, MLOps, and Vertex AI are a great advantage
- Familiarity with DevOps practices such as CI/CD pipelines
- Airline domain experience is a plus
- Excellent spoken and written communication skills
- GCP Professional Data Engineer certification is a plus
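As an illustration of the star-schema and BigQuery skills listed above, here is a minimal sketch in Python, assuming the google-cloud-bigquery client library; the project, dataset, and table names are hypothetical and not taken from the posting.

```python
# Hypothetical star-schema query on BigQuery: join a fact table to two dimensions.
# Assumes `pip install google-cloud-bigquery` and default application credentials.
from google.cloud import bigquery

def daily_revenue_by_market(project: str = "my-travel-project") -> list:
    client = bigquery.Client(project=project)
    sql = """
        SELECT d.calendar_date,
               m.market_name,
               SUM(f.revenue_usd) AS revenue_usd
        FROM `my-travel-project.analytics.fact_bookings` AS f
        JOIN `my-travel-project.analytics.dim_date`   AS d ON f.date_key = d.date_key
        JOIN `my-travel-project.analytics.dim_market` AS m ON f.market_key = m.market_key
        GROUP BY d.calendar_date, m.market_name
        ORDER BY d.calendar_date
    """
    # client.query() submits the job; .result() waits for completion.
    rows = client.query(sql).result()
    return [dict(row.items()) for row in rows]

if __name__ == "__main__":
    for record in daily_revenue_by_market():
        print(record)
```

Keeping the fact table narrow (surrogate keys plus measures) and pushing descriptive attributes into the dimension tables is the dimensional-modeling pattern the posting refers to.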

Posted 1 month ago


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies