3.0 - 7.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Senior – Snowflake

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing and Auto, Healthcare, Retail, Supply Chain, and Finance.

The opportunity
We're looking for candidates with strong technology and data understanding in the big data engineering space and proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
- Lead and architect the migration of a data analytics environment from Teradata to Snowflake with a focus on performance and reliability
- Develop and deploy big data pipelines in a cloud environment using the Snowflake cloud data warehouse
- ETL design, development, and migration of existing on-prem ETL routines to cloud services
- Interact with senior leaders, understand their business goals, and contribute to the delivery of the workstreams
- Design and optimize model code for faster execution

Skills and Attributes for Success
- Hands-on development experience in data warehousing and ETL
- Hands-on development experience in Snowflake
- Experience in Snowflake modelling: roles, schemas, databases
- Experience integrating with third-party tools, ETL, and DBT tools
- Experience with Snowflake advanced concepts, such as setting up resource monitors and performance tuning, is preferable
- Applying object-oriented and functional programming styles to real-world big data engineering problems using Java/Scala/Python
- Developing data pipelines to perform batch and real-time/stream analytics on structured and unstructured data
- Data processing patterns, distributed computing, and building applications for real-time and batch analytics
- Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake's products and marketing

To qualify for the role, you must
- Be a computer science graduate or equivalent with 3-7 years of industry experience
- Have working experience in an Agile-based delivery methodology (preferable)
- Have a flexible, proactive, and self-motivated working style with strong personal ownership of problem resolution
- Be an excellent communicator (written and verbal, formal and informal)
- Be a technical expert on all aspects of Snowflake
- Deploy Snowflake following best practices, including ensuring knowledge transfer so that customers are properly enabled and able to extend the capabilities of Snowflake on their own
- Work hands-on with customers to demonstrate and communicate implementation best practices on Snowflake technology
- Maintain a deep understanding of competitive and complementary technologies and vendors, and how to position Snowflake in relation to them
- Work with System Integrator consultants at a deep technical level to successfully position and deploy Snowflake in customer environments
- Provide guidance on how to resolve customer-specific technical challenges

Ideally, you'll also have
- Client management skills

What We Look For
- A minimum of 5 years of experience as an architect on analytics solutions and around 2 years of experience with Snowflake
- People with technical experience and enthusiasm to learn new things in this fast-moving environment

What Working at EY Offers
At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
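The resource-monitor setup named above is a small amount of SQL in practice. As a hedged illustration (the account identifier, credentials, and object names below are hypothetical placeholders, not part of this posting), it might look like this with the snowflake-connector-python driver:

```python
# Minimal sketch: create a Snowflake resource monitor and attach it to a
# warehouse. Account, credentials, and object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount",   # hypothetical account identifier
    user="etl_admin",
    password="***",
    role="ACCOUNTADMIN",         # resource monitors need elevated privileges
)
try:
    cur = conn.cursor()
    # Cap monthly credit spend and suspend the warehouse at the quota.
    cur.execute("""
        CREATE OR REPLACE RESOURCE MONITOR etl_monitor
          WITH CREDIT_QUOTA = 100
          TRIGGERS ON 80 PERCENT DO NOTIFY
                   ON 100 PERCENT DO SUSPEND
    """)
    cur.execute("ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = etl_monitor")
finally:
    conn.close()
```

Attaching the monitor to a warehouse is the step that enforces the quota; a monitor created but never assigned observes nothing.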
Posted 1 month ago
6.0 - 11.0 years
15 - 25 Lacs
Bengaluru
Hybrid
Job Title: Teradata Developer
Location: Bangalore
Experience: 6+ years
Type: Full-time

If you are interested, please share your updated resume at my official mail id: Mohd.hashim@thehrsolutions.in

Job Description:
We are looking for a highly skilled Teradata Developer with deep expertise in SQL development, performance tuning, and UNIX scripting. The ideal candidate will be responsible for developing, optimizing, and maintaining complex SQL queries and Teradata processes to support enterprise-level data solutions.

Key Responsibilities:
- Design, develop, and optimize complex SQL queries in Teradata
- Work with large datasets to implement ETL processes and data pipelines
- Develop and maintain UNIX shell scripts for job automation and data processing
- Perform performance tuning and query optimization
- Collaborate with business analysts and data engineers to meet data requirements

Required Skills:
- Expert-level SQL and Teradata development
- Strong experience with UNIX
- Good understanding of data warehousing concepts
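For a sense of the performance-tuning work described, a minimal sketch follows using teradatasql, Teradata's official Python driver. The host, credentials, and table names are invented for illustration; inspecting the optimizer's EXPLAIN output is a common first step before tuning:

```python
# Minimal sketch: pull the Teradata optimizer plan for a query with the
# official teradatasql driver. Connection details and tables are placeholders.
import teradatasql

query = """
SELECT acct_id, SUM(txn_amt)
FROM edw.transactions
GROUP BY acct_id
"""

with teradatasql.connect(host="td.example.com", user="dev_user", password="***") as con:
    with con.cursor() as cur:
        cur.execute("EXPLAIN " + query)   # returns the plan; moves no data
        for (step,) in cur.fetchall():
            print(step)
```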
Posted 1 month ago
3.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
You Lead the Way. We've Got Your Back. With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities and each other. Here, you'll learn and grow as we help you create a career journey that's unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally. At American Express, you'll be recognized for your contributions, leadership, and impact; every colleague has the opportunity to share in the company's success. Together, we'll win as a team, striving to uphold our company values and powerful backing promise to provide the world's best customer experience every day. And we'll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong. Join Team Amex and let's lead the way together.

- Experience writing software in Python or similar
- Experience with data structures, algorithms, and software design
- Exposure to data science, including predictive modelling
- Develop algorithms for multilingual conversational systems
- Solve real-world scenarios for user commands and requests by identifying the right LLM models, tooling, and frameworks
- Proven experience developing and working with large language models (GPT-3, BERT, T5, etc.) and productionizing them on the cloud
- Strong foundation in machine learning concepts and techniques, including deep learning architectures, natural language processing, and text generation
- Proficiency in programming languages such as Python, and in related libraries such as TensorFlow and PyTorch, for model development and deployment
- Demonstrated ability to design, train, fine-tune, and optimize large language models for specific tasks
- Expertise in pre-processing and cleaning large datasets for training models
- Familiarity with data augmentation techniques to enhance model performance
- Knowledge of LLM operations, including evaluating model performance using appropriate metrics and benchmarks
- Ability to iterate and improve models based on evaluation results
- Experience deploying language models in production environments and integrating them into applications, platforms, or services
- Experience building predictive models using machine learning through all phases of development, from design through training, evaluation, validation, and implementation
- Experience with modern AI/ML and NLP frameworks (e.g., TensorFlow), dialogue managers (e.g., Rasa), search (e.g., Google BERT, GPT-3), and parsers (e.g., Dialogflow)
- Review architecture and provide technical guidance for engineers
- Perform statistical analysis of results and refine models
- Experience with various data architectures, the latest tools, and current and future trends in the data engineering space, especially Big Data, streaming, and cloud technologies like GCP, AWS, and Azure
- Hands-on experience with Big Data technologies (Spark, Kafka, Hive, etc.) and at least one Big Data implementation on platforms like Cornerstone, Teradata, etc.
- Experience with visualization tools like Tableau, Power BI, etc.
- Experience with complex, high-volume, multi-dimensional data, as well as ML/AI models for unstructured, structured, and streaming datasets
- Knowledge of cloud computing, including virtualization, hosted services, multi-tenant cloud infrastructures, storage systems, and content delivery networks
- Exposure to building cloud-native platforms on a modern tech stack: AWS, Java, Spring Framework, RESTful APIs, and container-based applications
- Ability to learn new tools and paradigms in data engineering and science
- Proven experience attracting, hiring, retaining, and leading top engineering talent
- Creative, passionate, and experienced leader of both people and technology
- Team management savvy (e.g., planning, budgetary control, people management, vendor management, etc.)
- Experience with DevOps, reliability engineering, and platform monitoring
- Well versed in Agile, DevOps, and program management methods
- Bachelor's degree, with a preference for Computer Science

Minimum Qualifications
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related STEM field
- 3+ years of experience with Java, microservices, and the React framework
- 3 years of experience applying Agile methodologies
- 1 year of experience with public cloud platform (GCP, AWS, ...) optimization, enabling managed and serverless services

Preferred Qualifications
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related STEM field
- 3+ years of experience with Python, microservices, and the React framework

We back our colleagues and their loved ones with benefits and programs that support their holistic well-being. That means we prioritize their physical, financial, and mental health through each stage of life. Benefits include:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
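As a rough illustration of the LLM work named above (not American Express's actual stack), a pre-trained model can be loaded and queried for a classification-style task with Hugging Face transformers. The model choice, label count, and example utterance are assumptions for the sketch:

```python
# Minimal sketch: load a pre-trained BERT model and classify one utterance.
# Model, labels, and input text are illustrative placeholders.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2   # untrained head; fine-tune before use
)
model.eval()

inputs = tokenizer("Book me a flight to Mumbai tomorrow", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())  # predicted class index
```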
Posted 1 month ago
3.0 years
0 Lacs
Telangana, India
On-site
Our company:
At Teradata, we believe that people thrive when empowered with better information. That's why we built the most complete cloud analytics and data platform for AI. By delivering harmonized data, trusted AI, and faster innovation, we uplift and empower our customers, and our customers' customers, to make better, more confident decisions. The world's top companies across every major industry trust Teradata to improve business performance, enrich customer experiences, and fully integrate data across the enterprise.

What You'll Do
This role will require you to:
- Design, develop, and maintain scalable and high-performing database features
- Write efficient, scalable, and clean code, primarily in C/C++
- Collaborate with cross-functional teams to define, design, and ship new features
- Ensure the availability, reliability, and performance of deployed applications
- Integrate with CI/CD pipelines to facilitate seamless deployment and development cycles
- Monitor and optimize application performance and troubleshoot issues as needed
- Evaluate, investigate, and tune/optimize the performance of the application
- Resolve customer incidents and provide support to Customer Support and Operations teams

You will be successful in achieving measurable improvements in software performance and user satisfaction.

Who You'll Work With
You will join a high-performing engineering team with:
- An emphasis on innovation, continuous learning, and open communication
- A strong focus on mutual respect and empowering team members
- A commitment to celebrating diverse perspectives and fostering professional growth

This is an Individual Contributor role working closely with team members and reporting to an Engineering Manager.

What Makes You a Qualified Candidate
- B.Tech/M.Tech/MCA in CSE disciplines
- 3-5 years of relevant industry experience
- Expert-level knowledge of C/C++
- Working experience with data structures, REST APIs, and Parquet files in Linux environments
- Experience in one or more public cloud environments: AWS, Azure, or Google Cloud
- Experience with Delta Lake and Iceberg architecture, and integration of data lakes with databases
- Work experience with reading and writing the catalog, metadata, manifest file, and manifest list structures of Iceberg and Delta Lake

What You'll Bring
You will be a preferred candidate if you have:
- Working knowledge of the Teradata database
- A proactive and solution-oriented mindset with a passion for technology and continuous learning
- An ability to work independently and take initiative while contributing to the team's success
- Creativity and adaptability in a dynamic environment
- A strong sense of ownership, accountability, and a drive to make an impact

Why We Think You'll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.
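For context on the Parquet and lakehouse-metadata plumbing this role describes, here is a small hedged sketch using pyarrow. The production code for this role would be C/C++; the file path is a placeholder:

```python
# Minimal sketch: inspect a Parquet file's schema and row-group metadata,
# the kind of file-format detail Iceberg/Delta Lake integration relies on.
import pyarrow.parquet as pq

pf = pq.ParquetFile("warehouse/part-00000.parquet")  # placeholder path
print(pf.schema_arrow)                  # column names and types
md = pf.metadata
print(md.num_rows, md.num_row_groups)   # table-level stats
rg = md.row_group(0)
print(rg.column(0).statistics)          # per-column min/max and null counts
```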
Posted 1 month ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Our Company
Teradata is the connected multi-cloud data platform company for enterprise analytics. Our enterprise analytics solve business challenges from start to scale. Only Teradata gives you the flexibility to handle the massive and mixed data workloads of the future, today. The Teradata Vantage architecture is cloud native, delivered as-a-service, and built on an open ecosystem. These design features make Vantage the ideal platform to optimize price performance in a multi-cloud environment.

What You'll Do
The Principal Data Scientist (pre-sales) is an experienced and expert Data Scientist, able to provide industry thought-leadership on analytics and its application across industries and across use-cases. The Principal Data Scientist supports the account team in framing business problems and in identifying analytic solutions that leverage Teradata technology and that are disruptive, innovative, and above all, practical. An articulate and compelling communicator, the Principal Data Scientist establishes our position as an important partner for advanced analytics with customers and prospects and is a trusted advisor to executives, senior managers, and fellow data scientists alike across a range of target accounts. They are also a hands-on practitioner who is ready, willing, and able to roll up their sleeves and deliver POC and short-term pre-sales engagements. The Principal Data Scientist has an excellent theoretical and practical understanding of statistics and machine learning and has a strong track record of applying this understanding at scale to drive business benefit. They are insanely curious, natural problem-solvers, and able to effectively promote Teradata technology and solutions to our customers.

Who You'll Work With
The successful candidate will work with other expert team members to:
- Provide pre-sales support at an executive level to the Teradata account teams at the local country, Geo, and International Theatre level, helping them to position and sell complex analytic solutions that drive sales of Teradata software
- Provide strategic pre-sales consulting to executives and senior managers in our target market
- Support the delivery of PoC and PoV projects that demonstrate the viability and applicability of analytic use-cases and the superiority of Teradata solutions and services
- Work with the extended account team and Sales Analytics Specialists to develop new analytic propositions that are aligned with industry trends and customer requirements

What Makes You a Qualified Candidate
- Proven hands-on experience of complex analytics at scale, for example in the areas of IoT and sensor data
- Experience with Teradata partners' analytical products, cloud ML services such as Azure ML and Amazon SageMaker, and partner products such as Dataiku and H2O
- Strong hands-on programming skills in at least one major analytic programming language and/or tool in addition to SQL
- A strong understanding of data engineering and database systems
- Recognised in the local country, Geo, and International Theatre as the go-to expert

What You'll Bring
- Expertise in data science with a strong theoretical grounding in statistics, advanced analytics, and machine learning, and at least 10 years of real-world experience in the application of advanced analytics
- A passion for knowledge sharing and a demonstrated commitment to continuous professional development
- A belief in Teradata's analytic solutions and services, and a commitment to working with the product, engineering, and consulting teams to ensure that they continue to lead the market
- An ability to turn complex technical subject matter into relatable, easy-to-digest content for senior audiences
- A degree-level qualification (preferably a Master's or PhD) in Statistics, Data Science, the physical or biological sciences, or a related discipline

Why We Think You'll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.
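As an illustrative, deliberately simplified example of the IoT/sensor-data analytics the posting refers to, a rolling z-score can flag anomalous readings. The file, column names, and threshold here are assumptions for the sketch, not a Teradata deliverable:

```python
# Minimal sketch: flag sensor readings more than 3 standard deviations
# from a one-hour rolling mean. Inputs are hypothetical.
import pandas as pd

readings = pd.read_csv("sensor_readings.csv", parse_dates=["ts"])
readings = readings.sort_values("ts").set_index("ts")

window = readings["temperature"].rolling("1h")
z = (readings["temperature"] - window.mean()) / window.std()
anomalies = readings[z.abs() > 3]   # assumed anomaly threshold
print(anomalies.head())
```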
Posted 1 month ago
10.0 years
0 Lacs
Maharashtra, India
On-site
Our Company
Teradata is the connected multi-cloud data platform company for enterprise analytics. Our enterprise analytics solve business challenges from start to scale. Only Teradata gives you the flexibility to handle the massive and mixed data workloads of the future, today. The Teradata Vantage architecture is cloud native, delivered as-a-service, and built on an open ecosystem. These design features make Vantage the ideal platform to optimize price performance in a multi-cloud environment.

What You'll Do
The Principal Data Scientist (pre-sales) is an experienced and expert Data Scientist, able to provide industry thought-leadership on analytics and its application across industries and across use-cases. The Principal Data Scientist supports the account team in framing business problems and in identifying analytic solutions that leverage Teradata technology and that are disruptive, innovative, and above all, practical. An articulate and compelling communicator, the Principal Data Scientist establishes our position as an important partner for advanced analytics with customers and prospects and is a trusted advisor to executives, senior managers, and fellow data scientists alike across a range of target accounts. They are also a hands-on practitioner who is ready, willing, and able to roll up their sleeves and deliver POC and short-term pre-sales engagements. The Principal Data Scientist has an excellent theoretical and practical understanding of statistics and machine learning and has a strong track record of applying this understanding at scale to drive business benefit. They are insanely curious, natural problem-solvers, and able to effectively promote Teradata technology and solutions to our customers.

Who You'll Work With
The successful candidate will work with other expert team members to:
- Provide pre-sales support at an executive level to the Teradata account teams at the local country, Geo, and International Theatre level, helping them to position and sell complex analytic solutions that drive sales of Teradata software
- Provide strategic pre-sales consulting to executives and senior managers in our target market
- Support the delivery of PoC and PoV projects that demonstrate the viability and applicability of analytic use-cases and the superiority of Teradata solutions and services
- Work with the extended account team and Sales Analytics Specialists to develop new analytic propositions that are aligned with industry trends and customer requirements

What Makes You a Qualified Candidate
- Proven hands-on experience of complex analytics at scale, for example in the areas of IoT and sensor data
- Experience with Teradata partners' analytical products, cloud ML services such as Azure ML and Amazon SageMaker, and partner products such as Dataiku and H2O
- Strong hands-on programming skills in at least one major analytic programming language and/or tool in addition to SQL
- A strong understanding of data engineering and database systems
- Recognised in the local country, Geo, and International Theatre as the go-to expert

What You'll Bring
- Expertise in data science with a strong theoretical grounding in statistics, advanced analytics, and machine learning, and at least 10 years of real-world experience in the application of advanced analytics
- A passion for knowledge sharing and a demonstrated commitment to continuous professional development
- A belief in Teradata's analytic solutions and services, and a commitment to working with the product, engineering, and consulting teams to ensure that they continue to lead the market
- An ability to turn complex technical subject matter into relatable, easy-to-digest content for senior audiences
- A degree-level qualification (preferably a Master's or PhD) in Statistics, Data Science, the physical or biological sciences, or a related discipline

Why We Think You'll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.
Posted 1 month ago
10.0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
Our Company
Teradata is the connected multi-cloud data platform company for enterprise analytics. Our enterprise analytics solve business challenges from start to scale. Only Teradata gives you the flexibility to handle the massive and mixed data workloads of the future, today. The Teradata Vantage architecture is cloud native, delivered as-a-service, and built on an open ecosystem. These design features make Vantage the ideal platform to optimize price performance in a multi-cloud environment.

What You'll Do
The Principal Data Scientist (pre-sales) is an experienced and expert Data Scientist, able to provide industry thought-leadership on analytics and its application across industries and across use-cases. The Principal Data Scientist supports the account team in framing business problems and in identifying analytic solutions that leverage Teradata technology and that are disruptive, innovative, and above all, practical. An articulate and compelling communicator, the Principal Data Scientist establishes our position as an important partner for advanced analytics with customers and prospects and is a trusted advisor to executives, senior managers, and fellow data scientists alike across a range of target accounts. They are also a hands-on practitioner who is ready, willing, and able to roll up their sleeves and deliver POC and short-term pre-sales engagements. The Principal Data Scientist has an excellent theoretical and practical understanding of statistics and machine learning and has a strong track record of applying this understanding at scale to drive business benefit. They are insanely curious, natural problem-solvers, and able to effectively promote Teradata technology and solutions to our customers.

Who You'll Work With
The successful candidate will work with other expert team members to:
- Provide pre-sales support at an executive level to the Teradata account teams at the local country, Geo, and International Theatre level, helping them to position and sell complex analytic solutions that drive sales of Teradata software
- Provide strategic pre-sales consulting to executives and senior managers in our target market
- Support the delivery of PoC and PoV projects that demonstrate the viability and applicability of analytic use-cases and the superiority of Teradata solutions and services
- Work with the extended account team and Sales Analytics Specialists to develop new analytic propositions that are aligned with industry trends and customer requirements

What Makes You a Qualified Candidate
- Proven hands-on experience of complex analytics at scale, for example in the areas of IoT and sensor data
- Experience with Teradata partners' analytical products, cloud ML services such as Azure ML and Amazon SageMaker, and partner products such as Dataiku and H2O
- Strong hands-on programming skills in at least one major analytic programming language and/or tool in addition to SQL
- A strong understanding of data engineering and database systems
- Recognised in the local country, Geo, and International Theatre as the go-to expert

What You'll Bring
- Expertise in data science with a strong theoretical grounding in statistics, advanced analytics, and machine learning, and at least 10 years of real-world experience in the application of advanced analytics
- A passion for knowledge sharing and a demonstrated commitment to continuous professional development
- A belief in Teradata's analytic solutions and services, and a commitment to working with the product, engineering, and consulting teams to ensure that they continue to lead the market
- An ability to turn complex technical subject matter into relatable, easy-to-digest content for senior audiences
- A degree-level qualification (preferably a Master's or PhD) in Statistics, Data Science, the physical or biological sciences, or a related discipline

Why We Think You'll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.
Posted 1 month ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Position - Data Engineer
Location - Pune
Experience - 6+ years

Must Have:
- Tech-savvy engineer, willing and able to learn new skills and track industry trends
- 6+ years of total experience with solid data engineering experience, especially in open-source, data-intensive, distributed environments, with experience in Big Data-related technologies like Spark, Hive, HBase, Scala, etc.
- Programming background, preferably Scala/Python
- Experience in Scala, Spark, PySpark, and Java (good to have)
- Experience in migration of data to AWS or any other cloud
- Experience in SQL and NoSQL databases
- Optional: model the data set from Teradata to the cloud
- Experience building ETL pipelines
- Experience building data pipelines in AWS (S3, EC2, EMR, Athena, Redshift) or any other cloud
- Self-starter and resourceful personality with the ability to manage pressure situations
- Exposure to Scrum and Agile development best practices
- Experience working with geographically distributed teams

Role & Responsibilities:
- Build data and ETL pipelines in AWS
- Support migration of data to the cloud using Big Data technologies like Spark, Hive, Talend, and Python
- Interact with customers on a daily basis to ensure smooth engagement
- Take responsibility for timely and quality deliveries
- Fulfill organizational responsibilities: share knowledge and experience with other groups in the organization, and conduct technical sessions and training

Education:
Bachelor's degree in Computer Science, Software Engineering, MIS, or an equivalent combination of education and experience

Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply, as there might be a suitable/unique role for you tomorrow!
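A minimal PySpark sketch of the AWS pipeline work listed above; the bucket paths, columns, and transformation are hypothetical placeholders:

```python
# Minimal sketch: read raw CSV from S3, aggregate, and write partitioned
# Parquet back to S3. Paths and column names are invented for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

orders = spark.read.csv("s3a://raw-bucket/orders/", header=True, inferSchema=True)

daily = (
    orders
    .filter(F.col("status") == "COMPLETE")
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://curated-bucket/daily_revenue/"
)
```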
Posted 1 month ago
3.0 - 5.0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
Our Company
At Teradata, we believe that people thrive when empowered with better information. That's why we built the most complete cloud analytics and data platform for AI. By delivering harmonized data, trusted AI, and faster innovation, we uplift and empower our customers, and our customers' customers, to make better, more confident decisions. The world's top companies across every major industry trust Teradata to improve business performance, enrich customer experiences, and fully integrate data across the enterprise.

What You'll Do
Teradata is looking to add a new Program Manager / Product Owner to our existing Global Sales Operations Tools & Technologies team of analytical, problem-solving, and solution-oriented product owners and program managers with experience supporting sales teams. The day-to-day focus is on implementation, adoption, hygiene, and documenting best practices while being on the leading edge of developing and representing business requirements for our Sales and Channel Partner Cloud Platform and other sales technologies. This position will work closely with Sales and GTM Leadership, Account Teams, the Partner Team (Global Alliances / Client Relationship Management), Sales Operations Managers, Technology and Enablement teams, Marketing, and IT to define and deliver channel partner technology solutions and business processes aligned with our strategy and roadmap. The ideal candidate will be data driven, intellectually curious, a fast learner, able to move quickly while maintaining focus on high-impact projects aligned to a global strategy, and able to develop and make recommendations on business technology and business process improvements. This is a full-time individual contributor position based in a Teradata office in India.

Responsibilities:
- Product Owner for the assigned capability/program area, representing the business stakeholder(s) and/or customer(s), and process owner for such designated areas and capabilities
- Define, document, and share CRM best practices to ensure sales processes and terminology are consistently understood and applied across the organization and regions
- Develop and make recommendations on business process improvements and their impacts on different business/sales and partner areas
- Build and manage relationships with cross-functional teams such as Geographic Sales Leadership, Sales Operations Managers, Marketing, and IT to ensure that tools and technologies are set up and aligned to effectively support Teradata's coverage models around the world
- Work closely with Sales Enablement to identify training needs for leadership and account team members on technology, tools, business practices, and processes
- Actively participate in roadmap identification and prioritization with Business and IT partners, managing all phases of the program/project delivery cycle and consulting on and bringing recommendations for programs and projects
- Determine the business impact of current and future technologies for the GTM organization

Who You'll Work With
You will interact directly with field sales, sales leaders, and other team members to capture feedback for sales technology and process improvements to drive adoption and deliver business value.

What Makes You a Qualified Candidate
- 3-5 years of experience as an Agile/Scrum product or process owner, or 3-5 years of experience in a Sales Operations, Sales Support, or sales-field-impacting role
- Direct experience in managing and driving value from CRM (Salesforce.com) and sales tools, and leading and partnering cross-functionally to deliver complex programs and projects
- Experience with direct sales and resellers/distribution partner processes in a SaaS/Cloud enterprise company or software vendor, and knowledge of how these processes integrate into existing systems/tools
- Experience with Salesforce Partner Relationship Management (PRM), Salesforce Communities, and partner platforms, and a good understanding of the different channel partner types, is a plus
- Business acumen, field-facing acumen, and strong analytical, troubleshooting, problem-solving, and project management skills
- Proactive and passionate: independently capable of seeking information, solving conceptual problems, corralling resources, and delivering results in challenging situations
- Ability to manage multiple concurrent projects and drive initiatives in a cross-functional environment
- Solution business consulting skills, including analysis/evaluation of business and/or system processes and functional recommendations, highly desired
- Experience working and communicating with senior executives to solve complex business problems
- Bachelor's degree in an analytical field (e.g., Computer Science, Information Systems, Business Administration, Engineering, Mathematics, or Statistics)

What You Will Bring
- Project/Program Management or Agile/Product Owner certification a plus, but not required
- Salesforce or PRM certifications a plus, but not required

Why We Think You'll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.
Posted 1 month ago
2.0 years
0 Lacs
Gurugram, Haryana
On-site
Location: Gurugram, Haryana, India
Category: Corporate
Job Id: GGN00002085
Department: Marketing / Loyalty / MileagePlus / Alliances
Job Type: Full-Time
Posted Date: 06/05/2025

Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what's next. Let's define tomorrow, together.

Description
Our Marketing and Loyalty team is the strategic force behind United's industry-leading brand and experience, supporting revenue growth by turning customers into lifelong United flyers. Our marketing communications, market research and brand teams drive travelers' familiarity and engagement with the United brand. Our product, design and service teams bring the signature United brand to life in our clubs and onboard our aircraft, ensuring a seamless, premier experience. And when customers choose United again and again, that's because the loyalty team has been hard at work crafting an award-winning program. Our loyalty team manages United MileagePlus®, building travel and lifestyle partnerships that customers can engage with every day, and works with our Star Alliance partners to ensure United can take you anywhere you want to go.

Job overview and responsibilities
United Airlines reaches out to customers and potential travelers via digital campaigns with new information, travel inspiration, personalized offers, promos, etc. The Digital Marketing & Personalized Offers team at IKC supports all such digital acquisition initiatives with insights to help strategize campaigns and analytics to help measure performance. We work closely with stakeholders in the US to bring these campaigns to life and continuously improve performance with learnings and actionable insights.
- Assist in campaign planning, targeting and audience identification; measure campaign results and performance using data analysis
- Gather and organize data from various sources using SQL/Python/R; continuously develop and demonstrate improved analysis methodologies
- Create content for and deliver presentations to United leadership and external stakeholders
- Ensure alignment and prioritization with business objectives and initiatives; help teams make faster, smarter decisions
- Conduct exploratory analysis, identify opportunities, and proactively suggest initiatives to meet marketing objectives
- Create, modify and automate reports and dashboards; take ownership of the reporting structure and metrics, and clearly and effectively communicate relevant information to decision makers using data visualization tools
- Ensure seamless stakeholder management and keep lines of communication open with all stakeholders

This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd - a wholly owned subsidiary of United Airlines Inc.

Qualifications
What's needed to succeed (Minimum Qualifications):
- Bachelor's degree
- 2+ years of experience in analytics and working with analytical tools
- Proven comfort and intellectual curiosity for working with very large data sets
- Experience in manipulating and analyzing complex, high-volume, high-dimensionality data from various sources to highlight patterns and relationships
- Proficiency in using database querying tools and writing complex queries and procedures using Teradata SQL and/or Microsoft SQL
- Familiarity with one or more reporting tools: Spotfire/Tableau/Power BI
- Advanced-level comfort with Microsoft Office, especially Excel and PowerPoint
- Ability to communicate analysis in a clear and precise manner
- High sense of ownership of work
- Ability to work under time constraints
- Must be legally authorized to work in India for any employer without sponsorship
- Must be fluent in English (written and spoken)
- Successful completion of the interview is required to meet job qualifications
- Reliable, punctual attendance is an essential function of the position

What will help you propel from the pack (Preferred Qualifications):
- Master's degree
- Bachelor's degree in a quantitative field like Math, Statistics, Analytics and/or Business
- SQL/Python/R
- Visualization tools: Tableau/Spotfire/Power BI
- Understanding of digital acquisition channels
- Strong knowledge of either Python or R
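As a hedged illustration of the campaign measurement described above (the table layout and column names are invented; in practice the data would come from a Teradata SQL extract rather than a CSV):

```python
# Minimal sketch: aggregate campaign sends into open/click rates per campaign.
# File and column names are placeholders for a warehouse extract.
import pandas as pd

sends = pd.read_csv("campaign_sends.csv")  # assumes opened/clicked are 0/1 flags

summary = (
    sends.groupby("campaign_id")
         .agg(sent=("customer_id", "count"),
              opens=("opened", "sum"),
              clicks=("clicked", "sum"))
)
summary["open_rate"] = summary["opens"] / summary["sent"]
summary["click_rate"] = summary["clicks"] / summary["sent"]
print(summary.sort_values("click_rate", ascending=False).head(10))
```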
Posted 1 month ago
0.0 - 4.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Tesco India • Bengaluru, Karnataka, India • Hybrid • Full-Time • Permanent • Apply by 30-Jun-2025

About the role
Responsible for providing support via automation while devising efficient reporting solutions in alignment with customer and business needs.

What is in it for you
At Tesco, we are committed to providing the best for you. As a result, our colleagues enjoy a unique, differentiated, market-competitive reward package, based on current industry practices, for all the work they put into serving our customers, communities and planet a little better every day. Our Tesco Rewards framework consists of pillars - Fixed Pay, Incentives, and Benefits. Total Rewards offered at Tesco is determined by four principles - simple, fair, competitive, and sustainable.
- Salary - Your fixed pay is the guaranteed pay as per your contract of employment.
- Performance Bonus - Opportunity to earn additional compensation bonus based on performance, paid annually.
- Leave & Time-off - Colleagues are entitled to 30 days of leave (18 days of Earned Leave, 12 days of Casual/Sick Leave) and 10 national and festival holidays, as per the company's policy.
- Making Retirement Tension-Free - In addition to statutory retirement benefits, Tesco enables colleagues to participate in voluntary programmes like NPS and VPF.
- Health is Wealth - Tesco promotes programmes that support a culture of health and wellness, including insurance for colleagues and their family. Our medical insurance provides coverage for dependents, including parents or in-laws.
- Mental Wellbeing - We offer mental health support through self-help tools, community groups, ally networks, face-to-face counselling, and more for both colleagues and dependents.
- Financial Wellbeing - Through our financial literacy partner, we offer one-to-one financial coaching at discounted rates, as well as salary advances on earned wages upon request.
- Save As You Earn (SAYE) - Our SAYE programme allows colleagues to transition from being employees to Tesco shareholders through a structured 3-year savings plan.
- Physical Wellbeing - Our green campus promotes physical wellbeing with facilities that include a cricket pitch, football field, badminton and volleyball courts, along with indoor games, encouraging a healthier lifestyle.

You will be responsible for
- Understanding business needs, with an in-depth understanding of Tesco processes
- Being accountable for high-quality and timely completion of specified reporting and dashboarding work
- Understanding the end-to-end process of generating reports
- Understanding the underlying data sources
- Actioning any change request received from partners
- Developing user manuals for reporting procedures and related process changes
- Handling new report development requests
- Leading the transformation of reports into new-age tools and technologies
- Providing solutions to issues related to report development and delivery
- Maintaining the log of issues, risks, and mitigation plans
- Identifying operational improvements and applying solutions and automation using Python and Alteryx
- Enhancing and developing daily, weekly, and periodic reports and dashboards using advanced Excel, advanced SQL, Hadoop, and Teradata
- Partnering with stakeholders to identify problems, collaborating with them to brainstorm the best possible reporting solution, and delivering solutions in the form of intelligent business reports/dashboards (Tableau, BI)
- Following our Business Code of Conduct and always acting with integrity and due diligence

You will need
- 2-4 years of experience in analytics delivery in any one of the domains like retail, CPG, telecom, or hospitality, for one of the following functional areas: marketing, supply chain, customer, merchandising, operations, finance, or digital (preferred)
- Advanced Excel; strong verbal and written communication
- Advanced SQL, Big Data infrastructure, Hadoop, Hive, Python, Spark
- Automation platforms: Alteryx/Python
- Advanced developer knowledge of Tableau, PowerBI
- Logical reasoning
- An eye for detail

About us
Tesco in Bengaluru is a multi-disciplinary team serving our customers, communities, and planet a little better every day across markets. Our goal is to create a sustainable competitive advantage for Tesco by standardising processes, delivering cost savings, enabling agility through technological solutions, and empowering our colleagues to do even more for our customers. With cross-functional expertise, a wide network of teams, and strong governance, we reduce complexity, thereby offering high-quality services for our customers. Tesco in Bengaluru, established in 2004 to enable standardisation and build centralised capabilities and competencies, makes the experience better for our millions of customers worldwide and simpler for over 3,30,000 colleagues.

Tesco Business Solutions: Established in 2017, Tesco Business Solutions (TBS) has evolved from a single-entity traditional shared services organisation in Bengaluru, India (from 2004) to a global, purpose-driven, solutions-focused organisation. TBS is committed to driving scale at speed and delivering value to the Tesco Group through the power of decision science. With over 4,400 highly skilled colleagues globally, TBS supports markets and business units across four locations in the UK, India, Hungary, and the Republic of Ireland. The organisation underpins everything that the Tesco Group does, bringing innovation, a solutions mindset, and agility to its operations and support functions, building winning partnerships across the business. TBS's focus is on adding value and creating impactful outcomes that shape the future of the business. TBS creates a sustainable competitive advantage for the Tesco Group by becoming the partner of choice for talent, transformation, and value creation.
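A minimal sketch of the report-automation pattern listed above, using pandas to aggregate a dataset and publish an Excel extract; the file and column names are placeholders (in practice the source would be a SQL/Hive/Teradata query):

```python
# Minimal sketch: pivot weekly sales by store and write a scheduled Excel
# report. Inputs are hypothetical stand-ins for a warehouse extract.
import pandas as pd

sales = pd.read_csv("weekly_sales.csv")

pivot = sales.pivot_table(
    index="store", columns="week", values="revenue", aggfunc="sum", fill_value=0
)

with pd.ExcelWriter("weekly_sales_report.xlsx") as writer:
    pivot.to_excel(writer, sheet_name="Revenue by Store")
```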
Posted 1 month ago
3.0 - 8.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Position: Data Analyst
Location: Gurgaon
Timings: 12:00 PM to 10:00 PM

Role Overview
- Do independent research, and analyze and present data as assigned
- Work in close collaboration with the EXL team and clients on commercial insurance actuarial projects for US/UK markets
- Understand risk and underwriting, and replicate rating methodology
- Develop and use collaborative relationships to facilitate the accomplishment of working goals
- Working experience in the P&C insurance domain for US insurance markets is a must
- Excellent written and verbal communication skills
- Facilitate data requirements while working with actuaries
- Excellent SQL skills to extract data for scheduled processes and ad hoc requests
- Automate manual processes and ETL pipelines using Python
- Utilise/help migrate existing SAS processes to SAS Viya

Key Responsibilities
- Collaborate with actuaries to understand their data and reporting needs related to premium, loss, and exposure analysis
- Build and optimize complex SQL queries to extract, join, and aggregate large datasets from multiple relational sources
- Develop and automate data pipelines in Python for ETL, data wrangling, and exploratory analytics
- Use SAS for legacy processes, statistical outputs, and ad hoc data manipulation as required by actuarial models/processes
- Validate data outputs for accuracy and consistency, troubleshoot discrepancies, and ensure data quality before delivery
- Create documentation of data logic, process flows, and metadata in Confluence and SharePoint to ensure transparency and knowledge sharing
- Contribute to continuous improvement by recommending process automation or optimization opportunities in existing workflows
- Support dashboarding or visualization needs (optional) using tools like Power BI
- Work in an agile or iterative environment with clear communication of progress, blockers, and timelines

Required Skillset
- SQL (expert level): complex joins, subqueries, window functions, CTEs, query optimization and performance tuning, and working with large tables in cloud/on-premise environments (Teradata, SQL Server, or equivalent)
- Python (intermediate to expert): data wrangling using pandas and NumPy, script automation and API consumption, and familiarity with Visual Studio, Jupyter, and modular Python scripting
- SAS (intermediate): reading/writing from/to datasets, connecting with external sources, macros, PROC SQL
- Knowledge of AWS is preferred
- Experience with commercial insurance
- Understanding of actuarial concepts such as loss triangles, reserving, and pricing
- Exposure to Git, JIRA, Confluence
- Proficiency in Excel and VBA macros (preferred)

Candidate Profile
- Bachelor's/Master's degree in engineering, economics, mathematics, actuarial sciences, or a similar technical degree; a Master's in business or financial management is also suitable
- Affiliation to IAI or IFoA, with at least 3 actuarial exams
- 3-8 years' experience in data analytics in the insurance or financial services industry with a good understanding of actuarial concepts: pricing, reserving, and/or valuation
- Demonstrated ability to work with actuarial or statistical teams in delivering high-quality data and insights
- Strong problem-solving attitude and comfort with ambiguity in requirements
- Strong ability to learn technical and business knowledge
- Outstanding written and verbal communication skills
- Excellent time and work management skills
- Able to work in a fast-paced, continuously evolving environment and ready to take up uphill challenges
- Able to understand cross-cultural differences and work with clients across the globe
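As one concrete, hedged example of the actuarial data work named above, claim payments can be pivoted into a cumulative loss development triangle with pandas; the file and column names are assumptions about the extract layout:

```python
# Minimal sketch: pivot incremental claim payments into a cumulative loss
# development triangle. Input layout is a hypothetical assumption.
import pandas as pd

claims = pd.read_csv("claim_payments.csv")  # accident_year, dev_lag, paid_amount

triangle = claims.pivot_table(
    index="accident_year", columns="dev_lag", values="paid_amount", aggfunc="sum"
)
cumulative = triangle.fillna(0).cumsum(axis=1)  # incremental -> cumulative paid
print(cumulative)
```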
Posted 1 month ago
6.0 - 11.0 years
8 - 13 Lacs
Bengaluru
Work from Office
- Overall experience of 6 years in DW/BI technologies, with a minimum of 5 years of development experience in the ETL DataStage 8.x/9.x tool
- Design, develop, and maintain ETL processes using IBM DataStage to extract, transform, and load data from multiple sources into our data warehouse
- Collaborate with business analysts and stakeholders to understand data requirements and translate them into technical specifications
- Extensive work in parallel jobs and sequences, and preferably in routines
- Good conceptual knowledge of data warehouses and various methodologies
- Strong SQL database skills in Teradata and other databases like Oracle, SQL Server, DB2, etc.
- Working knowledge of UNIX shell scripting
- Good communication and presentation skills
- Should be flexible with overlapping working hours
- Should be able to work independently and act proactively
- Develop, implement, and maintain best practices for DataStage

Mandatory skills: DataStage, SQL
Desired skills: Unix, PL/SQL
Posted 1 month ago
10.0 - 12.0 years
35 - 40 Lacs
Bengaluru
Work from Office
Gracenote, a Nielsen company, is dedicated to connecting audiences to the entertainment they love, powering a better media future for all people. Gracenote is the content data business unit of Nielsen that powers innovative entertainment experiences for the world s leading media companies. Our entertainment metadata and connected IDs deliver advanced content navigation and discovery to connect consumers to the content they love and discover new ones. Gracenote s industry-leading datasets cover TV programs, movies, sports, music and podcasts in 80 countries and 35 languages. Common identifiers Universally adopted by the world s leading media companies to deliver powerful cross-media entertainment experiences. Machine driven, human validated best-in-class data and images fuel new search and discovery experiences across every screen. Gracenotes Data Organization is a dynamic and innovative group that is essential in delivering business outcomes through data, insights, predictive & prescriptive analytics. An extremely motivated team that values creativity, experimentation through continuous learning in an agile and collaborative manner. From designing, developing and maintaining data architecture that satisfies our business goals to managing data governance and region-specific regulations, the data team oversees the whole data lifecycle. Role Overview We are seeking an experienced Senior Data Engineer with 10-12 years of experience to join our Video engineering team with Gracenote - a NielsenIQ Company. In this role, you will design, build, and maintain our data processing systems and pipelines. You will work closely with Product managers, Architects, analysts, and other stakeholders to ensure data is accessible, reliable, and optimized for Business, analytical and operational needs. Key Responsibilities Design, develop, and maintain scalable data pipelines and ETL processes Architect and implement data warehousing solutions and data lakes Optimize data flow and collection for cross-functional teams Build infrastructure required for optimal extraction, transformation, and loading of data Ensure data quality, reliability, and integrity across all data systems Collaborate with data scientists and analysts to help implement models and algorithms Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, etc. Create and maintain comprehensive technical documentation Mentor junior engineers and provide technical leadership Evaluate and integrate new data management technologies and tools Implement Optimization strategies to enable and maintain sub second latency. Oversee Data infrastructure to ensure robust deployment and monitoring of the pipelines and processes. Stay ahead of emerging trends in Data, cloud, integrating new research into practical applications. Mentor and grow a team of junior data engineers. 
Required Qualifications and Skills
Bachelor's degree in Computer Science, Engineering, or a related field; Master's degree preferred
Expert-level proficiency in Python, SQL, and big data tools (Spark, Kafka, Airflow)
Expert knowledge of SQL and experience with relational databases (e.g., PostgreSQL, Redshift, TiDB, MySQL, Oracle, Teradata)
Extensive experience with big data technologies (e.g., Hadoop, Spark, Hive, Flink)
Proficiency in at least one programming language such as Python, Java, or Scala
Experience with data modeling, data warehousing, and building ETL pipelines
Strong knowledge of data pipeline and workflow management tools (e.g., Airflow, Luigi, NiFi; see the DAG sketch below)
Experience with cloud platforms (AWS, Azure, or GCP) and their data services; AWS preferred
Hands-on experience building streaming pipelines with Flink, Kafka, or Kinesis; Flink preferred
Understanding of data governance and data security principles
Experience with version control systems (e.g., Git) and CI/CD practices
Proven leadership skills in growing data engineering teams
Preferred Skills
Experience with containerization and orchestration tools (Docker, Kubernetes)
Basic knowledge of machine learning workflows and MLOps
Experience with NoSQL databases (MongoDB, Cassandra, etc.)
Familiarity with data visualization tools (Tableau, Power BI, etc.)
Experience with real-time data processing
Knowledge of data governance frameworks and compliance requirements (GDPR, CCPA, etc.)
Experience with infrastructure-as-code tools (Terraform, CloudFormation)
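Given the Airflow requirement above, here is a minimal sketch of a daily extract-transform-load DAG; the DAG name, task bodies, and schedule are invented placeholders, and it assumes Airflow 2.4+ (which takes a `schedule` argument).

```python
# Minimal sketch: a daily extract -> transform -> load DAG with Apache
# Airflow. Task bodies and names are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    print("pull raw program metadata from the source API")

def transform(**context):
    print("normalize titles and map connected IDs")

def load(**context):
    print("write curated rows to the warehouse")

with DAG(
    dag_id="metadata_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3
```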
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Looking for an associate with 5+ years of hands-on experience in Informatica PowerCenter/ETL
Experience in batch monitoring and troubleshooting, impact analysis, and batch recovery
Good hands-on experience in SQL and RDBMS/Teradata
Proficient in working with a scheduler: Autosys/TWS/Control-M
Basic knowledge of Unix
Strong analytical skills; triaging and engaging the application teams, support groups, and DBAs
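As a hedged illustration of the batch monitoring and recovery triage this role describes, here is a minimal sketch that polls a job-status table from Python. The etl_job_log table, its columns, and the qmark parameter style are hypothetical and illustrative, not a real scheduler schema.

```python
# Minimal sketch: flag failed batch jobs for recovery by querying a
# (hypothetical) job-status table. Table and column names are
# illustrative; the "?" placeholder assumes a qmark-style DB-API
# driver such as pyodbc or teradatasql.
import datetime

def failed_jobs(cursor, run_date: datetime.date):
    cursor.execute(
        """
        SELECT job_name, start_ts, error_code
        FROM etl_job_log
        WHERE run_date = ? AND status = 'FAILED'
        """,
        (run_date,),
    )
    return cursor.fetchall()

# Usage with any DB-API connection:
# for name, started, err in failed_jobs(conn.cursor(), datetime.date.today()):
#     print(f"Recover {name}: failed at {started} with error {err}")
```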
Posted 1 month ago
3.0 - 6.0 years
4 - 7 Lacs
Bengaluru
Work from Office
Job Description: Value Proposition
Responsible for designing and building data pipelines for enterprise data through ETL/ELT processes. Develop and maintain large-scale data platforms, data lakes, and cloud solutions.
Job Details
Position Title: Data Engineer II
Career Level: P2
Job Category: Senior Associate
Role Type: Hybrid
Job Location: Bengaluru
About the Team: The data engineering team is a community of dedicated professionals committed to designing, building, and maintaining data platform solutions for the organization.
Impact (Job Summary/Why this Role Matters)
The enterprise data warehouse supports several critical business functions for the bank, including Regulatory Reporting, Finance, Risk Steering, and Customer 360. This role is vital for building and maintaining the enterprise data platform and data processes, and for supporting business objectives. Our values of inclusivity, transparency, and excellence drive everything we do. Join us and make a meaningful impact on the organization.
Key Deliverables (Duties and Responsibilities)
Build and maintain a data platform that supports data integrations for the Enterprise Data Warehouse, Operational Data Store, Data Marts, etc., with appropriate data access, data security, data privacy, and data governance.
Create data ingestion pipelines in data warehouses and other large-scale data platforms.
Create data ingestion pipelines for a variety of sources: files (flat, delimited, Excel), databases, APIs (with Apigee integration), and SharePoint.
Build reusable data pipelines/frameworks using Python (a minimal sketch follows this posting).
Create scheduled as well as trigger-based ingestion patterns using scheduling tools.
Create performance-optimized DDLs for row-based or columnar databases such as Oracle, Postgres, and Netezza, per the Logical Data Model.
Performance-tune complex data pipelines and SQL queries.
Perform impact analysis of proposed changes on existing architecture, capabilities, system priorities, and technology solutions.
Work in an Agile framework, participating in agile ceremonies and coordinating with the scrum master, tech lead, and PO on sprint planning, backlog creation, refinement, demo, and retrospection.
Work with Product Owners to understand PI goals, PI planning, requirement clarification, and delivery coordination.
Provide technical support for production incidents and failures.
Work with global technology teams across different time zones (primarily US) to deliver timely business value.
Skills and Qualifications (Functional and Technical Skills)
Functional Skills:
5+ years of experience, with 3+ years relevant to Snowflake.
Team Player: Support peers, team, and department management.
Communication: Excellent verbal, written, and interpersonal communication skills.
Problem Solving: Excellent problem-solving skills, incident management, root cause analysis, and proactive solutions to improve quality.
Partnership and Collaboration: Develop and maintain partnerships with business and IT stakeholders.
Attention to Detail: Ensure accuracy and thoroughness in all tasks.
Technical/Business Skills:
Data Engineering: Experience in designing and building data warehouses and data lakes. Good knowledge of data warehouse principles and concepts. Technical expertise working in large-scale data warehousing applications and databases such as Oracle, Netezza, Teradata, and SQL Server. Experience with public cloud-based data platforms, especially Snowflake and AWS.
Data Integration Skills: Expertise in the design and development of complex data pipeline solutions using industry-leading ETL tools such as SAP BusinessObjects Data Services (BODS), Informatica Intelligent Cloud Services (IICS), and IBM DataStage. Knowledge of ELT tools such as dbt, Fivetran, and AWS Glue.
Expert in SQL, with development experience in at least one scripting language (e.g., Python), and adept at tracing and resolving data integrity issues.
Data Modeling: Knowledge of logical and physical data models using relational or dimensional modeling practices, and of high-volume ETL/ELT processes.
Performance-tune data pipelines and DB objects to deliver optimal performance.
Experience with GitLab version control and CI/CD processes.
Experience working in the financial industry is a plus.
Relationships & Collaboration
Reports to: Associate Director - Data Engineering
Partners: Senior leaders and cross-functional teams
Collaborates: A team of Data Engineering associates
Accessibility Needs
We are committed to providing an inclusive and accessible hiring process. If you require accommodations at any stage (e.g., application, interviews, onboarding), please let us know, and we will work with you to ensure a seamless experience.
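To give a flavor of the "reusable data pipelines/frameworks using Python" bullet above, here is a minimal sketch of a source-agnostic ingestion step built on pandas and SQLAlchemy. The file paths, staging table, and connection URL are hypothetical placeholders, not the bank's actual platform.

```python
# Minimal sketch: one reusable ingestion step that loads flat, delimited,
# or Excel files into a warehouse staging table. Paths, table names, and
# the connection URL are hypothetical placeholders.
from pathlib import Path

import pandas as pd
from sqlalchemy import create_engine

READERS = {".csv": pd.read_csv, ".txt": pd.read_csv, ".xlsx": pd.read_excel}

def ingest_file(path: str, table: str, engine) -> int:
    suffix = Path(path).suffix.lower()
    df = READERS[suffix](path)                            # pick reader by extension
    df.columns = [c.strip().lower() for c in df.columns]  # normalize headers
    df.to_sql(table, engine, if_exists="append", index=False)
    return len(df)

if __name__ == "__main__":
    engine = create_engine("postgresql://etl:***@dw.example.com/staging")
    rows = ingest_file("daily_balances.xlsx", "stg_daily_balances", engine)
    print(f"Loaded {rows} rows")
```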
Posted 1 month ago
8.0 - 13.0 years
50 - 55 Lacs
Hyderabad
Work from Office
Do you pioneer? Do you enjoy solving complex problems in building and analyzing large datasets? Do you enjoy focusing first on your customer and working backwards? The Amazon transportation controllership team is looking for an experienced Data Engineering Manager with experience in architecting large/complex data systems and a strong record of achieving results, scoping, and delivering large projects end-to-end. You will be the key driver in building out our vision for scalable data systems to support the ever-growing Amazon global transportation network businesses.
As a Data Engineering Manager in Transportation Controllership, you will be at the forefront of managing large projects, providing vision to the team, and designing and planning large financial data systems that will allow our businesses to scale worldwide. You should have deep expertise in the database design, management, and business use of extremely large datasets, including using AWS technologies such as Redshift, S3, EC2, Data Pipeline, and other big data technologies. Above all, you should be passionate about warehousing large datasets together to answer business questions and drive change. You should have excellent business acumen and communication skills to work with multiple business teams, and be comfortable communicating with senior leadership. Due to the breadth of the areas of business, you will coordinate across many internal and external teams and provide visibility to the senior leaders of the company with your strong written and oral communication skills. We need individuals with a demonstrated ability to learn quickly, think big, execute both strategically and tactically, and motivate and mentor their team to deliver business value to our customers on time.
A day in the life
On a daily basis you will:
Manage and help grow a team of high-performing engineers
Understand new business requirements and architect data engineering solutions for them
Plan your team's priorities, working with relevant internal/external stakeholders, including sprint planning
Resolve impediments faced by the team
Update leadership as needed
Use judgement in making the right tactical and strategic decisions for the team and organization
Monitor the health of the databases and ingestion pipelines
Qualifications:
2+ years of experience processing data with a massively parallel technology (such as Redshift, Teradata, Netezza, Spark, or a Hadoop-based big data solution)
2+ years of experience with relational database technology (such as Redshift, Oracle, MySQL, or MS SQL)
2+ years of experience developing and operating large-scale data structures for business intelligence analytics (using ETL/ELT processes)
5+ years of data engineering experience
Experience managing a data or BI team
Experience communicating with senior management and customers verbally and in writing
Experience leading and influencing the data or BI strategy of your team or organization
Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS
Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
Experience with AWS tools and technologies (Redshift, S3, EC2)
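To ground the Redshift/S3 stack this posting names, here is a minimal sketch of bulk-loading an S3 extract into Redshift with a COPY command issued from Python. The cluster endpoint, credentials, IAM role, bucket, and table are hypothetical placeholders.

```python
# Minimal sketch: bulk-load an S3 extract into Redshift via COPY,
# issued from Python with psycopg2. Endpoint, credentials, IAM role,
# and table/bucket names are hypothetical placeholders.
import psycopg2

COPY_SQL = """
COPY transport_costs
FROM 's3://finance-extracts/transport_costs/2024-06-01/'
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-loader'
FORMAT AS CSV IGNOREHEADER 1;
"""

conn = psycopg2.connect(
    host="controllership.example.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="finance", user="loader", password="***",
)
conn.autocommit = True
with conn.cursor() as cur:
    cur.execute(COPY_SQL)
    cur.execute("SELECT COUNT(*) FROM transport_costs;")
    print("rows loaded:", cur.fetchone()[0])
conn.close()
```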
Posted 1 month ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
hackajob is collaborating with American Express to connect them with exceptional tech professionals for this role.
You Lead the Way. We’ve Got Your Back.
With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities and each other. Here, you’ll learn and grow as we help you create a career journey that’s unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally. At American Express, you’ll be recognized for your contributions, leadership, and impact; every colleague has the opportunity to share in the company’s success. Together, we’ll win as a team, striving to uphold our company values and powerful backing promise to provide the world’s best customer experience every day. And we’ll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong. Join Team Amex and let's lead the way together.
Experience writing software in Python or similar.
Experience with data structures, algorithms, and software design.
Exposure to data science, including predictive modelling.
Develop algorithms for multilingual conversational systems.
Solve real-world scenarios for user commands and requests by identifying the right LLM models, tooling, and frameworks.
Proven experience in developing and working with large language models (GPT-3, BERT, T5, etc.) and productionizing them on the cloud.
Strong foundation in machine learning concepts and techniques, including deep learning architectures, natural language processing, and text generation.
Proficiency in programming languages such as Python, and in related frameworks and libraries such as TensorFlow and PyTorch, for model development and deployment.
Demonstrated ability to design, train, fine-tune, and optimize large language models for specific tasks.
Expertise in pre-processing and cleaning large datasets for training models.
Familiarity with data augmentation techniques to enhance model performance.
Knowledge of LLM operations, including evaluating model performance using appropriate metrics and benchmarks.
Ability to iterate on and improve models based on evaluation results.
Experience in deploying language models in production environments and integrating them into applications, platforms, or services (see the sketch after this posting).
Exposure to building predictive models using machine learning through all phases of development, from design through training, evaluation, validation, and implementation.
Experience with modern AI/ML & NLP frameworks (e.g., TensorFlow), dialogue managers (e.g., Rasa), search (e.g., Google BERT, GPT-3), and parsers (e.g., Dialogflow).
Review architecture and provide technical guidance for engineers.
Perform statistical analysis of results and refine models.
Experience with various data architectures and the latest tools, plus current and future trends in the data engineering space, especially Big Data, streaming, and cloud technologies like GCP, AWS, and Azure.
Hands-on experience with Big Data technologies (Spark, Kafka, Hive, etc.) and at least one big data implementation on platforms like Cornerstone, Teradata, etc.
Experience with visualization tools like Tableau, Power BI, etc.
Experience with complex, high-volume, multi-dimensional data, as well as ML/AI models for unstructured, structured, and streaming datasets.
Knowledge of cloud computing, including virtualization, hosted services, multi-tenant cloud infrastructures, storage systems, and content delivery networks.
Exposure to building cloud-native platforms on a modern tech stack: AWS, Java, Spring Framework, RESTful APIs, and container-based applications.
Ability to learn new tools and paradigms in data engineering and science.
Proven experience attracting, hiring, retaining, and leading top engineering talent.
Creative, passionate, and experienced leader of both people and technology.
Team management savvy (e.g., planning, budgetary control, people management, vendor management, etc.).
Experience with DevOps, reliability engineering, and platform monitoring.
Well versed in Agile, DevOps, and program management methods.
Bachelor's degree, with a preference for Computer Science.
Minimum Qualifications
Bachelor's degree in Computer Science, Engineering, Information Systems, or a related STEM field
3 years of experience applying Agile methodologies
3+ years of experience with Java, microservices, and the React framework
1 year of experience with a public cloud platform (GCP, AWS, ...), optimizing and enabling managed and serverless services
Preferred Qualifications
3+ years of experience with Python, microservices, and the React framework
Benefits
We back our colleagues and their loved ones with benefits and programs that support their holistic well-being. That means we prioritize their physical, financial, and mental health through each stage of life. Benefits include:
Competitive base salaries
Bonus incentives
Support for financial well-being and retirement
Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
Flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need
Generous paid parental leave policies (depending on your location)
Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
Free and confidential counseling support through our Healthy Minds program
Career development and training opportunities
American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
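As a small, hedged illustration of the LLM development this role covers, here is a sketch of loading a pretrained model and generating text with the Hugging Face transformers pipeline API. The model choice and prompt are illustrative placeholders, not American Express tooling.

```python
# Minimal sketch: text generation with a pretrained causal LM via the
# Hugging Face "transformers" pipeline API. The model and prompt are
# illustrative placeholders, not the employer's production stack.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = "A customer asks how to redeem reward points. Draft a short reply:"
outputs = generator(prompt, max_new_tokens=60, num_return_sequences=1)
print(outputs[0]["generated_text"])
```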
Posted 1 month ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what’s next. Let’s define tomorrow, together.
Description
Our Marketing and Loyalty team is the strategic force behind United’s industry-leading brand and experience, supporting revenue growth by turning customers into lifelong United flyers. Our marketing communications, market research and brand teams drive travelers’ familiarity and engagement with the United brand. Our product, design and service teams bring the signature United brand to life in our clubs and onboard our aircraft, ensuring a seamless, premier experience. And when customers choose United again and again, that’s because the loyalty team has been hard at work crafting an award-winning program. Our loyalty team manages United MileagePlus®, building travel and lifestyle partnerships that customers can engage with every day, and works with our Star Alliance partners to ensure United can take you anywhere you want to go.
Job Overview And Responsibilities
United Airlines reaches out to customers and potential travelers via digital campaigns with new information, travel inspiration, personalized offers, promos, etc. The Digital Marketing & Personalized Offers team at IKC supports all such digital acquisition initiatives with insights to help strategize campaigns and analytics to help measure performance. We work closely with stakeholders in the US to bring these campaigns to life and continuously improve performance with learnings and actionable insights.
Ensure alignment and prioritization with business objectives and initiatives; help teams make faster, smarter decisions
Conduct exploratory analysis, identify opportunities, and proactively suggest initiatives to meet marketing objectives
Assist in campaign planning, targeting and audience identification; measure campaign results and performance using data analysis
Create content for and deliver presentations to United leadership and external stakeholders
Own workstreams to deliver results, while leading other team members
Ensure seamless stakeholder management and keep lines of communication open with all stakeholders
Create, modify and automate reports and dashboards; take ownership of reporting structure and metrics, and clearly and effectively communicate relevant information to decision makers using data visualization tools
This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd, a wholly owned subsidiary of United Airlines Inc.
Qualifications
What’s needed to succeed (Minimum Qualifications):
Bachelor's degree or 4 years of relevant work experience
4+ years of experience in analytics and working with analytical tools
Proven comfort and intellectual curiosity for working with very large data sets
Experience in manipulating and analyzing complex, high-volume, high-dimensionality data from various sources to highlight patterns and relationships
Proficiency in using database querying tools and writing complex queries and procedures using Teradata SQL and/or Microsoft SQL (see the query sketch after this posting)
Familiarity with one or more reporting tools: Spotfire/Tableau
Advanced-level comfort with Microsoft Office, especially Excel and PowerPoint
Ability to communicate analysis in a clear and precise manner
High sense of ownership of work, and ability to lead a team
Ability to work under time constraints
Must be legally authorized to work in India for any employer without sponsorship
Must be fluent in English (written and spoken)
Successful completion of an interview is required to meet job qualifications
Reliable, punctual attendance is an essential function of the position
What will help you propel from the pack (Preferred Qualifications):
Master's degree
Bachelor's degree in a quantitative field like Math, Statistics, Analytics, and/or Business
SQL/Python/R
Visualization tools: Tableau/Spotfire
Understanding of digital acquisition channels
Strong knowledge of either Python or R
GGN00002080
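As a taste of the "complex queries using Teradata SQL" this posting asks for, here is a minimal sketch of a campaign-performance query run from Python. The driver usage is real (the open-source teradatasql package), but the tables, columns, campaign ID, and credentials are hypothetical placeholders.

```python
# Minimal sketch: measure email-campaign conversion by audience segment
# with Teradata SQL run from Python. Tables, columns, and credentials
# are hypothetical placeholders.
import teradatasql

SQL = """
SELECT c.segment,
       COUNT(DISTINCT c.customer_id)                AS targeted,
       COUNT(DISTINCT b.customer_id)                AS booked,
       CAST(COUNT(DISTINCT b.customer_id) AS FLOAT)
         / NULLIF(COUNT(DISTINCT c.customer_id), 0) AS conversion_rate
FROM campaign_sends c
LEFT JOIN bookings b
  ON b.customer_id = c.customer_id
 AND b.booking_date BETWEEN c.send_date AND c.send_date + 7
WHERE c.campaign_id = ?
GROUP BY c.segment
ORDER BY conversion_rate DESC
"""

with teradatasql.connect(host="td.example.com",
                         user="analyst", password="***") as con:
    cur = con.cursor()
    cur.execute(SQL, ("SUMMER_SALE_24",))
    for row in cur.fetchall():
        print(row)
```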
Posted 1 month ago
0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
Your Role And Responsibilities
As a Software Developer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In This Role, Your Responsibilities May Include
Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
Designing and implementing various enterprise search applications, such as Elasticsearch and Splunk, for client requirements
Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
Preferred Education
Master's Degree
Required Technical And Professional Expertise
Design and implement efficient database schemas and data models using Teradata (a DDL sketch follows below).
Optimize SQL queries and stored procedures for performance.
Perform database administration tasks, including installation, configuration, and maintenance of Teradata systems.
Preferred Technical And Professional Experience
You thrive on teamwork and have excellent verbal and written communication skills.
Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
Ability to communicate results to technical and non-technical audiences.
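To make the "efficient database schemas using Teradata" requirement concrete, here is a minimal sketch of a table DDL with an explicit primary index plus statistics collection, issued from Python. The database, table, and credentials are hypothetical placeholders.

```python
# Minimal sketch: create a Teradata table with an explicit PRIMARY INDEX
# (which drives row distribution across AMPs) and collect statistics so
# the optimizer can plan joins well. All names are hypothetical.
import teradatasql

DDL = """
CREATE TABLE sandbox.customer_events (
    customer_id BIGINT NOT NULL,
    event_ts    TIMESTAMP(0) NOT NULL,
    event_type  VARCHAR(40),
    payload     VARCHAR(2000)
) PRIMARY INDEX (customer_id)
"""

with teradatasql.connect(host="td.example.com",
                         user="dev", password="***") as con:
    cur = con.cursor()
    cur.execute(DDL)
    cur.execute(
        "COLLECT STATISTICS COLUMN (customer_id) ON sandbox.customer_events"
    )
```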
Posted 1 month ago
8.0 - 10.0 years
20 - 35 Lacs
Pune
Work from Office
Sr Solutions Engineer
About the job: Senior Solutions Engineer, a mid-level position based out of Pune (5-7 years).
Job Description:
We are looking for a developer who is strong in Python and Spark, along with ETL, complex SQL, and cloud.
Primary skills: Python, databases, Spark
Secondary skills: Azure/AWS, APIs
In This Role, You Will
Develop and maintain scalable and efficient backend systems, ensuring high performance and responsiveness to requests from the front end (a minimal REST endpoint sketch follows below).
Design and implement cloud-based solutions, primarily on Microsoft Azure.
Manage and optimize CI/CD pipelines for rapid and reliable deployment of software updates.
Collaborate with frontend developers and other team members to establish objectives and design more functional, cohesive code to enhance the user experience.
Develop and maintain databases and server-side applications.
Ensure the security of the backend infrastructure.
Preferred Qualifications
5-7 years of experience developing with Python.
Experience in automation using Python.
Experience building REST APIs.
Experience with mobile application development is advantageous.
Experience working within a cloud environment.
Bachelor's degree in computer science, a related field, or equivalent related experience.
Experience with CI/CD tools like Jenkins, GitLab CI, or Azure DevOps.
In-depth understanding of database technologies (SQL and NoSQL) and web server technologies.
Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).
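Since the role centers on Python backends and REST APIs, here is a minimal, hedged sketch of a Flask endpoint pair; the "orders" resource, its fields, and the in-memory store are invented for illustration.

```python
# Minimal sketch: a REST endpoint of the kind this backend role describes,
# built with Flask. The "orders" resource and its fields are illustrative;
# an in-memory dict stands in for a real database.
from flask import Flask, jsonify, request

app = Flask(__name__)
ORDERS = {}

@app.post("/orders")
def create_order():
    body = request.get_json(force=True)
    order_id = len(ORDERS) + 1
    ORDERS[order_id] = {"id": order_id, "item": body.get("item")}
    return jsonify(ORDERS[order_id]), 201

@app.get("/orders/<int:order_id>")
def get_order(order_id: int):
    order = ORDERS.get(order_id)
    if order is None:
        return jsonify(error="not found"), 404
    return jsonify(order), 200

if __name__ == "__main__":
    app.run(debug=True)
```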
Posted 1 month ago
3.0 years
5 - 8 Lacs
Hyderābād
On-site
We are seeking an experienced and motivated Data Engineer to join our team. In this role, you will design, build, and maintain scalable data solutions to support critical business needs. You will work with distributed data platforms, cloud infrastructure, and modern data engineering tools to enable efficient data processing, storage, and analytics. The role includes participation in an on-call rotation to ensure the reliability and availability of our systems and pipelines.
Key Responsibilities
Data Platform Development: Design, develop, and maintain data pipelines and workflows on distributed data platforms such as BigQuery, Hadoop/EMR/DataProc, or Teradata.
Cloud Integration: Build and optimize cloud-based solutions using AWS or GCP to process and store large-scale datasets.
Workflow Orchestration: Design and manage workflows and data pipelines using Apache Airflow to ensure scalability, reliability, and maintainability.
Containerization and Orchestration: Deploy and manage containerized applications using Kubernetes for efficient scalability and resource management.
Event Streaming: Work with Kafka to implement reliable and scalable event streaming systems for real-time data processing (see the consumer sketch after this posting).
Programming and Automation: Write clean, efficient, and maintainable code in Python and SQL to automate data processing, transformation, and analytics tasks.
Database Management: Design and optimize relational and non-relational databases to support high-performance querying and analytics.
System Monitoring & Troubleshooting: Participate in the on-call rotation to monitor systems, address incidents, and ensure the reliability of production environments.
Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and product managers, to understand data requirements and deliver solutions that meet business objectives. Participate in code reviews, technical discussions, and team collaboration to deliver high-quality software solutions.
This role includes participation in an on-call rotation to ensure the reliability and performance of production systems:
Rotation Schedule: Weekly rotation beginning Tuesday at 9:00 PM PST through Monday at 9:00 AM PST.
Responsibilities During On-Call: Monitor system health and respond to alerts promptly. Troubleshoot and resolve incidents to minimize downtime. Escalate issues as needed and document resolutions for future reference.
Requirements:
Primary technologies: BigQuery or another distributed data platform (for example Hadoop/EMR/DataProc, Snowflake, Teradata, or Netezza), AWS, GCP, Kubernetes, Kafka, Python, SQL.
Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent work experience).
3+ years of experience in data engineering or related roles.
Hands-on experience with distributed data platforms such as BigQuery, Hadoop/EMR/DataProc, Snowflake, or Teradata.
Proficiency in Apache Airflow for building and orchestrating workflows and data pipelines.
Proficiency in Python and SQL for data processing and analysis.
Experience with cloud platforms like AWS or GCP, including building scalable solutions.
Familiarity with Kubernetes for container orchestration.
Knowledge of Kafka for event streaming and real-time data pipelines.
Strong problem-solving skills and ability to troubleshoot complex systems.
Excellent communication and collaboration skills to work effectively in a team environment.
Preferred
Familiarity with CI/CD pipelines for automated deployments.
Knowledge of data governance, security, and compliance best practices.
Experience with DevOps practices and tools.
We have a global team of amazing individuals working on highly innovative enterprise projects & products. Our customer base includes Fortune 100 retail and CPG companies, leading store chains, fast-growth fintechs, and multiple Silicon Valley startups. What makes Confiz stand out is our focus on processes and culture. Confiz is ISO 9001:2015 (QMS), ISO 27001:2022 (ISMS), ISO 20000-1:2018 (ITSM), and ISO 14001:2015 (EMS) certified. We have a vibrant culture of learning via collaboration and making the workplace fun. People who work with us work with cutting-edge technologies while contributing success to the company as well as to themselves. To know more about Confiz Limited, visit https://www.linkedin.com/company/confiz/
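For the Kafka event-streaming work this role names, here is a minimal sketch of a consumer using the kafka-python client. The topic, broker address, consumer group, and message shape are hypothetical placeholders.

```python
# Minimal sketch: a Kafka consumer for the real-time pipelines this role
# describes, using the kafka-python client. Topic, broker address, and
# message shape are hypothetical placeholders.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "clickstream-events",                      # hypothetical topic
    bootstrap_servers="kafka.example.com:9092",
    group_id="data-eng-demo",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # In a real pipeline this would land in a warehouse or trigger a DAG;
    # here we just print the partition/offset and one field.
    print(message.partition, message.offset, event.get("event_type"))
```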
Posted 1 month ago
5.0 years
0 Lacs
Hyderābād
Remote
Overview:
As an Analyst, Data Modeler, your focus will be to partner with D&A Data Foundation team members to create data models for global projects. This includes independently analysing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse that satisfy project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will perform all aspects of data modeling, working closely with the Data Governance, Data Engineering, and Data Architecture teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse.
Responsibilities:
Complete conceptual, logical, and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse, or other cloud data warehousing technologies.
Govern data design/modeling: documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned.
Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
Support assigned project contractors (both on- & off-shore), orienting new contractors to standards, best practices, and tools.
Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of changes or new development.
Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework.
Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse.
Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy by design principles (PII management), all linked across fundamental identity foundations.
Drive collaborative reviews of design, code, data, and security feature implementations performed by data engineers to drive data product development.
Assist with data planning, sourcing, collection, profiling, and transformation.
Create Source To Target Mappings for ETL and BI developers (a minimal STTM sketch follows this posting).
Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production); data in transit.
Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing.
Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
Support data lineage and mapping of source system data to canonical data stores for research, analysis, and productization.
Qualifications:
Bachelor's degree required in Computer Science, Data Management/Analytics/Science, Information Systems, Software Engineering, or a related technology discipline.
5+ years of overall technology experience, including at least 2+ years of data modeling and systems architecture.
Around 2+ years of experience with data lake infrastructure, data warehousing, and data analytics tools.
2+ years of experience developing enterprise data models.
Experience building solutions in the retail or supply chain space.
Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models).
Experience with integration of multi-cloud services (Azure) with on-premises technologies.
Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake.
Experience with version control systems like GitHub and deployment & CI tools.
Experience with Azure Data Factory, Databricks, and Azure Machine Learning is a plus.
Experience with metadata management, data lineage, and data glossaries is a plus.
Working knowledge of agile development, including DevOps and DataOps concepts.
Familiarity with business intelligence tools (such as Power BI).
Excellent verbal and written communication and collaboration skills.
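To make the "Source To Target Mappings for ETL and BI developers" deliverable concrete, here is a minimal, hypothetical sketch of an STTM captured as structured data that both documentation and pipeline code could consume. The source systems, columns, and transform rules are invented for illustration.

```python
# Minimal sketch: a Source-to-Target Mapping (STTM) captured as data,
# of the kind handed to ETL/BI developers. Systems, columns, and rules
# are hypothetical placeholders.
from dataclasses import dataclass

@dataclass(frozen=True)
class Mapping:
    source_table: str
    source_column: str
    target_table: str
    target_column: str
    transform: str  # plain-language or SQL-fragment rule

STTM = [
    Mapping("sap.vbak", "vbeln", "dw.fact_sales", "order_id", "direct move"),
    Mapping("sap.vbak", "netwr", "dw.fact_sales", "net_amount",
            "CAST(netwr AS DECIMAL(18,2))"),
    Mapping("crm.accounts", "acct_nm", "dw.dim_customer", "customer_name",
            "TRIM then title-case"),
]

for m in STTM:
    print(f"{m.source_table}.{m.source_column} -> "
          f"{m.target_table}.{m.target_column} [{m.transform}]")
```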
Posted 1 month ago
0 years
15 - 25 Lacs
Gurgaon
On-site
Overview
C5i is a pure-play AI & Analytics provider that combines the power of human perspective with AI technology to deliver trustworthy intelligence. The company drives value through a comprehensive solution set, integrating multifunctional teams that have technical and business domain expertise with a robust suite of products, solutions, and accelerators tailored for various horizontal and industry-specific use cases. At the core, C5i's focus is to deliver business impact at speed and scale by driving adoption of AI-assisted decision-making. C5i caters to some of the world's largest enterprises, including many Fortune 500 companies. The company's clients span Technology, Media, and Telecom (TMT), Pharma & Life Sciences, CPG, Retail, Banking, and other sectors. C5i has been recognized by leading industry analysts like Gartner and Forrester for its Analytics and AI capabilities and proprietary AI-based platforms.
Global offices: United States | Canada | United Kingdom | United Arab Emirates | India
Job Summary
We are looking for experienced Data Modelers to support large-scale data engineering and analytics initiatives. The role involves developing logical and physical data models, working closely with business and engineering teams to define data requirements, and ensuring alignment with enterprise standards.
Independently complete conceptual, logical, and physical data models for any supported platform, including SQL Data Warehouse, Spark, Databricks Delta Lakehouse, or other cloud data warehousing technologies.
Govern data design/modelling: documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned.
Develop a deep understanding of business domains like Customer, Sales, Finance, and Supplier, and of the enterprise technology inventory, to craft a solution roadmap that achieves business objectives and maximizes reuse.
Drive collaborative reviews of data model design, code, data, and security features to drive data product development.
Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; SAP data models.
Develop reusable data models based on cloud-centric, code-first approaches to data management and data mapping.
Partner with the data stewards team for data discovery and action by business customers and stakeholders.
Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
Assist with data planning, sourcing, collection, profiling, and transformation.
Support data lineage and mapping of source system data to canonical data stores.
Create Source to Target Mappings (STTM) for ETL and BI developers.
Skills needed:
Expertise in data modelling tools (ER/Studio, Erwin, IDM/ARDM models) and in CPG/Manufacturing/Sales/Finance/Supplier/Customer domains.
Experience with at least one MPP database technology such as Databricks Lakehouse, Redshift, Synapse, Teradata, or Snowflake.
Experience with version control systems like GitHub and deployment & CI tools.
Experience with metadata management, data lineage, and data glossaries is a plus.
Working knowledge of agile development, including DevOps and DataOps concepts.
Working knowledge of SAP data models, particularly in the context of HANA and S/4HANA, and of retail data like IRI and Nielsen Retail.
C5i is proud to be an equal opportunity employer.
We are committed to equal employment opportunity regardless of race, color, religion, sex, sexual orientation, age, marital status, disability, gender identity, or any other protected status. If you have a disability or special need that requires accommodation, please keep us informed at the hiring stages so that we can make the necessary accommodations.
Job Type: Full-time
Pay: ₹1,500,000.00 - ₹2,500,000.00 per year
Work Location: In person
Posted 1 month ago
4.0 - 9.0 years
6 - 11 Lacs
Chennai
Work from Office
Responsibilities:
Develop and set up the transformation of data from sources to enable analysis and decision making.
Maintain data flow from source to the designated target without affecting the crucial data flow, and play a critical part in the data supply chain by ensuring stakeholders can access and manipulate data for routine and ad hoc analysis.
Implement projects focused on collecting, aggregating, storing, reconciling, and making data accessible from disparate sources.
Provide support during the full lifecycle of data, from ingestion through analytics to action.
Analyze and organize raw data.
Evaluate business needs and objectives.
Interpret trends and patterns.
Conduct complex data analysis and report on results.
Coordinate with source teams and end users and develop solutions.
Implement data governance policies and support data-versioning processes.
Maintain security and data privacy.
Requirements
Must have:
Proven hands-on experience in building complex analytical queries in Teradata.
4+ years of extensive programming experience in Teradata Tools and Utilities.
Hands-on experience with Teradata utilities such as FastLoad, MultiLoad, BTEQ, and TPT (a BTEQ invocation sketch follows below).
Experience in data quality management and best practices across data solution implementations.
Experience in development, testing, and deployment, coding standards, and best practices.
Experience in preparing technical design documentation.
Strong team collaboration and experience working with remote teams.
Knowledge of data modelling and database management, such as performance tuning of the Enterprise Data Warehouse, Data Mart, and Business Intelligence Reporting environments, and support for the integration of those systems with other applications.
Good to have:
Should be good in Unix shell scripting.
Experience in data transformation using ETL/ELT tools.
Experience in different relational databases (i.e., Teradata, Oracle, PostgreSQL).
Experience with CI/CD development and deployment tools (i.e., Maven, Jenkins, Git, Kubernetes).
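For the Teradata utility work this posting lists, here is a minimal sketch of driving a BTEQ batch session from Python by piping a script to the bteq CLI. It assumes bteq is installed and on the PATH; the logon string, table, and error-handling policy are hypothetical placeholders.

```python
# Minimal sketch: drive a BTEQ batch session from Python by piping a
# script to the bteq CLI (which reads commands from stdin). The logon
# string and query are hypothetical placeholders; bteq must be on PATH.
import subprocess

BTEQ_SCRIPT = """
.LOGON tdprod/etl_user,***;
SELECT COUNT(*) FROM edw.daily_sales;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
"""

result = subprocess.run(
    ["bteq"], input=BTEQ_SCRIPT, text=True, capture_output=True
)
print(result.stdout)
if result.returncode != 0:
    raise SystemExit(f"BTEQ failed with return code {result.returncode}")
```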
Posted 1 month ago