3.0 - 7.0 years
5 - 9 Lacs
Chennai
Work from Office
Job title: Data Analyst Expert

Your role:
- Analyzes complex, multi-dimensional datasets across various business units to uncover deep insights that inform high-level strategic decisions and drive business performance.
- Creates sophisticated data models and predictive analytics using advanced statistical techniques and machine learning algorithms to forecast trends and optimize business outcomes, working under general supervision.
- Implements scalable ETL (Extract, Transform, Load) pipelines and robust data integration solutions, ensuring data consistency, accuracy, and reliability across the organization.
- Provides input on data architecture and infrastructure improvements, advising on best practices for data storage, retrieval, and management to enhance system performance and scalability.
- Ensures compliance with global data privacy regulations (such as GDPR and HIPAA) by designing and enforcing comprehensive data governance frameworks and security protocols.
- Conducts advanced statistical analyses, including hypothesis testing and multivariate analysis, to validate business strategies and provide evidence-based recommendations for process enhancements.
- Evaluates the effectiveness of implemented data solutions using performance metrics and feedback loops, continuously refining analytical approaches to improve outcomes.
- Benchmarks existing data analytics tools and technologies, leading the selection and integration of state-of-the-art solutions to maintain a competitive edge and drive innovation.
- Supports the execution of enterprise-wide data strategies, providing thought leadership and expert guidance in data analytics, business intelligence, and data science methodologies.
- Identifies opportunities for automation and process optimization, leveraging artificial intelligence, machine learning, and advanced analytics to enhance data processing efficiency and decision-making accuracy.
You're the right fit if:
1. Experience: 5+ years of industry experience in data analysis, SQL, and ETL.
2. Skills: data analysis & interpretation, data harmonization & processing, statistical methods, statistical programming software, business intelligence tools, data mining, machine learning, engineering fundamentals, research & analysis, Structured Query Language (SQL), regulatory compliance.
3. Education: Bachelor's degree in any engineering discipline.
4. Anything else: must have strong communication skills.

How we work together
We believe that we are better together than apart. For our office-based teams, this means working in person at least 3 days per week. Onsite roles require full-time presence in the company's facilities. Field roles are most effectively done outside of the company's main facilities, generally at the customer's or supplier's locations.

If you're interested in this role and have many, but not all, of the experiences needed, we encourage you to apply. You may still be the right candidate for this or other opportunities at Philips. Learn more about our culture of impact with care here.
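The statistical work described above (hypothesis testing to validate business strategies) can be sketched in a few lines. Everything below is invented for illustration: the before/after samples, the metric, and the hand-rolled Welch-style t statistic (a real analysis would typically reach for scipy.stats instead):

```python
import statistics

# Hypothetical samples: a process metric measured before and after a change.
before = [12.1, 11.8, 12.5, 12.0, 12.3]
after = [11.2, 11.5, 11.0, 11.4, 11.3]

def t_stat(a, b):
    """Welch-style t statistic for two independent samples."""
    mean_a, mean_b = statistics.mean(a), statistics.mean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)  # sample variances
    return (mean_a - mean_b) / ((var_a / len(a) + var_b / len(b)) ** 0.5)

t = t_stat(before, after)
print(round(t, 2))  # a large |t| suggests the shift is unlikely to be noise
```

The point of the sketch is the shape of the work: state a hypothesis, compute a test statistic, and only then recommend a process change.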
Posted 2 weeks ago
16.0 - 18.0 years
50 - 60 Lacs
Bengaluru
Work from Office
Join us as a Data Engineer
- You'll be the voice of our customers, using data to tell their stories and put them at the heart of all decision-making
- We'll look to you to drive the build of effortless, digital-first customer experiences
- If you're ready for a new challenge and want to make a far-reaching impact through your work, this could be the opportunity you're looking for
- We are offering this role at vice president level

What you'll do
As a Data Engineer, you'll simplify our organisation by developing innovative data-driven solutions through data pipelines, modelling, and ETL design, aspiring to be commercially successful while keeping our customers, and the bank's data, safe and secure. You'll drive customer value by understanding complex business problems and requirements and applying the most appropriate, reusable tools to gather and build data solutions. You'll support our strategic direction by engaging with the data engineering community to deliver opportunities, along with carrying out complex data engineering tasks to build a scalable data architecture.

Your responsibilities will also include:
- Building advanced automation of data engineering pipelines through removal of manual stages
- Embedding new data techniques into our business through role modelling, training, and experiment design oversight
- Delivering a clear understanding of data platform costs to meet your department's cost-saving and income targets
- Sourcing new data using the most appropriate tooling for the situation
- Developing solutions for streaming data ingestion and transformations in line with our streaming strategy

The skills you'll need
To thrive in this role, you'll need a strong understanding of data usage and dependencies and experience of extracting value and features from large-scale data. You'll also bring practical experience of programming languages alongside knowledge of data and software engineering fundamentals.
Additionally, you'll need:
- Expertise in data engineering toolsets such as Airflow, RDBMSs (PostgreSQL/Oracle/DB2), Snowflake, S3, EMR/Databricks, and data pipelines
- Proven proficiency in Python, PySpark, SQL, CI/CD pipelines, and Git version control
- Experience working with reporting tools such as QuickSight would be an added advantage
- Experience of ETL technical design; data quality testing, cleansing, and monitoring; data sourcing; and exploration and analysis
- Data warehousing and data modelling capabilities
- A good understanding of modern code development practices
- Experience of working in a governed, regulated environment
- Strong communication skills with the ability to proactively engage and manage a wide range of stakeholders

Hours: 45
Job Posting Closing Date: 28/07/2025
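The pipeline duties in this posting (sourcing, cleansing and monitoring, loading) follow the classic extract-transform-load shape. A minimal sketch with invented rows and column names, using SQLite as a stand-in for a warehouse such as Snowflake (a production pipeline would run under Airflow with PySpark, not inline like this):

```python
import sqlite3

# Extract: raw source rows (in practice these would come from S3 or an RDBMS).
raw_rows = [
    ("2024-01-05", "acct-1", "120.50"),
    ("2024-01-06", "acct-2", ""),        # missing amount -> quarantined
    ("2024-01-07", "acct-1", "80.00"),
]

def transform(rows):
    """Cleanse: drop rows with missing amounts, cast the rest to typed tuples."""
    clean, rejected = [], []
    for day, account, amount in rows:
        if amount:
            clean.append((day, account, float(amount)))
        else:
            rejected.append((day, account, amount))
    return clean, rejected

clean, rejected = transform(raw_rows)

# Load into a warehouse table; quarantined rows would feed a monitoring alert.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txns (day TEXT, account TEXT, amount REAL)")
conn.executemany("INSERT INTO txns VALUES (?, ?, ?)", clean)

total = conn.execute("SELECT SUM(amount) FROM txns").fetchone()[0]
print(len(clean), len(rejected), total)  # 2 1 200.5
```

Keeping the transform a pure function is what makes the "data quality testing" bullet cheap: it can be unit-tested without any warehouse at all.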
Posted 2 weeks ago
3.0 - 8.0 years
10 - 14 Lacs
Chennai
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions with your team, providing guidance and support to ensure successful project outcomes. Additionally, you will monitor project progress and make necessary adjustments to keep everything on track, fostering a collaborative environment that encourages innovation and efficiency.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Mentor junior team members to support their professional growth.

Professional & Technical Skills:
- Must-have skills: proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with data governance and compliance standards.
- Ability to work with large datasets and perform data analysis.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
Posted 2 weeks ago
3.0 - 8.0 years
9 - 13 Lacs
Hyderabad
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: Microsoft Power Business Intelligence (BI), Microsoft Azure Databricks
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the data architecture. You will be involved in analyzing data requirements and translating them into effective solutions that align with the overall data strategy of the organization. Your role will require you to stay updated with the latest trends in data engineering and contribute to the continuous improvement of data processes and systems.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Engage in the design and implementation of data pipelines to support data integration and analytics.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
Professional & Technical Skills:
- Must-have skills: proficiency in Databricks Unified Data Analytics Platform.
- Good-to-have skills: experience with Microsoft Azure Databricks and Microsoft Power Business Intelligence (BI).
- Strong understanding of data modeling concepts and best practices.
- Experience with ETL processes and data warehousing solutions.
- Familiarity with cloud-based data solutions and architectures.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
Posted 2 weeks ago
3.0 - 8.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: Microsoft Power Business Intelligence (BI), Microsoft Azure Databricks
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the data architecture. You will be responsible for analyzing data requirements and translating them into effective solutions that support the organization's data strategy. Additionally, you will participate in team meetings to share insights and contribute to the overall success of the data platform initiatives.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Engage in continuous learning to stay updated with the latest trends and technologies in data platforms.
- Assist in the documentation of data architecture and design processes to ensure clarity and consistency across the team.
Professional & Technical Skills:
- Must-have skills: proficiency in Databricks Unified Data Analytics Platform.
- Good-to-have skills: experience with Microsoft Azure Databricks and Microsoft Power Business Intelligence (BI).
- Strong understanding of data integration techniques and best practices.
- Experience with data modeling and database design principles.
- Familiarity with cloud-based data solutions and architectures.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
Posted 2 weeks ago
15.0 - 20.0 years
5 - 9 Lacs
Mumbai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Informatica PowerCenter
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-have skills: proficiency in Informatica PowerCenter.
- Strong understanding of ETL processes and data integration techniques.
- Experience with database management systems and SQL.
- Familiarity with application development methodologies and best practices.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Informatica PowerCenter.
- This position is based at our Mumbai office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
Posted 2 weeks ago
3.0 - 8.0 years
10 - 14 Lacs
Pune
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions with your team, providing guidance and support to ensure successful project outcomes. Your role will require you to stay updated with the latest technologies and methodologies to enhance application performance and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Facilitate communication between technical teams and stakeholders to ensure alignment on project goals.
- Mentor junior team members, providing them with guidance and support in their professional development.

Professional & Technical Skills:
- Must-have skills: proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with data governance and compliance standards.
- Ability to analyze and optimize application performance.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
Posted 2 weeks ago
3.0 - 5.0 years
30 - 35 Lacs
Mumbai, Pune
Work from Office
Required:
- Experience in big data applications and environments/SQL.
- Extensive experience in the Azure stack: ADLS, Azure SQL DB, Azure Data Factory, Azure Synapse, Analytics Services, Event Hub, etc.
- Experience working in an engineering capacity (planning, design, implementation, configuration, upgrades, migrations, troubleshooting, and support) on applications using the Azure stack.
- Design and build production data pipelines from ingestion to consumption within a big data architecture, using Java, Python, or Scala.
- Experience coding complex T-SQL and Spark (Python).
- Experience working in an engineering capacity (planning, design, implementation, configuration, troubleshooting, and support) on ETL.
- Experience designing and developing in Python using Azure Synapse.
- Good experience in requirements analysis and solution architecture design, data modelling, ETL, data integration, and data migration design.
- Experience in PL/SQL.
- Collaborate with cross-functional teams to define project requirements and technical specifications.
- Conduct code reviews to ensure adherence to code quality, consistency, and best practices.
- Accomplished, highly motivated, and results-driven; able to work independently with minimal supervision.
- Ability to think strategically and effectively communicate solutions to various levels of management.

Preferred:
- Experience with the big data ecosystem: Azure Data Platform, Azure Synapse, and related data integration technologies.
- Experience with T-SQL.

Knowledge and skills (general and technical):
- Azure stack: Azure Data Lake, Azure Synapse, Azure Data Factory
- T-SQL, Spark (Python)
- Any one or all of the scripting languages: PowerShell, Python, etc.
- PL/SQL, MSSQL, other databases
- Excellent communication skills
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field (B.E./B.Tech, MCA/M.Sc, or equivalent).
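A recurring task implied by the data migration design bullet above is reconciling source and target after a load: row counts plus an order-insensitive content checksum. A toy sketch with invented rows (real checks would run as T-SQL or Spark jobs over the actual tables):

```python
import hashlib

# Hypothetical source and target extracts after a migration run.
source = [("1", "alice", "100"), ("2", "bob", "250")]
target = [("2", "bob", "250"), ("1", "alice", "100")]  # same rows, different order

def checksum(rows):
    """Order-insensitive checksum: sort rows, then hash their concatenation."""
    h = hashlib.sha256()
    for row in sorted(rows):
        h.update("|".join(row).encode())
    return h.hexdigest()

counts_match = len(source) == len(target)
sums_match = checksum(source) == checksum(target)
print(counts_match and sums_match)  # True
```

Sorting before hashing is the design choice worth noting: loads rarely preserve row order, so a naive streaming hash would flag spurious mismatches.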
Posted 2 weeks ago
6.0 - 11.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Req ID: 332631
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Senior Java Developer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Senior Application Developer - Java

Who we are:
NTT DATA America strives to hire exceptional, innovative and passionate individuals who want to grow with us. Launch by NTT DATA is the culmination of the company's strategy to acquire and integrate the skills, experience, and technology of leading digital companies, backed by NTT DATA's core capabilities, global reach, and depth.

How You'll Help Us:
Our clients need digital solutions that will transform their business so they can succeed in today's hypercompetitive marketplace. As a team member you will routinely deliver elite solutions to clients that will impact their products, customers, and services. Using your development, design, and leadership skills and experience, you will design and implement solutions based on client needs. You will collaborate with customers on future system enhancements, resulting in continued engagements.

How We Will Help You:
Joining our Java practice is not only a job, but a chance to grow your career. We will make sure to equip you with the skills you need to produce robust applications that you can be proud of. Whether it is providing you with training on a new programming language or helping you get certified in a new technology, we will help you grow your skills so you can continue to deliver increasingly valuable work.

Once You Are Here, You Will:
The Senior Applications Developer provides input and support for, and performs, full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.).
You will participate in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements. This position provides input to applications development project plans and integrations. You will collaborate with teams and support emerging technologies to ensure effective communication and achievement of objectives. The Senior Applications Developer provides knowledge and support for applications development, integration, and maintenance, and provides input to department and project teams on decisions supporting projects.

- Apply Disaster Recovery Knowledge
- Apply Information Analysis and Solution Generation Knowledge
- Apply Information Systems Knowledge
- Apply Internal Systems Knowledge
- IT - Design/Develop Application Solutions
- IT - Knowledge of Emerging Technology
- IT - Problem Management/Planning
- Technical Problem Solving and Analytical Processes
- Technical Writing

Job Requirements:
- Contribute to IS projects: conduct systems and requirements analyses to identify project action items.
- Perform analysis and design: participate in defining and developing technical specifications to meet systems requirements.
- Design and develop moderate to highly complex applications: analyze, design, code, test, correct, and document moderate to highly complex programs to ensure optimal performance and compliance.
- Develop application documentation: develop and maintain system documentation to ensure accuracy and consistency.
- Produce integration builds: define and produce integration builds to create applications.
- Perform maintenance and support: define and administer procedures to monitor systems performance and integrity.
- Support emerging technologies and products: monitor the industry to gain knowledge and understanding of emerging technologies.
- Must have GCP and BigQuery experience.
- Should have experience with Power BI, microservice architecture, SQL Server, DB2, Spring Boot, JSON, Java, C#, AMQP, Azure AD, HTTP, and readme documentation.
- Should be proficient in Git, Scrum, and Azure DevOps.

Basic qualifications:
- 6+ years of experience with Java, including building complex, scalable applications.
- 6+ years of experience in Spring Boot, including designing and implementing advanced microservices architectures.
- 4+ years of GCP and BigQuery experience.

Ideal Mindset:
- Lifelong Learner. You are always seeking to improve your technical and nontechnical skills.
- Team Player. You are someone who wants to see everyone on the team succeed and is willing to go the extra mile to help a teammate in need.
- Communicator. You know how to communicate your design ideas to both technical and nontechnical stakeholders, prioritizing critical information and leaving out extraneous details.

Please note the shift timing requirement: 1:30 pm IST - 10:30 pm IST.
#Launchjobs #LaunchEngineering
Posted 2 weeks ago
6.0 - 11.0 years
9 - 10 Lacs
Bengaluru
Work from Office
Req ID: 322582
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Senior .NET Developer - Remote to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Senior .NET Developer - Remote

Who We Are:
NTT DATA America strives to hire exceptional, innovative and passionate individuals who want to grow with us. Launch by NTT DATA is the culmination of the company's strategy to acquire and integrate the skills, experience, and technology of leading digital companies, backed by NTT DATA's core capabilities, global reach, and depth.

How You'll Help Us:
A Senior Application Developer is first and foremost a software developer who specializes in .NET C# development. You'll be part of a team focused on delivering quality software for our clients.

How We Will Help You:
Joining our Microsoft practice is not only a job, but a chance to grow your career. We will make sure to equip you with the skills you need to produce robust applications that you can be proud of. Whether it is providing you with training on a new programming language or helping you get certified in a new technology, we will help you grow your skills so you can continue to deliver increasingly valuable work.

Once You Are Here, You Will:
The Senior Applications Developer provides input and support for, and performs, full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.). You will participate in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements. This position provides input to applications development project plans and integrations.
Additionally, you will collaborate with teams and support emerging technologies to ensure effective communication and achievement of objectives. The Senior Applications Developer provides knowledge / support for applications development, integration, and maintenance as well as providing input to department and project teams on decisions supporting projects. Basic Qualifications: 6+ years developing in .Net/.Net Core 3+ years of experience with Object Oriented Programming and SOLID Principles 3+ years of Rest API development 2+ years of experience working with Databases and writing stored procedures 2+ year of unit and service testing with frameworks such as xunit, Nunit, etc. 1+ year of cloud platform experience either in AWS, Azure, or GCP Preferred: Experience with CI/CD tooling i.e. Jenkins, Azure Devops etc Experience with containerization technologies e.g. Docker, Kubernetes GCP experience Ideal Mindset: Lifelong Learner: You are always seeking to improve your technical and nontechnical skills. Team Player: You are someone who wants to see everyone on the team succeed and is willing to go the extra mile to help a teammate in need. Communicator: You know how to communicate your design ideas to both technical and nontechnical stakeholders, prioritizing critical information and leaving out extraneous details. Please note Shift Timing Requirement: 1:30pm IST -10:30 pm IST #Launchjobs #LaunchEngineering
Posted 2 weeks ago
8.0 - 10.0 years
30 - 37 Lacs
Chennai
Work from Office
We are seeking a highly experienced and strategic Director of Enterprise Architecture to lead our Enterprise Architecture function based in India. This senior leadership role is critical in shaping the technology future of Ford Credit's global operations. You will be responsible for leading and mentoring a talented team of architects in India, while collaborating closely with Enterprise Architecture Directors and teams located in North America and Europe. The successful candidate will possess a strong blend of deep technical expertise, strategic vision, and exceptional relationship management skills. You will drive the development and evolution of architectural strategies, standards, and roadmaps that support our global business needs across a wide range of domains, including digital platforms (web, mobile), enterprise integrations, data management, risk systems, AI/ML capabilities, lending/banking platforms, and customer service solutions. This role requires someone who can not only define the "what" and "how" from an architectural perspective but also effectively communicate the "why" to stakeholders at various levels across the organization.

Required:
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related technical field, or equivalent practical experience.
- Significant experience (typically 10+ years) in Enterprise Architecture or senior-level Solution Architecture roles within large, complex organizations.
- Proven experience leading and managing technical teams, preferably architecture teams (typically 5+ years of management experience).
- Demonstrated ability to define and implement enterprise-level architectural strategies, standards, and roadmaps.
- Deep understanding of various architectural patterns (e.g., Microservices, Event-Driven Architecture, Service-Oriented Architecture) and design principles.
- Experience with designing and overseeing the implementation of large-scale, distributed, and global enterprise systems.
- Strong working knowledge across several relevant technology domains, such as cloud platforms (GCP, Azure, AWS), data architecture (data lakes, data warehousing, data governance), integration patterns (APIs, messaging, ETL), digital platforms (web, mobile), AI/ML architecture, and security architecture principles.
- Excellent verbal and written communication, presentation, and interpersonal skills, with the ability to influence and build consensus among diverse stakeholders.
- Ability to operate effectively in a global, matrixed organization.

Preferred:
- Master's degree in a relevant field.
- Experience in the Financial Services or Automotive Finance industry.
- Experience working with teams and stakeholders located in different geographic regions (e.g., North America, Europe).
- Familiarity with architectural frameworks (e.g., TOGAF, Zachman).
- Experience with agile development methodologies.

Skills:
- Exceptional strategic thinking and the ability to translate business strategy into technical architecture.
- Strong leadership and team-building capabilities.
- Superior stakeholder management, negotiation, and influencing skills.
- Deep technical acumen across a broad range of technologies and architectural domains.
- Excellent analytical and problem-solving skills.
- Ability to manage multiple priorities and navigate ambiguity in a dynamic environment.
- A passion for technology, innovation, and continuous improvement.

Architectural Leadership & Strategy:
- Lead the development, communication, and governance of enterprise architectural strategies, principles, standards, and roadmaps for Ford Credit globally, with a focus on alignment across regions (India, NA, Europe).
- Provide strategic guidance and oversight for the architectural design and implementation of complex, large-scale solutions supporting global business needs across diverse domains (Digital, Data, AI, Lending, Banking, Risk, Customer Service, Integrations).
- Ensure architectural decisions support business objectives, foster innovation, improve efficiency, and manage technical debt.
- Champion architectural best practices, patterns, and methodologies within the team and across the broader IT organization.

Team Leadership & Development:
- Lead, mentor, and develop a high-performing team of Enterprise Architects in India.
- Foster a collaborative and innovative team environment, promoting continuous learning and growth.
- Manage team priorities, resources, and performance to deliver high-quality architectural outcomes.

Global Collaboration & Stakeholder Management:
- Collaborate closely with Enterprise Architecture Directors and teams in North America and Europe to ensure global consistency, leverage shared capabilities, and contribute to a unified global EA function.
- Build and maintain strong relationships with senior business leaders, IT executives, product managers, engineering teams, and other key stakeholders globally.
- Effectively communicate complex architectural concepts and strategies to both technical and non-technical audiences.
- Influence decision-making and drive consensus on architectural direction across organizational boundaries.

Architectural Governance & Quality:
- Establish and refine architectural governance processes to ensure solutions adhere to defined standards and strategies.
- Provide architectural reviews and guidance for major projects and initiatives.
- Identify and mitigate architectural risks.

Technology & Market Awareness:
- Stay abreast of industry trends, emerging technologies, and competitive landscapes relevant to financial services, automotive finance, and the specific technology domains (Cloud, AI, Data, Digital, etc.).
- Evaluate new technologies and assess their potential impact and applicability to Ford Credit's global architecture.
Posted 2 weeks ago
5.0 - 11.0 years
50 - 100 Lacs
Bengaluru
Work from Office
Roles and Responsibility Spark/Scala Job Description As a Software Development Engineer 2, you will be responsible for expanding and optimising our data and data pipeline architecture, as well as optimising data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline designer and data wrangler who enjoys optimising data systems and building them from the ground up. The Data Engineer will lead our software developers on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of optimising or even re-designing our company's data architecture to support our next generation of products and data initiatives. Responsibilities Create and maintain optimal data pipeline architecture. Assemble large, complex data sets that meet functional and non-functional business requirements. Identify, design, and implement internal process improvements: automating manual processes, optimising data delivery, and coordinating the re-design of infrastructure for greater scalability. Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs. Keep our data separated and secure. Work with data and analytics experts to strive for greater functionality in our data systems. Support PROD systems. Qualifications Must have 5-11 years of overall experience and at least 3 years of relevant experience with Big Data. Must have experience in building highly scalable business applications, which involve implementing large, complex business flows and dealing with huge amounts of data. Must have experience in Hadoop, Hive, and Spark with Scala, with good experience in performance tuning and debugging issues. Good to have: experience with stream processing (Spark/Java, Kafka). 
Must have experience in design and development of Big Data projects. Good knowledge of functional programming and OOP concepts, SOLID principles, and design patterns for developing scalable applications. Familiarity with build tools like Maven. Must have experience with any RDBMS and at least one NoSQL database, preferably PostgreSQL. Must have experience writing unit and integration tests using ScalaTest. Must have experience using a version control system such as Git. Experience with CI/CD pipelines (Jenkins) is a plus. Basic hands-on experience with one of the cloud providers (AWS/Azure) is a plus. Databricks Spark certification is a plus.
Posted 2 weeks ago
8.0 - 13.0 years
14 - 18 Lacs
Hyderabad
Work from Office
We are seeking a strategic and technically strong Enterprise Data Architect to design and lead the implementation of scalable, secure, and high-performing data architecture solutions across the organization. The ideal candidate will have deep experience with modern data platforms, including Snowflake, DBT, SnapLogic, and cloud-native technologies. This role requires a balance of technical expertise, architectural vision, and business acumen to align data solutions with enterprise goals. Key Responsibilities: Define and maintain the organization's enterprise data architecture strategy, including data modeling, governance, and integration standards. Lead the design and architecture of enterprise-grade data platforms using Snowflake, DBT, SnapLogic, and Azure Data Factory. Oversee the development of robust, scalable, and secure data pipelines across a hybrid cloud environment. Architect and optimize SQL Server and PostgreSQL environments to ensure availability, performance, and scalability. Define and enforce integration patterns to ensure data consistency, accuracy, and reliability across systems. Guide the design of efficient ETL/ELT frameworks to ensure alignment with data warehousing and business intelligence requirements. Partner with business and technical teams, including data engineers, analysts, and stakeholders, to define and enforce data governance and metadata management practices. Review and guide SQL query performance tuning, indexing strategies, and system monitoring. Provide direction on the use of Python for data automation, orchestration, and advanced transformations. Establish and maintain enterprise-wide documentation for data flows, data dictionaries, and architectural decisions. Technical Skills & Experience: 8+ years of progressive experience in data engineering or architecture roles, with 2-3 years in a lead or architect capacity. 
Proven experience designing and implementing data architectures using Snowflake, DBT, SnapLogic, and Azure Data Factory. Strong proficiency in SQL and performance tuning across large-scale environments. Deep experience with SQL Server and PostgreSQL administration and architecture. Experience with Python for scripting, data processing, and orchestration tasks. Solid understanding of data governance, security, compliance, and data lifecycle management. Experience leading data modernization initiatives in cloud/hybrid environments. Understanding of metadata management, master data management, and data lineage tools is a plus. Soft Skills: Strategic mindset with excellent analytical and problem-solving skills. Strong leadership and communication abilities, capable of influencing stakeholders across business and technical domains. Ability to translate business requirements into scalable and sustainable technical solutions. Team-oriented with a collaborative approach to cross-functional projects. Preferred Qualifications: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field. Relevant certifications (e.g., Snowflake Architect, Azure Solutions Architect, DBT Certification) are highly desirable.
Posted 2 weeks ago
5.0 - 10.0 years
3 - 7 Lacs
Hyderabad
Work from Office
We are seeking an experienced Data Modeler with a strong background in real estate, investment management, and master data management. The ideal candidate will be responsible for designing, implementing, and maintaining data models that support our business objectives. This role requires a deep understanding of data architecture, data integration, and database optimization. Key Responsibilities: Design and Develop Data Models: Create conceptual, logical, and physical data models to support business requirements in the real estate and investment management domains. Master Data Management (MDM): Develop and manage master data solutions to ensure data consistency, accuracy, and reliability across the organization. Data Integration: Integrate data from various sources, ensuring consistency and accuracy across systems. Data Mapping: Map data elements to business requirements and create detailed data mapping documents. Collaboration: Work closely with data analysts, database administrators, and business stakeholders to understand data needs and deliver solutions. Documentation: Maintain comprehensive documentation of data models, data flows, and data dictionaries. Data Governance: Ensure data models comply with data governance and security policies. Qualifications: Experience: 12+ years overall, with a minimum of 5 years of experience in data modeling, focused on real estate, investment management, and master data management. Technical Skills: Proficiency in SQL, data modeling tools (e.g., ER/Studio, ERwin), and database management systems (e.g., Oracle, SQL Server). Domain Expertise: In-depth knowledge of real estate and investment management processes and data requirements. MDM Expertise: Strong experience in master data management, including data governance, data quality, and data stewardship. Analytical Skills: Strong analytical and problem-solving skills. Communication: Excellent verbal and written communication skills. 
Preferred Skills: Experience with data warehousing and business intelligence tools. Familiarity with cloud-based data solutions (e.g., AWS, Azure). Knowledge of data governance frameworks and best practices.
Posted 2 weeks ago
2.0 - 7.0 years
4 - 7 Lacs
Hyderabad
Work from Office
Data Engineer Position Overview Role Summary We are searching for a talented and motivated Data Engineer to join our team. The ideal candidate will have expertise in data modeling, analytical thinking, and developing ETL processes using Python. In this role, you will be pivotal in transforming raw data from landing tables into reliable, curated master tables, ensuring accuracy, accessibility, and integrity within our Snowflake data platform. Main Responsibilities Design, Develop, and Maintain ETL Processes: Build and maintain scalable ETL pipelines in Python to extract, transform, and load data into Snowflake master tables. Automate data mastering, manage incremental updates, and ensure consistency between landing and master tables. Data Modeling: Create and optimize logical and physical data models in Snowflake for efficient querying and reporting. Translate business needs into well-structured data models, defining tables, keys, relationships, and constraints. Analytical Thinking and Problem Solving: Analyze complex datasets, identify trends, and work with analysts and stakeholders to resolve data challenges. Investigate data quality issues and design robust solutions aligned with business goals. Data Quality and Governance: Implement routines for data validation, cleansing, and error handling to ensure accuracy and reliability in Snowflake. Support the creation and application of data governance standards. Automation and Optimization: Seek automation opportunities for data engineering tasks, enhance ETL processes for performance, and scale systems as data volumes grow within Snowflake. Documentation and Communication: Maintain thorough documentation of data flows, models, transformation logic, and pipeline configurations. Clearly communicate technical concepts to all stakeholders. 
Collaboration: Work closely with data scientists, analysts, and engineers to deliver integrated data solutions, contributing to cross-functional projects with your data engineering expertise. Required Qualifications Bachelor's or Master's degree in Computer Science, IT, Engineering, Mathematics, or a related field. At least 2 years of experience as a Data Engineer or in a similar role. Strong Python skills, including experience developing ETL pipelines and automation scripts. Solid understanding of relational and dimensional data modeling. Experience with Snowflake for SQL, schema design, and managing pipelines. Proficient in SQL for querying and data analysis in Snowflake. Strong analytical and problem-solving skills. Familiarity with data warehousing and best practices. Knowledge of data quality, cleansing, and validation techniques. Experience with version control systems like Git and collaborative workflows. Excellent communication, both verbal and written. Preferred Qualifications In-depth knowledge of Snowflake features like Snowpipe, Streams, Tasks, and Time Travel. Experience with cloud platforms such as AWS, Azure, or Google Cloud. Familiarity with workflow orchestration tools like Apache Airflow or Luigi. Understanding of big data tools like Spark, Hadoop, or distributed databases. Experience with CI/CD pipelines in data engineering. Background in streaming data and real-time processing. Experience deploying data pipelines in production. Sample Responsibilities in Practice Develop automated ETL pipelines in Python to ingest daily CSVs into a Snowflake landing table, validate data, and merge clean records into a master table, handling duplicates and change tracking. Design scalable data models in Snowflake to support business intelligence reporting, ensuring both integrity and query performance. Collaborate with business analysts to adapt data models and pipelines to evolving needs. Monitor pipeline performance and troubleshoot inconsistencies, documenting causes and solutions. 
Key Skills and Competencies Technical Skills: Python (including pandas, SQLAlchemy); Snowflake SQL and management; schema design; ETL process development. Analytical Thinking: Ability to translate business requirements into technical solutions; strong troubleshooting skills. Collaboration and Communication: Effective team player; clear technical documentation. Adaptability: Willingness to adopt new technologies and proactively improve processes. Our Data Environment Our organization manages diverse data sources, including transactional systems, third-party APIs, and unstructured data. We are dedicated to building a top-tier Snowflake data infrastructure for analytics, reporting, and machine learning. In this role, you will influence our data architecture, implement modern data engineering practices, and contribute to a culture driven by data.
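The landing-to-master merge this posting describes (validate incoming rows, drop duplicates, upsert into a master table with change tracking) can be sketched in plain Python. This is a minimal illustration only, using in-memory dictionaries in place of actual Snowflake tables; the column names (`id`, `amount`) are hypothetical:

```python
import csv
import io

def merge_into_master(master, landing_rows, key="id"):
    """Upsert validated landing rows into a master dict keyed by `key`,
    skipping keyless rows and recording which keys actually changed."""
    changed = []
    for row in landing_rows:
        if not row.get(key):          # basic validation: reject rows without a key
            continue
        k = row[key]
        if master.get(k) != row:      # change tracking: only count real updates
            master[k] = row
            changed.append(k)
    return changed

# Simulated daily CSV landing file (hypothetical columns): one update,
# one new record with a duplicate line, one invalid keyless row.
landing_csv = "id,amount\n1,100\n2,200\n2,200\n,999\n"
rows = list(csv.DictReader(io.StringIO(landing_csv)))
master = {"1": {"id": "1", "amount": "90"}}
changed = merge_into_master(master, rows)
# "1" is updated, "2" is inserted once despite the duplicate, the keyless row is dropped
```

In a real pipeline the same upsert logic would more likely be expressed as a Snowflake `MERGE` statement driven by an orchestrated Python task; the sketch only shows the shape of the dedup-and-upsert step.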
Posted 2 weeks ago
5.0 - 8.0 years
6 - 10 Lacs
Telangana
Work from Office
Key Responsibilities: Team Leadership: Lead and mentor a team of Azure Data Engineers, providing technical guidance and support. Foster a collaborative and innovative team environment. Conduct regular performance reviews and set development goals for team members. Organize training sessions to enhance team skills and technical capabilities. Azure Data Platform: Design, implement, and optimize scalable data solutions using Azure data services such as Azure Databricks, Azure Data Factory, Azure SQL Database, and Azure Synapse Analytics. Ensure data engineering best practices and data governance are followed. Stay up-to-date with Azure data technologies and recommend improvements to enhance data processing capabilities. Data Architecture: Collaborate with data architects to design efficient and scalable data architectures. Define data modeling standards and ensure data integrity, security, and governance compliance. Project Management: Work with project managers to define project scope, goals, and deliverables. Develop project timelines, allocate resources, and track progress. Identify and mitigate risks to ensure successful project delivery. Collaboration & Communication: Collaborate with cross-functional teams including data scientists, analysts, and business stakeholders to deliver data-driven solutions. Communicate effectively with stakeholders to understand requirements and provide updates. Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Proven experience as a Team Lead or Manager in data engineering. Extensive experience with Azure data services and cloud technologies. Expertise in Azure Databricks, PySpark, and SQL. Strong understanding of data engineering best practices, data modeling, and ETL processes. Experience with agile development methodologies. Certifications in Azure data services (preferred). Preferred Skills: Experience with big data technologies and data warehousing solutions. 
Familiarity with industry standards and compliance requirements. Ability to lead and mentor a team.
Posted 2 weeks ago
2.0 - 7.0 years
22 - 27 Lacs
Bengaluru
Work from Office
Amazon strives to be Earth's most customer-centric company, where people can find and discover virtually anything they want to buy online. By giving customers more of what they want (low prices, vast selection, and convenience), Amazon continues to grow and evolve as a world-class e-commerce platform. Do you have solid analytical thinking and metrics-driven decision making, and want to solve problems with solutions that will meet the growing worldwide need? Then SmartCommerce is the team for you. We are looking for a top-notch Business Intelligence Engineer to be part of our analytics team. The ideal candidate will be curious, have attention to detail, be energized by a challenging entrepreneurial environment, and be comfortable thinking big while also diving deep. Are you a smart, hungry, flexible, and world-class analytics professional excited by the challenge of launching a new business initiative for Amazon? The SmartCommerce team is looking for a Business Intelligence Engineer to be part of a new team being built from the ground up. They will be primarily working on our product SmartBiz. SmartBiz by Amazon is a one-stop shop for Indian sellers to fulfill their online selling needs. Whether a small business, an entrepreneur, or a neighborhood store, a seller can now create their own e-commerce store within minutes and start showcasing and selling their products online. 1. Responsible for designing, building, and maintaining complex data solutions for Amazon's SmartCommerce businesses 2. Actively participates in the code review process, design discussions, team planning, and operational excellence, and constructively identifies problems and proposes solutions 3. Makes appropriate trade-offs, re-uses where possible, and is judicious about introducing dependencies 4. Makes efficient use of resources (e.g., system hardware, data storage, query optimization, AWS infrastructure, etc.) 
5. Asks the right questions when the data model and requirements are not well defined, and comes up with designs that are scalable, maintainable, and efficient 6. Makes enhancements that improve the team's data architecture, making it better and easier to maintain (e.g., data auditing solutions, automating ad-hoc or manual operation steps) 7. Owns the data quality of important datasets and any new changes/enhancements. 2+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc. Experience with data visualization using Tableau, QuickSight, or similar tools. Experience with one or more industry analytics visualization tools (e.g., Excel, Tableau, QuickSight, MicroStrategy, Power BI) and statistical methods (e.g., t-test, chi-squared). Experience with a scripting language (e.g., Python, Java, or R). Master's degree or advanced technical degree. Knowledge of data modeling and data pipeline design. Experience with statistical analysis and correlation analysis.
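As a concrete illustration of the correlation analysis this posting asks for (alongside t-tests and chi-squared), Pearson's r can be computed with nothing but the standard library; a minimal sketch:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Perfectly linear data correlates at +1; a reversed series at -1.
r_pos = pearson_r([1, 2, 3, 4], [10, 20, 30, 40])
r_neg = pearson_r([1, 2, 3], [30, 20, 10])
```

In practice a BI engineer would reach for SciPy or pandas (`DataFrame.corr`) rather than hand-rolling this, but the underlying computation is exactly the one shown.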
Posted 2 weeks ago
9.0 - 12.0 years
15 - 20 Lacs
Chennai
Work from Office
Job Title: Data Engineer Lead / Architect (ADF). Experience: 9-12 Years. Location: Remote / Hybrid. Role and Responsibilities: Talk to client stakeholders and understand the requirements for building their data warehouse / data lake / data lakehouse. Design, develop, and maintain data pipelines in Azure Data Factory (ADF) for ETL from on-premises and cloud-based sources. Design, develop, and maintain data warehouses and data lakes in Azure. Run large data platform and other related programs to provide business intelligence support. Design and develop data models to support business intelligence solutions. Implement best practices in data modelling and data warehousing. Troubleshoot and resolve issues related to ETL and data connections. Skills Required: Excellent written and verbal communication skills. Excellent knowledge of and experience with ADF. Well versed with ADLS Gen 2. Knowledge of SQL for data extraction and transformation. Ability to work with various data sources (Excel, SQL databases, APIs, etc.). Knowledge of SAS would be an added advantage. Knowledge of Power BI would be an added advantage.
Posted 2 weeks ago
12.0 - 22.0 years
40 - 60 Lacs
Bengaluru
Work from Office
Location: Bangalore Onsite Experience: 12+ years Type: Full-time --- Role Overview We are looking for a Technical Program Manager (TPM) to drive the execution of a next-generation data and AI platform that powers real-time analytics, machine learning, and industrial applications across multiple domains such as aviation, logistics, and manufacturing. You will work at the intersection of engineering, product, architecture, and business, managing the roadmap, resolving technical dependencies, and ensuring delivery of critical platform components across cross-functional and geographically distributed teams. --- Key Responsibilities Program & Execution Management Drive end-to-end delivery of platform features and sector-specific solutions by coordinating multiple scrum teams (AI/ML, Data, Fullstack, DevOps). Develop and maintain technical delivery plans, sprint milestones, and program-wide timelines. Identify and resolve cross-team dependencies, risks, and technical bottlenecks. Technical Fluency & Architecture Alignment Understand the platform’s architecture (Kafka, Spark, data lakes, ML pipelines, hybrid/on-prem deployments) and guide teams toward cohesive delivery. Translate high-level product goals into detailed technical milestones and backlog items in collaboration with Product Owners and Architects. Cross-Functional Collaboration Liaise between globally distributed engineering teams, product owners, architects, and domain stakeholders to align on priorities and timelines. Coordinate multi-sector requirements and build scalable components that serve as blueprints across industries (aviation, logistics, etc.). Governance & Reporting Maintain clear, concise, and timely program reporting (dashboards, OKRs, status updates) for leadership and stakeholders. Champion delivery best practices, quality assurance, and documentation hygiene. Innovation & Agility Support iterative product development with flexibility to handle ambiguity and evolving priorities. 
Enable POCs and rapid prototyping efforts while planning for scalable production transitions. --- Required Skills & Qualifications 12+ years of experience in software engineering and technical program/project management. Strong understanding of platform/data architecture, including event streaming (Kafka), batch/stream processing (Spark, Flink), and AI/ML pipelines. Proven success delivering complex programs in agile environments with multiple engineering teams. Familiarity with DevOps, cloud/on-prem infrastructure (AWS, Azure, hybrid models), CI/CD, and observability practices. Excellent communication, stakeholder management, and risk mitigation skills. Strong grasp of Agile/Scrum or SAFe methodologies. --- Good-to-Have Experience working in or delivering solutions to industrial sectors such as aviation, manufacturing, logistics, or utilities. Experience with tools like Jira, Confluence, Notion, Asana, or similar. Background in engineering or data (Computer Science, Data Engineering, AI/ML, or related).
Posted 2 weeks ago
12.0 - 15.0 years
13 - 18 Lacs
Gurugram
Work from Office
Project Role : Data Architect Project Role Description : Define the data requirements and structure for the application. Model and design the application data structure, storage and integration. Must have skills : SAP Data Services Development Good to have skills : NA. Minimum 12 year(s) of experience is required. Educational Qualification : 15 years full time education Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the data architecture aligns with the overall business objectives and technical specifications. You will collaborate with various stakeholders to gather requirements and translate them into effective data solutions, while also addressing any challenges that arise during the design and implementation phases. Your role will be pivotal in ensuring that the data architecture is robust, scalable, and capable of supporting future growth and innovation within the organization. Roles & Responsibilities:- Expected to be an SME.- Collaborate with and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute to key decisions.- Expected to provide solutions to problems that apply across multiple teams.- Facilitate knowledge sharing and best practices among team members.- Monitor and evaluate the effectiveness of data solutions and make necessary adjustments. Professional & Technical Skills: - Must-Have Skills: Proficiency in SAP Data Migration.- Experience with SAP Data & Development.- Strong understanding of data modeling techniques and best practices.- Familiarity with data integration tools and methodologies.- Ability to design and implement data governance frameworks. 
Additional Information:- The candidate should have a minimum of 12 years of experience in SAP Data Migration.- This position is based at our Gurugram office.- 15 years of full-time education is required.
Posted 2 weeks ago
12.0 - 15.0 years
15 - 20 Lacs
Bengaluru
Work from Office
Project Role : Solution Architect Project Role Description : Translate client requirements into differentiated, deliverable solutions using in-depth knowledge of a technology, function, or platform. Collaborate with the Sales Pursuit and Delivery Teams to develop a winnable and deliverable solution that underpins the client value proposition and business case. Must have skills : Solution Architecture Good to have skills : NA. Minimum 12 year(s) of experience is required. Educational Qualification : 15 years full time education Summary: As a Solution Architect, you will engage in a dynamic and collaborative environment where you will translate client requirements into innovative and effective solutions. Your typical day will involve working closely with various teams to ensure that the solutions developed are not only deliverable but also align with the client's business objectives. You will leverage your extensive knowledge of technology and platforms to create value propositions that resonate with clients, ensuring that their needs are met with precision and creativity. This role requires a proactive approach to problem-solving and a commitment to delivering high-quality outcomes that drive client satisfaction and business success. Roles & Responsibilities:- Expected to be an SME.- Collaborate with and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute to key decisions.- Expected to provide solutions to problems that apply across multiple teams.- Facilitate workshops and discussions to gather requirements and feedback from stakeholders.- Mentor junior team members to enhance their skills and knowledge in solution architecture. 
Professional & Technical Skills: - Must-Have Skills: Proficiency in Solution Architecture.- Strong understanding of cloud computing platforms and services.- Experience with enterprise application integration and API management.- Ability to design scalable and resilient architectures.- Familiarity with agile methodologies and project management practices. Additional Information:- The candidate should have a minimum of 12 years of experience in Solution Architecture.- This position is based at our Bengaluru office.- 15 years of full-time education is required.
Posted 2 weeks ago
8.0 - 13.0 years
8 - 13 Lacs
Telangana
Work from Office
Key Responsibilities: Team Leadership: Lead and mentor a team of Azure Data Engineers, providing technical guidance and support. Foster a collaborative and innovative team environment. Conduct regular performance reviews and set development goals for team members. Organize training sessions to enhance team skills and technical capabilities. Azure Data Platform: Design, implement, and optimize scalable data solutions using Azure data services such as Azure Databricks, Azure Data Factory, Azure SQL Database, and Azure Synapse Analytics. Ensure data engineering best practices and data governance are followed. Stay up-to-date with Azure data technologies and recommend improvements to enhance data processing capabilities. Data Architecture: Collaborate with data architects to design efficient and scalable data architectures. Define data modeling standards and ensure data integrity, security, and governance compliance. Project Management: Work with project managers to define project scope, goals, and deliverables. Develop project timelines, allocate resources, and track progress. Identify and mitigate risks to ensure successful project delivery. Collaboration & Communication: Collaborate with cross-functional teams including data scientists, analysts, and business stakeholders to deliver data-driven solutions. Communicate effectively with stakeholders to understand requirements and provide updates. Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Proven experience as a Team Lead or Manager in data engineering. Extensive experience with Azure data services and cloud technologies. Expertise in Azure Databricks, PySpark, and SQL. Strong understanding of data engineering best practices, data modeling, and ETL processes. Experience with agile development methodologies. Certifications in Azure data services (preferred). Preferred Skills: Experience with big data technologies and data warehousing solutions. 
Familiarity with industry standards and compliance requirements. Ability to lead and mentor a team.
Posted 2 weeks ago
7.0 - 10.0 years
20 - 35 Lacs
Noida
Remote
Position: Cloud Data Architect. Location: Remote. Work Time: US EST Hours. Job Description: ECC/BW/HANA Solution/Data Architect. Design and implement end-to-end SAP ECC, BW, and HANA data architectures, ensuring scalable and robust solutions. Develop and optimize data models, ETL processes, and reporting frameworks across SAP landscapes. Lead integration efforts, defining and applying best practices for connecting SAP systems with external platforms and cloud services. Collaborate with business stakeholders to translate requirements into technical solutions, focusing on data quality and governance. Provide technical leadership and mentorship to project teams, ensuring alignment with enterprise integration patterns and standards. Interested candidates can apply at: dsingh15@fcsltd.com
Posted 2 weeks ago
1.0 - 7.0 years
3 - 9 Lacs
Bengaluru
Work from Office
Design, develop, and implement machine learning models and statistical algorithms. Analyze large datasets to extract meaningful insights and trends. Collaborate with stakeholders to define business problems and deliver data-driven solutions. Optimize and scale machine learning models for production environments. Present analytical findings and recommendations in a clear, actionable manner. Key Skills: Proficiency in Python, R, and SQL. Experience with ML libraries like TensorFlow, PyTorch, or Scikit-learn. Strong knowledge of statistical methods and data visualization tools. Excellent problem-solving and storytelling skills.
Posted 2 weeks ago
8.0 - 12.0 years
30 - 35 Lacs
Hyderabad
Work from Office
Job Summary We are seeking an experienced Data Architect with expertise in Snowflake, dbt, Apache Airflow, and AWS to design, implement, and optimize scalable data solutions. The ideal candidate will play a critical role in defining data architecture, governance, and best practices while collaborating with cross-functional teams to drive data-driven decision-making. Key Responsibilities Data Architecture & Strategy: Design and implement scalable, high-performance cloud-based data architectures on AWS. Define data modeling standards for structured and semi-structured data in Snowflake. Establish data governance, security, and compliance best practices. Data Warehousing & ETL/ELT Pipelines: Develop, maintain, and optimize Snowflake-based data warehouses. Implement dbt (Data Build Tool) for data transformation and modeling. Design and schedule data pipelines using Apache Airflow for orchestration. Cloud & Infrastructure Management: Architect and optimize data pipelines using AWS services like S3, Glue, Lambda, and Redshift. Ensure cost-effective, highly available, and scalable cloud data solutions. Collaboration & Leadership: Work closely with data engineers, analysts, and business stakeholders to align data solutions with business goals. Provide technical guidance and mentoring to the data engineering team. Performance Optimization & Monitoring: Optimize query performance and data processing within Snowflake. Implement logging, monitoring, and alerting for pipeline reliability. Required Skills & Qualifications 10+ years of experience in data architecture, engineering, or related roles. Strong expertise in Snowflake, including data modeling, performance tuning, and security best practices. Hands-on experience with dbt for data transformations and modeling. Proficiency in Apache Airflow for workflow orchestration. Strong knowledge of AWS services (S3, Glue, Lambda, Redshift, IAM, EC2, etc.). Experience with SQL, Python, or Spark for data processing. 
Familiarity with CI/CD pipelines and Infrastructure-as-Code (Terraform/CloudFormation) is a plus. Strong understanding of data governance, security, and compliance (GDPR, HIPAA, etc.). Preferred Qualifications Certifications: AWS Certified Data Analytics - Specialty, Snowflake SnowPro Certification, or dbt Certification. Experience with streaming technologies (Kafka, Kinesis) is a plus. Knowledge of modern data stack tools (Looker, Power BI, etc.). Experience in OTT streaming would be an added advantage.
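At its core, the Airflow orchestration this role calls for amounts to declaring tasks and their dependencies, then executing them in topological order. A stdlib-only sketch of that idea (the task names mirror the S3-to-Snowflake-to-dbt stack described above, but this is an illustration, not an actual Airflow DAG):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline dependencies: each task maps to the set of
# tasks that must complete before it can run.
dag = {
    "extract_s3": set(),
    "load_snowflake": {"extract_s3"},
    "dbt_transform": {"load_snowflake"},
    "publish_marts": {"dbt_transform"},
}

# static_order() yields a valid execution order for the whole pipeline.
order = list(TopologicalSorter(dag).static_order())
```

Airflow expresses the same dependency structure with operators and `>>` chaining, layering scheduling, retries, and monitoring on top of this basic ordering.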
Posted 2 weeks ago